1
Lu Y, Lu X, Zheng L, Sun M, Chen S, Chen B, Wang T, Yang J, Lv C. Application of Multimodal Transformer Model in Intelligent Agricultural Disease Detection and Question-Answering Systems. Plants (Basel). 2024;13:972. doi: 10.3390/plants13070972. PMID: 38611501; PMCID: PMC11013167.
Abstract
In this study, an innovative approach based on multimodal data and the transformer model was proposed to address challenges in agricultural disease detection and question-answering systems. This method effectively integrates image, text, and sensor data, using deep learning to analyze and process complex agriculture-related problems in depth. The study achieved technical breakthroughs and provides new perspectives and tools for the development of intelligent agriculture. In the task of agricultural disease detection, the proposed method demonstrated outstanding performance, achieving a precision, recall, and accuracy of 0.95, 0.92, and 0.94, respectively, significantly outperforming conventional deep learning models. These results indicate the method's effectiveness in identifying and accurately classifying various agricultural diseases, particularly in handling subtle features and complex data. In the task of generating descriptive text from agricultural images, the method also performed impressively, with a precision, recall, and accuracy of 0.92, 0.88, and 0.91, respectively. This demonstrates that the method can not only deeply understand the content of agricultural images but also generate accurate and rich descriptive texts. The object detection experiment further validated the effectiveness of the approach, which achieved a precision, recall, and accuracy of 0.96, 0.91, and 0.94. This highlights the method's capability to accurately locate and identify agricultural targets, especially in complex environments. Overall, the approach demonstrated exceptional performance in multiple tasks, such as agricultural disease detection, image captioning, and object detection, and showcased the immense potential of multimodal data and deep learning in intelligent agriculture.
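The precision, recall, and accuracy figures reported in this abstract follow the standard confusion-matrix definitions. As a minimal illustrative sketch (not the authors' code; the counts below are hypothetical):

```python
def classification_metrics(tp: int, fp: int, fn: int, tn: int):
    """Precision, recall, and accuracy from confusion-matrix counts."""
    precision = tp / (tp + fp)                  # correct positives among predicted positives
    recall = tp / (tp + fn)                     # correct positives among actual positives
    accuracy = (tp + tn) / (tp + fp + fn + tn)  # correct predictions among all samples
    return precision, recall, accuracy

# Hypothetical counts for illustration only (not from the paper)
p, r, a = classification_metrics(tp=92, fp=8, fn=8, tn=92)
```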
Affiliation(s)
- Chunli Lv
- China Agricultural University, Beijing 100083, China
2
Carlier A, Dandrifosse S, Dumont B, Mercatoris B. Comparing CNNs and PLSr for estimating wheat organs biophysical variables using proximal sensing. Front Plant Sci. 2023;14:1204791. doi: 10.3389/fpls.2023.1204791. PMID: 38053768; PMCID: PMC10694231.
Abstract
Estimation of biophysical vegetation variables is of interest for diverse applications, such as monitoring crop growth and health or predicting yield. However, remote estimation of these variables remains challenging due to the inherent complexity of plant architecture, biology, and the surrounding environment, and the need for feature engineering. Recent advances in deep learning, particularly convolutional neural networks (CNNs), offer promising solutions to address this challenge. Unfortunately, the limited availability of labeled data has hindered the exploration of CNNs for regression tasks, especially in the context of crop phenotyping. In this study, the effectiveness of various CNN models in predicting wheat dry matter, nitrogen uptake, and nitrogen concentration from RGB and multispectral images taken from tillering to maturity was examined. To overcome the scarcity of labeled data, a training pipeline was devised involving transfer learning, pseudo-labeling of unlabeled data, and temporal relationship correction. The results demonstrated that CNN models significantly benefit from the pseudo-labeling method, while the machine learning approach employing PLSr did not show comparable performance. Among the models evaluated, EfficientNetB4 achieved the highest accuracy for predicting above-ground biomass, with an R² value of 0.92. In contrast, ResNet50 demonstrated superior performance in predicting LAI, nitrogen uptake, and nitrogen concentration, with R² values of 0.82, 0.73, and 0.80, respectively. Moreover, the study explored multi-output models to predict the distribution of dry matter and nitrogen uptake between the stem, inferior leaves, flag leaf, and ear. The findings indicate that CNNs hold promise as accessible tools for phenotyping quantitative biophysical variables of crops. However, further research is required to harness their full potential.
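The R² values quoted above are coefficients of determination for the regression models. A minimal sketch of the metric under its standard definition (not the authors' code):

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))  # residual sum of squares
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)             # total sum of squares
    return 1.0 - ss_res / ss_tot
```

A perfect prediction yields R² = 1.0; a model no better than predicting the mean yields R² = 0.0.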
Affiliation(s)
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
3
Carlier A, Dandrifosse S, Dumont B, Mercatoris B. To What Extent Does Yellow Rust Infestation Affect Remotely Sensed Nitrogen Status? Plant Phenomics. 2023;5:0083. doi: 10.34133/plantphenomics.0083. PMID: 37681000; PMCID: PMC10482323.
Abstract
The utilization of high-throughput in-field phenotyping systems presents new opportunities for evaluating crop stress. However, existing studies have primarily focused on individual stresses, overlooking the fact that crops in field conditions frequently encounter multiple stresses, which can display similar symptoms or interfere with the detection of other stress factors. Therefore, this study aimed to investigate the impact of wheat yellow rust on reflectance measurements and nitrogen status assessment. A multi-sensor mobile platform was used to capture RGB and multispectral images throughout a 2-year fertilization-fungicide trial. To identify disease-induced damage, the SegVeg approach, which combines a U-Net architecture and a pixel-wise classifier, was applied to RGB images, generating a mask capable of distinguishing between healthy and damaged areas of the leaves. The observed proportion of damage in the images demonstrated similar effectiveness to visual scoring methods in explaining grain yield. Furthermore, the study discovered that the disease not only affected reflectance through leaf damage but also influenced the reflectance of healthy areas by disrupting the overall nitrogen status of the plants. This emphasizes the importance of incorporating disease impact into reflectance-based decision support tools to account for its effects on spectral data. This effect was successfully mitigated by employing the NDRE vegetation index calculated exclusively from the healthy portions of the leaves or by incorporating the proportion of damage into the model. However, these findings also highlight the necessity for further research specifically addressing the challenges presented by multiple stresses in crop phenotyping.
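NDRE is the normalized difference of the near-infrared and red-edge bands, and the mitigation described above restricts it to pixels that the damage mask labels healthy. A hedged sketch under that reading (band values and mask are illustrative; the authors' exact processing may differ):

```python
def ndre(nir: float, red_edge: float) -> float:
    """Normalized Difference Red Edge index for one pixel: (NIR - RE) / (NIR + RE)."""
    return (nir - red_edge) / (nir + red_edge)

def mean_ndre_healthy(nir_band, red_edge_band, healthy_mask):
    """Average NDRE over pixels flagged healthy by the segmentation mask."""
    vals = [ndre(n, r) for n, r, h in zip(nir_band, red_edge_band, healthy_mask) if h]
    return sum(vals) / len(vals)
```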
Affiliation(s)
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
4
Smeesters L, Verbaenen J, Schifano L, Vervaeke M, Thienpont H, Teti G, Forconi A, Lulli F. Wide-Field-of-View Multispectral Camera Design for Continuous Turfgrass Monitoring. Sensors (Basel). 2023;23:2470. doi: 10.3390/s23052470. PMID: 36904674; PMCID: PMC10007062.
Abstract
Sustainably using resources while reducing the use of chemicals is of major importance in agriculture, including turfgrass monitoring. Today, crop monitoring often uses camera-based drone sensing, offering an accurate evaluation but typically requiring a technical operator. To enable autonomous and continuous monitoring, we propose a novel five-channel multispectral camera design suitable for integration inside lighting fixtures, enabling the sensing of a multitude of vegetation indices by covering visible, near-infrared, and thermal wavelength bands. To limit the number of cameras, and in contrast to drone-sensing systems, which have a small field of view, a novel wide-field-of-view imaging design is proposed, featuring a field of view exceeding 164°. This paper presents the development of the five-channel wide-field-of-view imaging design, starting from the optimization of the design parameters and moving toward a demonstrator setup and optical characterization. All imaging channels show excellent image quality, indicated by an MTF exceeding 0.5 at a spatial frequency of 72 lp/mm for the visible and near-infrared imaging designs and 27 lp/mm for the thermal channel. Consequently, we believe our novel five-channel imaging design paves the way toward autonomous crop monitoring while optimizing resource usage.
Affiliation(s)
- Lien Smeesters
- Brussels Photonics (B-PHOT) and Flanders Make, Department of Applied Physics and Photonics, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
- Jef Verbaenen
- Brussels Photonics (B-PHOT) and Flanders Make, Department of Applied Physics and Photonics, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
- Luca Schifano
- Brussels Photonics (B-PHOT) and Flanders Make, Department of Applied Physics and Photonics, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
- Michael Vervaeke
- Brussels Photonics (B-PHOT) and Flanders Make, Department of Applied Physics and Photonics, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
- Hugo Thienpont
- Brussels Photonics (B-PHOT) and Flanders Make, Department of Applied Physics and Photonics, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
5
Wang L, Miao Y, Han Y, Li H, Zhang M, Peng C. Extraction of 3D distribution of potato plant CWSI based on thermal infrared image and binocular stereovision system. Front Plant Sci. 2023;13:1104390. doi: 10.3389/fpls.2022.1104390. PMID: 36762177; PMCID: PMC9903339.
Abstract
As the largest component of crops, water has an important impact on their growth and development. Timely, rapid, continuous, and non-destructive detection of crop water stress status is crucial for water-saving irrigation, production, and breeding. Indices based on leaf or canopy temperature acquired by thermal imaging are widely used for crop water stress diagnosis. However, most studies fail to achieve high-throughput, continuous water stress detection and mostly focus on two-dimensional measurements. This study developed a low-cost three-dimensional (3D) motion robotic system equipped with a purpose-designed 3D imaging system to automatically collect potato plant data, including thermal and binocular RGB data. A method was developed to obtain a fused 3D plant point cloud with depth, temperature, and RGB color information from the acquired thermal and binocular RGB data. Firstly, the developed system was used to automatically collect data on the potato plants in the scene. Secondly, the collected data were processed, and the green canopy was extracted from the color image, allowing the speeded-up robust features (SURF) algorithm to detect more effective matching features. Photogrammetry combined with the structural similarity index was applied to calculate the optimal homography transform between the thermal and color images, which was used for image registration. Thirdly, based on the registration of the two images, 3D reconstruction was carried out using binocular stereo vision to generate the original 3D point cloud with temperature information. The original point cloud data were further processed through canopy extraction, denoising, and k-means-based temperature clustering to optimize the data. Finally, the crop water stress index (CWSI) of each point and the average CWSI in the canopy were calculated, and their daily variation and influencing factors were analyzed in combination with environmental parameters. The developed system and the proposed method can effectively detect the water stress status of potato plants in 3D, supporting the analysis of differences in the three-dimensional distribution and spatiotemporal variation patterns of CWSI in potato.
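The CWSI is conventionally defined from canopy temperature relative to wet (non-stressed) and dry (fully stressed) reference temperatures. A minimal sketch of that standard formula (the reference temperatures below are illustrative; the paper derives them from environmental parameters):

```python
def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """Crop Water Stress Index: 0 = unstressed, 1 = fully stressed."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

def canopy_mean_cwsi(canopy_temps, t_wet, t_dry):
    """Average CWSI over the temperatures of the canopy points."""
    return sum(cwsi(t, t_wet, t_dry) for t in canopy_temps) / len(canopy_temps)
```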
Affiliation(s)
- Liuyang Wang
- Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing, China
- Yanlong Miao
- Key Laboratory of Smart Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing, China
- Yuxiao Han
- Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing, China
- Han Li
- Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing, China
- Man Zhang
- Key Laboratory of Smart Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing, China
- Cheng Peng
- Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing, China
6
Dandrifosse S, Carlier A, Dumont B, Mercatoris B. In-Field Wheat Reflectance: How to Reach the Organ Scale? Sensors (Basel). 2022;22:3342. doi: 10.3390/s22093342. PMID: 35591041; PMCID: PMC9101491.
Abstract
The reflectance of wheat crops provides information on their architecture or physiology. However, the methods currently used for close-range reflectance computation do not allow for the separation of the wheat canopy organs: the leaves and the ears. This study details a method to achieve high-throughput measurements of wheat reflectance at the organ scale. A nadir multispectral camera array and an incident light spectrometer were used to compute bi-directional reflectance factor (BRF) maps. Image thresholding and deep learning ear detection allowed for the segmentation of the ears and the leaves in the maps. The results showed that the BRF measured on reference targets was constant throughout the day but varied with the acquisition date. The wheat organ BRF was constant throughout the day in very cloudy conditions and with high sun altitudes but showed gradual variations in the morning under sunny or partially cloudy sky. As a consequence, measurements should be performed close to solar noon, and the reference panel should be captured at the beginning and end of each field trip to correct the BRF. The method, with such precautions, was tested throughout the wheat growing season on two varieties and various canopy architectures generated by a fertilization gradient. The method yielded consistent reflectance dynamics in all scenarios.
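A common way to compute a bi-directional reflectance factor from close-range imagery is to ratio the target signal against a reference panel of known reflectance; the correction above, using panel captures at the start and end of each field trip, fits this scheme. A hedged sketch (function and values are illustrative, not the authors' implementation):

```python
def brf(target_signal: float, panel_signal: float, panel_reflectance: float) -> float:
    """Bi-directional reflectance factor of a target, via a calibrated reference panel.

    Ratios the target's measured signal against the panel's signal, scaled by
    the panel's known reflectance factor.
    """
    return (target_signal / panel_signal) * panel_reflectance

# Illustrative digital numbers and a near-Lambertian panel of 99% reflectance
wheat_brf = brf(target_signal=120.0, panel_signal=200.0, panel_reflectance=0.99)
```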
Affiliation(s)
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
7
Carlier A, Dandrifosse S, Dumont B, Mercatoris B. Wheat Ear Segmentation Based on a Multisensor System and Superpixel Classification. Plant Phenomics. 2022;2022:9841985. doi: 10.34133/2022/9841985. PMID: 35169713; PMCID: PMC8817947.
Abstract
The automatic segmentation of ears in wheat canopy images is an important step to measure ear density or extract relevant plant traits separately for the different organs. Recent deep learning algorithms appear to be promising tools to accurately detect ears in a wide diversity of conditions. However, they remain complicated to implement and require a huge training database. This paper proposes an easy-to-train, quick, and robust alternative for segmenting wheat ears from the heading to maturity growth stages. The tested method was based on superpixel classification exploiting features from RGB and multispectral cameras. Three classifiers were trained with wheat images acquired from heading to maturity on two cultivars at different levels of fertilizer. The best classifier, a support vector machine (SVM), yielded satisfactory segmentation and reached 94% accuracy. However, segmentation at the pixel level could not be assessed by the superpixel classification accuracy alone. For this reason, a second assessment method was proposed to consider the entire process. A simple graphical tool was developed to annotate pixels. The strategy was to annotate a few pixels per image so that the entire image set could be annotated quickly, thus accounting for very diverse conditions. Results showed a lower segmentation score (F1-score) for the heading and flowering stages and for the zero-nitrogen-input treatment. The methodology appears appropriate for further work on the growth dynamics of the different wheat organs and for other segmentation challenges.
Affiliation(s)
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Centre, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Centre, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Centre, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Centre, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
8
David E, Serouart M, Smith D, Madec S, Velumani K, Liu S, Wang X, Pinto F, Shafiee S, Tahir ISA, Tsujimoto H, Nasuda S, Zheng B, Kirchgessner N, Aasen H, Hund A, Sadhegi-Tehran P, Nagasawa K, Ishikawa G, Dandrifosse S, Carlier A, Dumont B, Mercatoris B, Evers B, Kuroki K, Wang H, Ishii M, Badhon MA, Pozniak C, LeBauer DS, Lillemo M, Poland J, Chapman S, de Solan B, Baret F, Stavness I, Guo W. Global Wheat Head Detection 2021: An Improved Dataset for Benchmarking Wheat Head Detection Methods. Plant Phenomics. 2021;2021:9846158. doi: 10.34133/2021/9846158. PMID: 34778804; PMCID: PMC8548052.
Abstract
The Global Wheat Head Detection (GWHD) dataset was created in 2020 and assembled 193,634 labelled wheat heads from 4700 RGB images acquired from various acquisition platforms across 7 countries/institutions. With an associated competition hosted on Kaggle, GWHD_2020 successfully attracted attention from both the computer vision and agricultural science communities. From this first experience, a few avenues for improvement were identified regarding data size, head diversity, and label reliability. To address these issues, the 2020 dataset has been reexamined, relabeled, and complemented with 1722 images from 5 additional countries, adding 81,553 wheat heads. We now release the 2021 version of the Global Wheat Head Detection dataset, which is bigger, more diverse, and less noisy than GWHD_2020.
Affiliation(s)
- Etienne David
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Mario Serouart
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Daniel Smith
- School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- Simon Madec
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- Kaaviya Velumani
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Hiphen SAS, 120 Rue Jean Dausset, Agroparc, Bâtiment Technicité, 84140 Avignon, France
- Shouyang Liu
- Plant Phenomics Research Center, Nanjing Agricultural University, Nanjing, China
- Xu Wang
- Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Francisco Pinto
- Global Wheat Program, International Maize and Wheat Improvement Centre (CIMMYT), Mexico, D.F., Mexico
- Shahameh Shafiee
- Faculty of Biosciences, Norwegian University of Life Sciences, P.O. Box 5003, NO-1432 Ås, Norway
- Izzat S. A. Tahir
- Agricultural Research Corporation, Wheat Research Program, P.O. Box 126, Wad Medani, Sudan
- Hisashi Tsujimoto
- Arid Land Research Center, Tottori University, Tottori 680-0001, Japan
- Shuhei Nasuda
- Laboratories of Plant Genetics and Plant Breeding, Graduate School of Agriculture, Kyoto University, Japan
- Bangyou Zheng
- CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia, 4067 QLD, Australia
- Norbert Kirchgessner
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Helge Aasen
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Andreas Hund
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Koichi Nagasawa
- Institute of Crop Science, National Agriculture and Food Research Organization, Japan
- Goro Ishikawa
- Hokkaido Agricultural Research Center, National Agriculture and Food Research Organization, Japan
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Byron Evers
- Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Ken Kuroki
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Haozhou Wang
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Masanori Ishii
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Curtis Pozniak
- Department of Plant Sciences, University of Saskatchewan, Canada
- David Shaner LeBauer
- College of Agriculture and Life Sciences, University of Arizona, Tucson, Arizona, USA
- Morten Lillemo
- Faculty of Biosciences, Norwegian University of Life Sciences, P.O. Box 5003, NO-1432 Ås, Norway
- Jesse Poland
- Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Scott Chapman
- School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia, 4067 QLD, Australia
- Benoit de Solan
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- Frédéric Baret
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Ian Stavness
- Department of Computer Science, University of Saskatchewan, Canada
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan