1
Pipatsitee P, Tisarum R, Taota K, Samphumphuang T, Eiumnoh A, Singh HP, Cha-Um S. Effectiveness of vegetation indices and UAV-multispectral imageries in assessing the response of hybrid maize (Zea mays L.) to water deficit stress under field environment. Environ Monit Assess 2022; 195:128. [PMID: 36402920] [DOI: 10.1007/s10661-022-10766-6]
Abstract
Unmanned aerial vehicles (UAVs) equipped with multiple sensors are among the most innovative technologies for measuring plant health and predicting final yield under field conditions, especially in water-deficit situations in rain-deprived regions. The objective of this investigation was to evaluate individual-plant and canopy-level measurements using UAV imagery in three maize (Zea mays L.) genotypes, Suwan4452 (drought-tolerant) and Pac339 and S7328 (drought-sensitive), at the vegetative and reproductive stages under WW (well-watered) and WD (water deficit) conditions. At the vegetative stage, only the CWSI (crop water stress index) of Pac339 and S7328 under WD increased significantly, by 1.86- and 1.69-fold over WW, whereas the vegetation indices (EVI2 (Enhanced Vegetation Index 2), OSAVI (Optimized Soil-Adjusted Vegetation Index), GNDVI (Green Normalized Difference Vegetation Index), NDRE (Normalized Difference Red Edge Index), and NDVI (Normalized Difference Vegetation Index)) derived from the UAV multi-sensors did not vary. At the reproductive stage, CWSI in the drought-sensitive genotype S7328 under WD increased by 1.92-fold over WW. All the vegetation indices (EVI2, OSAVI, GNDVI, NDRE, and NDVI) of Pac339 and S7328 under WD decreased compared with those of Suwan4452. NDVI derived from the GreenSeeker® handheld sensor and NDVI from UAV data were closely related (R2 = 0.5924). An increase in leaf temperature (Tleaf) and a reduction in NDVI of WD-stressed maize plants were observed (R2 = 0.5829), leading to yield loss (R2 = 0.5198). In summary, a close correlation was observed between the physiological data of individual plants and the canopy-level vegetation indices (collected using a UAV platform) in drought-sensitive maize genotypes under WD conditions, indicating the effectiveness of this approach in classifying drought-tolerant genotypes.
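The indices compared in this abstract can all be computed per pixel from band reflectances. A minimal sketch in Python/NumPy, using the standard published formulas (OSAVI with the fixed 0.16 soil-adjustment term, EVI2 in its two-band form, and the empirical wet/dry-reference CWSI normalization) rather than the paper's own processing chain:

```python
import numpy as np

def vegetation_indices(nir, red, green, red_edge):
    """Compute the canopy indices named in the abstract from per-pixel
    surface reflectances (arrays scaled to 0-1), using the standard
    published definitions."""
    ndvi  = (nir - red) / (nir + red)
    gndvi = (nir - green) / (nir + green)
    ndre  = (nir - red_edge) / (nir + red_edge)
    # OSAVI: soil-adjusted NDVI with a fixed 0.16 correction term
    osavi = (nir - red) / (nir + red + 0.16)
    # EVI2: two-band Enhanced Vegetation Index (no blue band required)
    evi2  = 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)
    return {"NDVI": ndvi, "GNDVI": gndvi, "NDRE": ndre,
            "OSAVI": osavi, "EVI2": evi2}

def cwsi(t_canopy, t_wet, t_dry):
    """Empirical crop water stress index: 0 = fully transpiring canopy
    (wet reference temperature), 1 = non-transpiring (dry reference)."""
    return (t_canopy - t_wet) / (t_dry - t_wet)
```

These are generic formulas; radiometric calibration and the choice of wet/dry reference temperatures (which the study's thermal workflow would fix) dominate the quality of the resulting maps.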
Affiliation(s)
- Piyanan Pipatsitee
- National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Rujira Tisarum
- National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Kanyarat Taota
- National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Thapanee Samphumphuang
- National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Apisit Eiumnoh
- National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Harminder Pal Singh
- Department of Environment Studies, Faculty of Science, Panjab University, Chandigarh, 160014, India
- Suriyan Cha-Um
- National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
2
Zhou C, Ye H, Hu J, Shi X, Hua S, Yue J, Xu Z, Yang G. Automated Counting of Rice Panicle by Applying Deep Learning Model to Images from Unmanned Aerial Vehicle Platform. Sensors (Basel) 2019; 19:3106. [PMID: 31337086] [PMCID: PMC6679257] [DOI: 10.3390/s19143106]
Abstract
The number of panicles per unit area is a common indicator of rice yield and is of great significance to yield estimation, breeding, and phenotype analysis. Traditional counting methods have various drawbacks, such as long delays and high subjectivity, and they are easily perturbed by noise. To improve the accuracy of rice panicle detection and counting in the field, we developed and implemented a panicle detection and counting system based on improved region-based fully convolutional networks, and we use the system to automate rice-phenotype measurements. Field experiments were conducted in target areas to train and test the system, and a light rotary-wing unmanned aerial vehicle equipped with a high-definition RGB camera was used to collect images. The trained model achieved a precision of 0.868 on a held-out test set, which demonstrates the feasibility of this approach. The algorithm can deal with the irregular edges of rice panicles, the significantly different appearance between varieties and growing periods, the interference due to color overlap between panicles and leaves, and the variations in illumination intensity and shading effects in the field. The result is more accurate and efficient recognition of rice panicles, which facilitates rice breeding.
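The counting step described in the abstract can be illustrated independently of the detector itself: given candidate boxes and confidence scores from any detection network, the panicle count is the number of boxes that survive score thresholding and non-maximum suppression. A minimal sketch in Python/NumPy of that generic post-processing (not the paper's improved R-FCN implementation; thresholds are illustrative):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def count_panicles(boxes, scores, score_thr=0.5, iou_thr=0.3):
    """Count detections: discard boxes below the confidence threshold,
    then greedily suppress overlapping duplicates (highest score first)."""
    order = np.argsort(scores)[::-1]          # process best boxes first
    kept = []
    for i in order:
        if scores[i] < score_thr:
            continue
        if all(iou(boxes[i], boxes[j]) < iou_thr for j in kept):
            kept.append(i)
    return len(kept)
```

For example, two heavily overlapping detections of the same panicle plus one distant detection yield a count of 2, since the lower-scoring duplicate is suppressed.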
Affiliation(s)
- Chengquan Zhou
- Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences (ZAAS), Hangzhou 310000, China
- Hongbao Ye
- Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences (ZAAS), Hangzhou 310000, China
- Jun Hu
- Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences (ZAAS), Hangzhou 310000, China
- Xiaoyan Shi
- Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences (ZAAS), Hangzhou 310000, China
- Shan Hua
- Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences (ZAAS), Hangzhou 310000, China
- Jibo Yue
- Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture P. R. China, Beijing Research Center for Information Technology in Agriculture, Beijing 100089, China
- Key Laboratory of Agri-informatics, Ministry of Agriculture, Beijing 100089, China
- Zhifu Xu
- Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences (ZAAS), Hangzhou 310000, China
- Guijun Yang
- Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture P. R. China, Beijing Research Center for Information Technology in Agriculture, Beijing 100089, China
- Key Laboratory of Agri-informatics, Ministry of Agriculture, Beijing 100089, China