1
Li H, Yan X, Su P, Su Y, Li J, Xu Z, Gao C, Zhao Y, Feng M, Shafiq F, Xiao L, Yang W, Qiao X, Wang C. Estimation of winter wheat LAI based on color indices and texture features of RGB images taken by UAV. Journal of the Science of Food and Agriculture 2025; 105:189-200. [PMID: 39149861 DOI: 10.1002/jsfa.13817]
Abstract
BACKGROUND Leaf area index (LAI) is an important indicator for assessing plant growth and development, and is closely related to photosynthesis in plants. Rapid, accurate estimation of crop LAI plays an important role in guiding farmland production. In this study, UAV-based RGB imagery was used to estimate the LAI of 65 winter wheat varieties at different growth stages; the varieties included farm varieties, main cultivars, new lines, core germplasm and foreign varieties. Color indices (CIs) and texture features were extracted from the RGB images to determine their quantitative link to LAI. RESULTS The results revealed that among the extracted image features, LAI exhibited a significant positive correlation with CIs (r = 0.801) and a significant negative correlation with texture features (r = -0.783). Furthermore, the visible atmospherically resistant index, the green-red vegetation index and the modified green-red vegetation index among the CIs, and the mean among the texture features, demonstrated a strong correlation with LAI (r > 0.8). With reference to the model input variables, the backpropagation neural network (BPNN) model of LAI based on both the CIs and the texture features (R2 = 0.730, RMSE = 0.691, RPD = 1.927) outperformed the models constructed from either set of variables alone. CONCLUSION This study offers a theoretical basis and technical reference for precise monitoring of winter wheat LAI with consumer-level UAVs. The BPNN model incorporating CIs and texture features proved superior in estimating LAI and offers a reliable method for monitoring the growth of winter wheat. © 2024 Society of Chemical Industry.
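The three best-performing color indices named above have standard published definitions (VARI, GRVI, MGRVI). As an illustrative sketch of how such indices are computed from a UAV RGB orthomosaic (the function name and the sample pixel values are hypothetical, not code from the paper):

```python
import numpy as np

def color_indices(rgb):
    """Compute common color indices from a float RGB array of shape (H, W, 3) in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-9  # guards against division by zero on black pixels
    grvi = (g - r) / (g + r + eps)                # green-red vegetation index
    mgrvi = (g**2 - r**2) / (g**2 + r**2 + eps)   # modified green-red vegetation index
    vari = (g - r) / (g + r - b + eps)            # visible atmospherically resistant index
    return {"GRVI": grvi, "MGRVI": mgrvi, "VARI": vari}

# A green canopy pixel yields positive indices; reddish bare soil yields negative ones.
canopy = np.array([[[0.20, 0.50, 0.10]]])
soil = np.array([[[0.50, 0.35, 0.20]]])
```

Per-plot means of such index maps would then serve as inputs to a regression model such as the BPNN described in the abstract.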
Affiliation(s)
- Hao Li
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Xiaobin Yan
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Pengyan Su
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Yiming Su
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Junfeng Li
- College of Horticulture, Shanxi Agricultural University, Taigu, China
- Zixin Xu
- College of Horticulture, Shanxi Agricultural University, Taigu, China
- Chunrui Gao
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Yu Zhao
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Meichen Feng
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Fahad Shafiq
- Department of Botany, Government College University Lahore, Punjab, Pakistan
- Lujie Xiao
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Wude Yang
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Xingxing Qiao
- College of Agriculture, Shanxi Agricultural University, Taigu, China
- Chao Wang
- College of Agriculture, Shanxi Agricultural University, Taigu, China
2
Parida PK, Eagan S, Ramanujam K, Sengodan R, Uthandi S, Ettiyagounder P, Rajagounder R. Machine learning approaches for estimation of the fraction of absorbed photosynthetically active radiation and net photosynthesis rate of maize using multi-spectral sensor. Heliyon 2024; 10:e34117. [PMID: 39091949 PMCID: PMC11292552 DOI: 10.1016/j.heliyon.2024.e34117]
Abstract
The fraction of absorbed photosynthetically active radiation (FAPAR) and the photosynthesis rate (Pn) of maize canopies are essential photosynthetic parameters for accurately estimating vegetation growth and productivity from multispectral vegetation indices (VIs). Despite their importance, few studies have compared the effectiveness of multispectral imagery and various machine learning techniques in estimating these photosynthetic traits under high vegetation coverage. In this study, seventeen multispectral VIs and four machine learning (ML) algorithms were evaluated to determine the most suitable model for estimating maize FAPAR and Pn during the kharif and rabi seasons at Tamil Nadu Agricultural University, Coimbatore, India. The results demonstrate that indices such as OSAVI, SAVI, EVI-2 and MSAVI-2 during the kharif season, and MNDVIRE and MSRRE during the rabi season, outperformed the others in estimating FAPAR and Pn values. Among the four ML methods considered, namely random forest (RF), extreme gradient boosting (XGBoost), support vector regression (SVR) and multiple linear regression (MLR), RF consistently showed the best fit and XGBoost the poorest fit for FAPAR and Pn estimation. SVR (R2 = 0.873, RMSE = 0.045) during the kharif season and MLR (R2 = 0.838, RMSE = 0.053) during the rabi season also achieved high fitting accuracy, particularly for FAPAR prediction. Similarly, in the prediction of Pn, MLR showed high fitting accuracy, with R2 = 0.741 and RMSE = 2.531 during the kharif season and R2 = 0.955 and RMSE = 1.070 during the rabi season. This study demonstrates the potential of combining UAV-derived VIs with ML to develop accurate FAPAR and Pn prediction models, overcoming VI saturation in dense vegetation, and underscores the importance of optimizing these models to improve the accuracy of maize vegetation assessments across growing seasons.
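A sketch of the kind of regressor comparison described above, with synthetic data standing in for VI–FAPAR pairs (XGBoost is omitted because it is a separate third-party package; all values below are illustrative, not the paper's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 5))  # stand-in for five vegetation indices
y = X @ np.array([0.5, 0.3, 0.1, 0.05, 0.05]) + rng.normal(0.0, 0.02, 200)  # stand-in FAPAR

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit each model and score it on the held-out split.
models = {
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "SVR": SVR(),
    "MLR": LinearRegression(),
}
scores = {name: r2_score(y_te, m.fit(X_tr, y_tr).predict(X_te)) for name, m in models.items()}
```

On real canopy data the ranking depends on how nonlinear the VI–trait relationship is, which is consistent with the paper's ranking differing between seasons.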
Affiliation(s)
- Pradosh Kumar Parida
- Department of Agronomy, Tamil Nadu Agricultural University, Coimbatore, 641003, Tamil Nadu, India
- Somasundaram Eagan
- Directorate of Agribusiness Development (DABD), Tamil Nadu Agricultural University, Coimbatore, 641003, Tamil Nadu, India
- Krishnan Ramanujam
- Nammazhvar Organic Farming Research Centre, Tamil Nadu Agricultural University, Coimbatore, 641003, Tamil Nadu, India
- Radhamani Sengodan
- Department of Agronomy, Tamil Nadu Agricultural University, Coimbatore, 641003, Tamil Nadu, India
- Sivakumar Uthandi
- Department of Agricultural Microbiology, Tamil Nadu Agricultural University, Coimbatore, 641003, Tamil Nadu, India
- Parameswari Ettiyagounder
- Nammazhvar Organic Farming Research Centre, Tamil Nadu Agricultural University, Coimbatore, 641003, Tamil Nadu, India
- Raja Rajagounder
- ICAR-Central Institute for Cotton Research (CICR) Regional Station, Coimbatore, 641003, Tamil Nadu, India
3
Khalesi F, Ahmed I, Daponte P, Picariello F, De Vito L, Tudosa I. The Uncertainty Assessment by the Monte Carlo Analysis of NDVI Measurements Based on Multispectral UAV Imagery. Sensors (Basel) 2024; 24:2696. [PMID: 38732802 PMCID: PMC11086219 DOI: 10.3390/s24092696]
Abstract
This paper proposes a workflow to assess the uncertainty of the Normalized Difference Vegetation Index (NDVI), a critical index used in precision agriculture to determine plant health. From a metrological perspective, it is crucial to evaluate the quality of vegetation indices, which are usually obtained by processing multispectral images to measure vegetation, soil and environmental parameters. For this reason, it is important to assess how the NDVI measurement is affected by camera characteristics and by illumination, atmospheric and seasonal/weather conditions. The study investigates the impact of atmospheric conditions on solar irradiation and on the vegetation reflectance captured by a multispectral UAV camera in the red and near-infrared bands, as well as the variation of the nominal wavelengths of the camera in these bands. Specifically, it examines the influence of atmospheric conditions in three scenarios: dry-clear, humid-hazy, and a combination of both. The investigation also takes into account solar irradiance variability and the signal-to-noise ratio (SNR) of the camera. Through Monte Carlo simulations, a sensitivity analysis is carried out against each of the above-mentioned uncertainty sources and their combination. The results demonstrate that the main contributors to the NDVI uncertainty are the atmospheric conditions, the nominal wavelength tolerance of the camera, and the variability of the NDVI values within the considered leaf conditions (dry and fresh).
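The core of such a Monte Carlo analysis can be sketched in a few lines: perturb the band reflectances with an assumed noise model and observe the spread of the resulting NDVI. The reflectance values and noise level below are hypothetical, not the paper's:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ndvi_uncertainty(red=0.05, nir=0.45, sigma=0.01, n=100_000, seed=0):
    """Propagate Gaussian reflectance noise through the NDVI formula by Monte Carlo."""
    rng = np.random.default_rng(seed)
    samples = ndvi(rng.normal(red, sigma, n), rng.normal(nir, sigma, n))
    return samples.mean(), samples.std()

mean, std = ndvi_uncertainty()  # nominal NDVI is 0.40 / 0.50 = 0.8
```

A full sensitivity analysis repeats this for each uncertainty source (atmosphere, wavelength tolerance, SNR) separately and in combination.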
Affiliation(s)
- Fatemeh Khalesi
- Department of Engineering, University of Sannio, 82100 Benevento, Italy; (I.A.); (P.D.); (F.P.); (L.D.V.); (I.T.)
4
Kaimaris D. Measurement Accuracy and Improvement of Thematic Information from Unmanned Aerial System Sensor Products in Cultural Heritage Applications. J Imaging 2024; 10:34. [PMID: 38392083 PMCID: PMC10890236 DOI: 10.3390/jimaging10020034]
Abstract
In the context of producing a digital surface model (DSM) and an orthophotomosaic of a study area, a modern Unmanned Aerial System (UAS) reduces the time required both for primary data collection in the field and for data processing in the office. It features sophisticated sensors and systems, is easy to use, and its products come with excellent horizontal and vertical accuracy. In this study, the UAS WingtraOne GEN II is used with an RGB sensor (42 Mpixel), a multispectral (MS) sensor (1.2 Mpixel) and a built-in multi-frequency PPK GNSS antenna (for high-accuracy calculation of the coordinates of the centers of the received images). The first objective is to test and compare the accuracy of the DSMs and orthophotomosaics generated from the UAS RGB sensor images when image processing is performed using only the PPK system measurements (without Ground Control Points (GCPs)) and when it is performed using only GCPs. For this purpose, 20 GCPs and 20 Check Points (CPs) were measured in the field. The results show that the horizontal accuracy of the orthophotomosaics is similar in both processing cases. The vertical accuracy is better when image processing uses only the GCPs, but this finding may not generalize, as the survey was conducted at only one location. The second objective is to perform image fusion using the images of the above two UAS sensors and to check the spectral information transferred from the MS images to the fused images. The study was carried out at three archaeological sites in northern Greece. The combined study of the correlation matrix and the ERGAS index value at each location reveals that improving the spatial resolution of the MS orthophotomosaics yields fused images suitable for classification, and therefore image fusion can be performed by utilizing the images from the two sensors.
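The ERGAS index used above to judge fusion quality has a standard definition (Wald's relative dimensionless global error in synthesis); a minimal sketch, with hypothetical array shapes and data:

```python
import numpy as np

def ergas(reference, fused, ratio):
    """ERGAS fusion-quality index (lower is better).
    reference, fused: arrays of shape (bands, H, W); ratio: high-res / low-res pixel size."""
    bands = reference.shape[0]
    total = 0.0
    for k in range(bands):
        rmse = np.sqrt(np.mean((reference[k] - fused[k]) ** 2))
        total += (rmse / reference[k].mean()) ** 2  # RMSE relative to the band mean
    return 100.0 * ratio * np.sqrt(total / bands)

# Identical images give ERGAS = 0; any spectral distortion raises it.
ref = np.random.default_rng(0).uniform(0.1, 0.5, size=(4, 8, 8))
```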
Affiliation(s)
- Dimitris Kaimaris
- School of Spatial Planning and Development, Aristotle University of Thessaloniki, 54124 Thessaloniki, Greece
5
Herr AW, Carter AH. Remote sensing continuity: a comparison of HTP platforms and potential challenges with field applications. Frontiers in Plant Science 2023; 14:1233892. [PMID: 37790786 PMCID: PMC10544974 DOI: 10.3389/fpls.2023.1233892]
Abstract
In an era of climate change and increased environmental variability, breeders are looking for tools to maintain and increase genetic gain and overall efficiency. In recent years the field of high-throughput phenotyping (HTP) has received increased attention as an option to meet this need. There are many platform options in HTP, but ground-based handheld and aerial remote sensing systems are two popular choices. While many HTP setups have similar specifications, it is not always clear whether data from different systems can be treated interchangeably. In this research, we evaluated two handheld radiometer platforms, the Cropscan MSR16R and the Spectra Vista Corp (SVC) HR-1024i, as well as a UAS-based system with a Sentera Quad Multispectral Sensor. Each handheld radiometer was used for two years alongside the unoccupied aircraft system (UAS) to collect data on winter wheat breeding trials between 2018 and 2021. Spectral reflectance indices (SRIs) were calculated for each system, and SRI heritability and correlation were analyzed to evaluate each platform and the usability of its SRIs for breeding applications. With the Cropscan system in 2018 and 2019, correlations of SRIs against UAS SRIs and grain yield were low. In contrast, the SVC system in 2020 and 2021 produced moderate correlations with UAS SRIs and grain yield. UAS SRIs were consistently more heritable, with broad-sense heritability ranging from 0.58 to 0.80. Data standardization and collection windows are important considerations for ensuring reliable data. Furthermore, practical aspects and best practices for these HTP platforms, relative to applied breeding applications, are highlighted and discussed. The findings of this study can serve as a framework to build upon when considering the implementation of HTP technology in an applied breeding program.
Affiliation(s)
- Arron H. Carter
- Department of Crop and Soil Sciences, Washington State University, Pullman, WA, United States
6
Ku KB, Mansoor S, Han GD, Chung YS, Tuan TT. Identification of new cold tolerant Zoysia grass species using high-resolution RGB and multi-spectral imaging. Sci Rep 2023; 13:13209. [PMID: 37580436 PMCID: PMC10425389 DOI: 10.1038/s41598-023-40128-2]
Abstract
Zoysia grass (Zoysia spp.) is the most widely used warm-season turf grass in Korea due to its durability and resistance to environmental stresses. To develop cultivars that stay green for longer periods, it is essential to screen germplasm that maintains greenness at lower temperatures. Conventional methods are time-consuming, laborious, and subjective. Therefore, in this study, we demonstrate an objective and efficient method to screen germplasm that maintains greenness longer, using RGB and multispectral images. From August to December, time-series data were acquired, and we calculated green cover percentage (GCP), Normalized Difference Vegetation Index (NDVI), Normalized Difference Red Edge Index (NDRE), Soil-Adjusted Vegetation Index (SAVI), and Enhanced Vegetation Index (EVI) values of the germplasm by applying vegetation indices to the RGB and multispectral images. The results showed significant differences in GCP, NDVI, NDRE, SAVI, and EVI among germplasm (p < 0.05). The GCP, which quantifies greenness by counting the pixels of the green area in RGB images, remained above 90% in August and September but decreased sharply from October. The study found significant differences in GCP and NDVI among germplasm, with san208 exhibiting over 90% GCP and high NDVI values over 153 days. In addition, we conducted assessments using the other vegetation indices, namely NDRE, SAVI, and EVI. san208 exhibited NDRE levels exceeding 3% throughout this period, while its SAVI started at approximately 38% and gradually decreased to around 4% over the same period; the EVI recorded approximately 6% in August but declined from about 9% to 1% between September and October. The complementary use of these indicators could be an efficient method for objectively assessing the greenness of turf both quantitatively and qualitatively.
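GCP as described above is a pixel-counting measure; a minimal sketch with a toy green-dominance rule (a real pipeline would use calibrated color thresholds rather than this simple comparison):

```python
import numpy as np

def green_cover_percentage(rgb):
    """Percentage of pixels classified as green turf in an RGB array (H, W, 3) in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    green = (g > r) & (g > b)  # crude rule: the green channel dominates
    return 100.0 * green.mean()

# Toy image: 90 of 100 pixels are green turf, 10 are reddish soil.
img = np.zeros((10, 10, 3))
img[:9, :, 1] = 0.6
img[9:, :, 0] = 0.7
```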
Affiliation(s)
- Ki-Bon Ku
- Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Sheikh Mansoor
- Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Gyung Deok Han
- Department of Practical Arts Education, Cheongju National University of Education, Cheongju, 28708, Republic of Korea
- Yong Suk Chung
- Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Thai Thanh Tuan
- Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
7
Wang Y, Yang Z, Kootstra G, Khan HA. The impact of variable illumination on vegetation indices and evaluation of illumination correction methods on chlorophyll content estimation using UAV imagery. Plant Methods 2023; 19:51. [PMID: 37245050 PMCID: PMC10224605 DOI: 10.1186/s13007-023-01028-8]
Abstract
BACKGROUND Recent advancements in unmanned aerial vehicle (UAV) technology have made UAVs an effective, cost-efficient, and versatile solution for monitoring crop growth with high spatial and temporal precision. This monitoring is usually achieved through the computation of vegetation indices (VIs) over agricultural land. The VIs are based on the radiance incoming to the camera, which changes when the scene illumination changes; such a change propagates into the VIs and into subsequent measures, e.g., VI-based chlorophyll content estimation. Ideally, the results from VIs should be free from the impact of scene illumination and should reflect the true state of the crop's condition. In this paper, we evaluate the performance of various VIs computed on images taken under sunny, overcast and partially cloudy conditions. To improve invariance to scene illumination, we further evaluated the empirical line method (ELM), which calibrates the drone images using reference panels, and the multi-scale Retinex algorithm, which performs an online calibration based on color constancy. For the assessment, we used the VIs to predict leaf chlorophyll content, which we then compared to field measurements. RESULTS The results show that the ELM worked well when the imaging conditions during the flight were stable, but its performance degraded under variable illumination on a partially cloudy day. For leaf chlorophyll content estimation, the R2 of the multivariate linear models built from the VIs was 0.6 and 0.56 under sunny and overcast illumination conditions, respectively. The ELM-corrected model maintained stability and increased repeatability compared with non-corrected data. The Retinex algorithm dealt effectively with variable illumination, outperforming the other methods in the estimation of chlorophyll content: the R2 of the multivariate linear model based on illumination-corrected, consistent VIs was 0.61 under the variable illumination condition. CONCLUSIONS Our work indicates the significance of illumination correction in improving the performance of VIs and of VI-based estimation of chlorophyll content, particularly under fluctuating illumination conditions.
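The empirical line method mentioned above fits a linear gain/offset between image digital numbers (DNs) and the known reflectance of reference panels; a minimal sketch with hypothetical panel values:

```python
import numpy as np

def empirical_line(dn_panels, reflectance_panels):
    """Fit ELM coefficients so that reflectance ≈ gain * DN + offset."""
    gain, offset = np.polyfit(dn_panels, reflectance_panels, 1)
    return gain, offset

# Two hypothetical calibration panels with 5% and 50% known reflectance.
dn = np.array([800.0, 21000.0])
rho = np.array([0.05, 0.50])
gain, offset = empirical_line(dn, rho)
calibrated = gain * np.array([10000.0]) + offset  # convert a raw pixel DN to reflectance
```

Because the fit is tied to one illumination state, the coefficients drift when clouds change the irradiance mid-flight, which matches the failure mode reported above for partially cloudy days.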
Affiliation(s)
- Yuxiang Wang
- College of Engineering, China Agricultural University, Beijing, China
- Farm Technology Group, Wageningen University and Research, Wageningen, The Netherlands
- Zengling Yang
- College of Engineering, China Agricultural University, Beijing, China
- Gert Kootstra
- Farm Technology Group, Wageningen University and Research, Wageningen, The Netherlands
- Haris Ahmad Khan
- Farm Technology Group, Wageningen University and Research, Wageningen, The Netherlands
8
Jakubczyk K, Siemiątkowska B, Więckowski R, Rapcewicz J. Hyperspectral Imaging for Mobile Robot Navigation. Sensors (Basel) 2022; 23:383. [PMID: 36616979 PMCID: PMC9824442 DOI: 10.3390/s23010383]
Abstract
The article presents the application of a hyperspectral camera in mobile robot navigation. Hyperspectral cameras are imaging systems that capture a wide range of the electromagnetic spectrum. This allows them to detect a broader range of colors and features than traditional cameras and to perceive the environment more accurately; several surface types, such as mud, can be challenging to detect with an RGB camera. In our system, the hyperspectral camera is used for ground recognition (e.g., grass, bumpy road, asphalt). Traditional global path planning methods take the shortest path length as the optimization objective. We propose an improved A* algorithm to generate a collision-free path: semantic information makes it possible to plan a feasible and safe path in a complex off-road environment, taking traveling time as the optimization objective. We present the results of experiments on data collected in a natural environment. An important novelty of this paper is the use of a modified nearest-neighbor method for hyperspectral data analysis, whose output is then used for path planning within the same work. Using the nearest-neighbor method allows us to adapt the robotic system much faster than using neural networks. As our system is continuously evolving, we intend to examine the performance of the vehicle on various road surfaces, which is why we sought a classification system that does not require a prolonged learning process. We aimed to demonstrate that incorporating a hyperspectral camera can not only enhance route planning but also aid in determining parameters such as speed and acceleration.
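A plain (unmodified) nearest-neighbor classifier over a spectral library, as a sketch of the classification step; the four-band library and the labels below are hypothetical, and the paper's modified variant and band count differ:

```python
import numpy as np

def nearest_neighbor_classify(pixel, library_spectra, library_labels):
    """Label a hyperspectral pixel by its closest reference spectrum (Euclidean distance)."""
    distances = np.linalg.norm(library_spectra - pixel, axis=1)
    return library_labels[int(np.argmin(distances))]

# Tiny hypothetical spectral library: one reference spectrum per ground class.
library = np.array([
    [0.10, 0.45, 0.50, 0.60],  # grass
    [0.20, 0.22, 0.25, 0.28],  # asphalt
    [0.15, 0.18, 0.12, 0.10],  # mud
])
labels = ["grass", "asphalt", "mud"]
```

The resulting per-pixel class map can then feed the planner, e.g., by assigning each class a traversal cost inside the A* search.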
Affiliation(s)
- Kacper Jakubczyk
- Institute of Automatic Control and Robotics, Warsaw University of Technology, 02-525 Warsaw, Poland
- Barbara Siemiątkowska
- Institute of Automatic Control and Robotics, Warsaw University of Technology, 02-525 Warsaw, Poland
- Rafał Więckowski
- Łukasiewicz Research Network—Industrial Research Institute for Automation and Measurements PIAP, 02-486 Warsaw, Poland
- Jerzy Rapcewicz
- Institute of Automatic Control and Robotics, Warsaw University of Technology, 02-525 Warsaw, Poland
9
Sousa JJ, Toscano P, Matese A, Di Gennaro SF, Berton A, Gatti M, Poni S, Pádua L, Hruška J, Morais R, Peres E. UAV-Based Hyperspectral Monitoring Using Push-Broom and Snapshot Sensors: A Multisite Assessment for Precision Viticulture Applications. Sensors (Basel) 2022; 22:6574. [PMID: 36081033 PMCID: PMC9460142 DOI: 10.3390/s22176574]
Abstract
Hyperspectral aerial imagery is becoming increasingly available due to both technological evolution and a somewhat affordable price tag. However, selecting a proper UAV + hyperspectral sensor combination for a specific context is still challenging and lacks proper documental support. While selecting a UAV is more straightforward, as it mostly relates to sensor compatibility, autonomy, reliability and cost, a hyperspectral sensor has much more to be considered. This note provides an assessment of two hyperspectral sensors (push-broom and snapshot) regarding practicality and suitability within a precision viticulture context. The aim is to provide researchers, agronomists, winegrowers and UAV pilots with dependable data collection protocols and methods, enabling them to achieve faster processing techniques and helping to integrate multiple data sources. Furthermore, both the benefits and drawbacks of using each technology within a precision viticulture context are highlighted. The hyperspectral sensors, UAVs, flight operations, and the processing methodology for each imaging type's datasets are presented through a qualitative and quantitative analysis. For this purpose, four vineyards in two countries were selected as case studies, supporting the extrapolation of both the advantages and the issues related to the two types of hyperspectral sensors in different contexts. Sensor performance was compared through the evaluation of field operation complexity, processing time and the qualitative accuracy of the results, namely the quality of the generated hyperspectral mosaics. The results showed overall excellent geometric quality, with no distortions or overlapping faults for either technology, using the proposed mosaicking and reconstruction process. The multi-site assessment facilitates the qualitative and quantitative exchange of information throughout the UAV hyperspectral community. In addition, all the major benefits and drawbacks of each hyperspectral sensor regarding its operation and data features are identified, and the operational complexity in the context of precision agriculture is presented.
Affiliation(s)
- Joaquim J. Sousa
- Engineering Department, School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
- Centre for Robotics in Industry and Intelligent Systems (CRIIS), INESC Technology and Science (INESCTEC), 4200-465 Porto, Portugal
- Piero Toscano
- Institute of BioEconomy, National Research Council (CNR-IBE), Via G. Caproni, 8, 50145 Florence, Italy
- Alessandro Matese
- Institute of BioEconomy, National Research Council (CNR-IBE), Via G. Caproni, 8, 50145 Florence, Italy
- Andrea Berton
- Institute of Geosciences and Earth Resources, National Research Council (CNR-IGG), Via Moruzzi 1, 56124 Pisa, Italy
- Matteo Gatti
- Department of Sustainable Crop Production (DI.PRO.VE.S.), Università Cattolica del Sacro Cuore, Via E. Parmense 84, 29122 Piacenza, Italy
- Stefano Poni
- Department of Sustainable Crop Production (DI.PRO.VE.S.), Università Cattolica del Sacro Cuore, Via E. Parmense 84, 29122 Piacenza, Italy
- Luís Pádua
- Centre for the Research and Technology of Agro-Environmental and Biological Sciences, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
- Jonáš Hruška
- Engineering Department, School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
- Raul Morais
- Engineering Department, School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
- Centre for the Research and Technology of Agro-Environmental and Biological Sciences, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
- Emanuel Peres
- Engineering Department, School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
- Centre for the Research and Technology of Agro-Environmental and Biological Sciences, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
10
UAV Multispectral Image-Based Urban River Water Quality Monitoring Using Stacked Ensemble Machine Learning Algorithms—A Case Study of the Zhanghe River, China. Remote Sensing 2022. [DOI: 10.3390/rs14143272]
Abstract
Timely monitoring of inland water quality using unmanned aerial vehicle (UAV) remote sensing is critical for water environmental conservation and management. In this study, two UAV flights were conducted (one in February and the other in December 2021) to acquire images of the Zhanghe River (China), and a total of 45 water samples were collected concurrently with the image acquisition. Machine learning (ML) methods comprising Multiple Linear Regression, the Least Absolute Shrinkage and Selection Operator, a Backpropagation Neural Network (BP), Random Forest (RF), and eXtreme Gradient Boosting (XGBoost) were applied to retrieve four water quality parameters: chlorophyll-a (Chl-a), total nitrogen (TN), total phosphorus (TP), and permanganate index (CODMn). Then, ML models based on the stacking approach were developed. Results show that stacked ML models could achieve higher accuracy than a single ML model; the optimal methods for Chl-a, TN, TP, and CODMn were RF-XGB, BP-RF, RF, and BP-RF, respectively. For the testing dataset, the R2 values of the best inversion models for Chl-a, TN, TP, and CODMn were 0.504, 0.839, 0.432, and 0.272, the root mean square errors were 1.770 μg L−1, 0.189 mg L−1, 0.053 mg L−1, and 0.767 mg L−1, and the mean absolute errors were 1.272 μg L−1, 0.632 mg L−1, 0.045 mg L−1, and 0.674 mg L−1, respectively. This study demonstrated the great potential of combined UAV remote sensing and stacked ML algorithms for water quality monitoring.
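The stacking idea above (base learners whose out-of-fold predictions are combined by a meta-learner) can be sketched with scikit-learn; the data here are synthetic stand-ins for band reflectances and a water-quality parameter, not the paper's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(120, 4))  # stand-in multispectral band reflectances
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0.0, 0.05, 120)  # stand-in TN concentration

# A "BP-RF"-style stack: a backpropagation network and a random forest
# combined by a linear meta-learner fitted on cross-validated predictions.
stack = StackingRegressor(
    estimators=[
        ("bp", MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)),
        ("rf", RandomForestRegressor(n_estimators=100, random_state=1)),
    ],
    final_estimator=LinearRegression(),
)
stack.fit(X, y)
pred = stack.predict(X)
```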
11
Abstract
In recent years, technological advances have led to the increasing use of unmanned aerial vehicles (UAVs) for forestry applications. One emerging field for drone application is forest health monitoring (FHM). Common approaches to FHM involve small-scale, resource-intensive fieldwork combined with traditional remote sensing platforms. However, the highly dynamic nature of forests requires timely and repetitive data acquisition, often at very high spatial resolution, where conventional remote sensing techniques reach the limits of feasibility. UAVs have shown that they can meet the demands of flexible operation and high spatial resolution. This is also reflected in a rapidly growing number of publications using drones to study forest health. Only a few reviews exist, and they do not cover the whole research history of UAV-based FHM. Since a comprehensive review is becoming critical to identify research gaps, trends, and drawbacks, we offer a systematic analysis of 99 papers covering the last ten years of research related to UAV-based monitoring of forests threatened by biotic and abiotic stressors. Advances in drone technology are being rapidly adopted and put into practice, further improving the economical use of UAVs. Despite the many advantages of UAVs, such as their flexibility, relatively low costs, and the possibility to fly below cloud cover, we also identified some shortcomings: (1) multitemporal and long-term monitoring of forests is clearly underrepresented; (2) hyperspectral and LiDAR sensors are rarely used and their use must drastically increase; (3) complementary data from other remote sensing sources are not sufficiently exploited; (4) the lack of standardized workflows makes it difficult to ensure data uniformity; (5) complex machine learning algorithms and workflows obscure interpretability and hinder widespread adoption; (6) the data pipeline from acquisition to final analysis often relies on commercial software at the expense of open-source tools.
12
UAV Remote Sensing for High-Throughput Phenotyping and for Yield Prediction of Miscanthus by Machine Learning Techniques. REMOTE SENSING 2022. [DOI: 10.3390/rs14122927] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Indexed: 02/04/2023]
Abstract
Miscanthus holds great potential in the frame of the bioeconomy, and yield prediction can help improve Miscanthus' logistic supply chain. Breeding programs in several countries are attempting to produce high-yielding Miscanthus hybrids better adapted to different climates and end-uses. Multispectral images acquired from unmanned aerial vehicles (UAVs) in Italy and in the UK in 2021 and 2022 were used to investigate the feasibility of high-throughput phenotyping (HTP) of novel Miscanthus hybrids for yield prediction and crop trait estimation. An intercalibration procedure was performed using simulated data from the PROSAIL model to link vegetation indices (VIs) derived from two different multispectral sensors. The random forest algorithm estimated yield traits (light interception, plant height, green leaf biomass, and standing biomass) with good accuracy from time series of 15 VIs, and predicted yield from peak descriptors derived from these time series with a root mean square error of 2.3 Mg DM ha−1. The study demonstrates the potential of UAV multispectral images in HTP applications and in yield prediction, providing important information needed to increase sustainable biomass production.
13
Dandrifosse S, Carlier A, Dumont B, Mercatoris B. In-Field Wheat Reflectance: How to Reach the Organ Scale? SENSORS (BASEL, SWITZERLAND) 2022; 22:3342. [PMID: 35591041 PMCID: PMC9101491 DOI: 10.3390/s22093342] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Received: 03/23/2022] [Revised: 04/20/2022] [Accepted: 04/26/2022] [Indexed: 06/15/2023]
Abstract
The reflectance of wheat crops provides information on their architecture or physiology. However, the methods currently used for close-range reflectance computation do not allow for the separation of the wheat canopy organs: the leaves and the ears. This study details a method to achieve high-throughput measurements of wheat reflectance at the organ scale. A nadir multispectral camera array and an incident light spectrometer were used to compute bi-directional reflectance factor (BRF) maps. Image thresholding and deep learning ear detection allowed for the segmentation of the ears and the leaves in the maps. The results showed that the BRF measured on reference targets was constant throughout the day but varied with the acquisition date. The wheat organ BRF was constant throughout the day in very cloudy conditions and with high sun altitudes but showed gradual variations in the morning under a sunny or partially cloudy sky. Consequently, measurements should be performed close to solar noon, and the reference panel should be captured at the beginning and end of each field trip to correct the BRF. With these precautions, the method was tested throughout the wheat growing season on two varieties and various canopy architectures generated by a fertilization gradient. The method yielded consistent reflectance dynamics in all scenarios.
Affiliation(s)
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium; (A.C.); (B.M.)
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium; (A.C.); (B.M.)
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium;
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium; (A.C.); (B.M.)
14
Machine Learning-Based Approaches for Predicting SPAD Values of Maize Using Multi-Spectral Images. REMOTE SENSING 2022. [DOI: 10.3390/rs14061337] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Indexed: 02/04/2023]
Abstract
Precisely monitoring the growth condition and nutritional status of maize is crucial for optimizing agronomic management and improving agricultural production. Multi-spectral sensors are widely applied in ecological and agricultural domains. However, images collected under varying weather conditions on multiple days lack data consistency. In this study, a Mini MCA 6 camera on a UAV platform was used to collect images covering different growth stages of maize. The empirical line calibration method was applied to establish generic equations for radiometric calibration. The coefficient of determination (R2) between the reflectance from calibrated images and ASD Handheld-2 measurements ranged from 0.964 to 0.988 (calibration) and from 0.874 to 0.927 (validation). The root mean square errors (RMSE) were 0.110, 0.089, and 0.102% for validation using data from 5 August, 21 September, and both days in 2019, respectively. Soil and plant analyzer development (SPAD) values were measured and used to build linear regression relationships with spectral and textural indices at different growth stages. A stepwise regression model (SRM) was applied to identify the optimal combination of spectral and textural indices for estimating SPAD values. Support vector machine (SVM) and random forest (RF) models were then independently applied to estimate SPAD values based on the optimal combinations. SVM performed better than RF in estimating SPAD values, with an R2 of 0.81 and an RMSE of 0.14. This study contributes to the retrieval of SPAD values based on both spectral and textural indices extracted from multi-spectral images using machine learning methods.
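The empirical line calibration mentioned in this abstract fits a per-band linear relation between image digital numbers (DN) and the known reflectance of reference targets imaged in the scene. A minimal sketch, with invented panel values standing in for lab-measured references:

```python
import numpy as np

# Empirical line method: calibration panels of known reflectance are imaged,
# and a linear DN -> reflectance mapping is fitted per band.
panel_reflectance = np.array([0.03, 0.22, 0.44, 0.64])    # lab-measured (illustrative)
panel_dn          = np.array([310., 1450., 2780., 3990.]) # image digital numbers

gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)

def dn_to_reflectance(dn):
    """Apply the fitted empirical line to raw digital numbers of this band."""
    return gain * dn + offset

print(dn_to_reflectance(np.array([1000., 2000., 3000.])))
```

A separate (gain, offset) pair would be fitted for each spectral band and, if conditions change, for each acquisition date.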
15
Spectral Comparison of UAV-Based Hyper and Multispectral Cameras for Precision Viticulture. REMOTE SENSING 2022. [DOI: 10.3390/rs14030449] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Indexed: 11/16/2022]
Abstract
Analysis of the spectral response of vegetation using optical sensors for non-destructive remote monitoring is a key element of crop monitoring. Considering the wide presence on the market of commercial solutions based on unmanned aerial vehicles (UAVs), clear information on the performance of these products is needed to guide end-users in their choice and utilization for precision agriculture applications. This work compares two UAV-based commercial products, the DJI P4M and the SENOP HSC-2, for the acquisition of multispectral and hyperspectral images, respectively, in vineyards. The accuracy of both cameras was evaluated on six different targets commonly found in vineyards: bare soil, bare-stony soil, stony soil, soil with dry grass, partially grass-covered soil, and canopy. Given the importance of radiometric calibration, four methods for multispectral image correction were evaluated, taking into account the irradiance sensor equipped on the camera (M1–M2) and the use of an empirical line model (ELM) based on reference reflectance panels (M3–M4). In addition, different DJI P4M exposure setups were evaluated. The performance of the cameras was evaluated by calculating three widely used vegetation indices (VIs) and their percentage error (PE) with respect to ground-truth spectroradiometer measurements. The results highlighted the importance of reference panels for the radiometric calibration of multispectral images (M1–M2 average PE = 21.8–100.0%; M3–M4 average PE = 11.9–29.5%). Generally, the hyperspectral camera provided the best accuracy, with a PE ranging between 1.0% and 13.6%. Both cameras showed higher performance on the pure canopy pixel target than on mixed targets; however, this issue can be readily addressed by applying widespread segmentation techniques for row extraction.
This work provides insights to assist end-users in UAV-based spectral monitoring to obtain reliable information for the analysis of spatio-temporal variability within vineyards.
16
Sharma P, Leigh L, Chang J, Maimaitijiang M, Caffé M. Above-Ground Biomass Estimation in Oats Using UAV Remote Sensing and Machine Learning. SENSORS (BASEL, SWITZERLAND) 2022; 22:601. [PMID: 35062559 PMCID: PMC8778966 DOI: 10.3390/s22020601] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Received: 12/10/2021] [Revised: 01/08/2022] [Accepted: 01/09/2022] [Indexed: 02/01/2023]
Abstract
Current strategies for phenotyping above-ground biomass in field breeding nurseries demand significant investment in both time and labor. Unmanned aerial vehicles (UAVs) can be used to derive vegetation indices (VIs) with high throughput and could provide an efficient way to predict forage yield with high accuracy. The main objective of this study was to investigate the potential of UAV-based multispectral data and machine learning approaches for estimating oat biomass. A UAV equipped with a multispectral sensor was flown over three experimental oat fields in Volga, South Shore, and Beresford, South Dakota, USA, throughout the pre- and post-heading growth phases of oats in 2019. A variety of VIs derived from UAV-based multispectral imagery were employed to build oat biomass estimation models using four machine-learning algorithms: partial least squares (PLS), support vector machine (SVM), artificial neural network (ANN), and random forest (RF). The results showed that several VIs derived from the UAV-collected images were significantly positively correlated with dry biomass for Volga and Beresford (r = 0.2–0.65); in South Shore, however, VIs were not significantly or only weakly correlated with biomass. For Beresford, approximately 70% of the variance was explained by PLS, RF, and SVM validation models using data collected during the post-heading phase. For Volga, validation models had a lower coefficient of determination (R2 = 0.20–0.25) and higher error (RMSE = 700–800 kg/ha) than training models (R2 = 0.50–0.60; RMSE = 500–690 kg/ha). In South Shore, validation models were only able to explain approximately 15–20% of the variation in biomass, possibly because of the non-significant correlations between VIs and biomass. Overall, this study indicates that airborne remote sensing with machine learning has potential for above-ground biomass estimation in oat breeding nurseries.
The main limitation was inconsistent accuracy in model prediction across locations. Multiple-year spectral data, along with features such as crop surface model (CSM)-derived height and volumetric indicators, should be considered in future studies when estimating biophysical parameters like biomass.
Affiliation(s)
- Prakriti Sharma
- Department of Agronomy, Horticulture and Plant Science, South Dakota State University, Brookings, SD 57007, USA; (P.S.); (J.C.)
- Larry Leigh
- Image Processing Lab., Department of Electrical Engineering and Computer Science, South Dakota State University, Brookings, SD 57007, USA;
- Jiyul Chang
- Department of Agronomy, Horticulture and Plant Science, South Dakota State University, Brookings, SD 57007, USA; (P.S.); (J.C.)
- Maitiniyazi Maimaitijiang
- Department of Geography & Geospatial Sciences, South Dakota State University, Brookings, SD 57007, USA;
- Melanie Caffé
- Department of Agronomy, Horticulture and Plant Science, South Dakota State University, Brookings, SD 57007, USA; (P.S.); (J.C.)
17
Comparison of Multi-Methods for Identifying Maize Phenology Using PhenoCams. REMOTE SENSING 2022. [DOI: 10.3390/rs14020244] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Indexed: 12/10/2022]
Abstract
Accurately identifying the phenology of summer maize is crucial for both cultivar breeding and fertilizer control in precision agriculture. In this study, daily RGB images covering the entire growth of summer maize were collected using phenocams at sites in Shangqiu (2018, 2019 and 2020) and Nanpi (2020) in China. Four phenological dates, comprising six leaves, booting, heading and maturity of summer maize, were pre-defined and extracted from the phenocam-based images. Spectral indices, textural indices and integrated spectral-textural indices were calculated using the improved adaptive feature-weighting method. The double logistic function (DLF), harmonic analysis of time series, Savitzky–Golay filtering and spline interpolation were applied to filter these indices, and the pre-defined phenology was identified and compared with ground observations. The results show that the DLF achieved the highest accuracy, with a coefficient of determination (R2) of 0.86 and a root-mean-square error (RMSE) of 9.32 days. The new integrated index performed better than spectral or textural indices used alone, with an R2 of 0.92 and an RMSE of 9.38 days. Phenological extraction using the new index and the double logistic function based on PhenoCam data was effective and convenient, obtaining high accuracy. The new index integrating spectral and textural indices is therefore recommended for extracting maize phenology from PhenoCam data.
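The double logistic filtering this abstract names is a standard way to smooth a greenness time series: a rising sigmoid for green-up and a falling one for senescence. A minimal sketch on a synthetic index series (the parameter names and values are illustrative, not the study's):

```python
import numpy as np
from scipy.optimize import curve_fit

def double_logistic(t, vmin, vmax, s1, r1, s2, r2):
    """Classic double logistic greenness curve: rise around DOY s1, fall around s2."""
    return vmin + (vmax - vmin) * (1.0 / (1.0 + np.exp(-r1 * (t - s1)))
                                   - 1.0 / (1.0 + np.exp(-r2 * (t - s2))))

doy = np.arange(150, 280)                        # day of year
true = double_logistic(doy, 0.1, 0.8, 175, 0.15, 255, 0.12)
rng = np.random.default_rng(2)
obs = true + rng.normal(0, 0.02, doy.size)       # noisy daily index values

# Fit the six parameters to the noisy series; s1/s2 recover transition dates.
p0 = [0.1, 0.8, 170.0, 0.1, 250.0, 0.1]
params, _ = curve_fit(double_logistic, doy, obs, p0=p0, maxfev=10000)
print(f"estimated green-up DOY = {params[2]:.1f}, senescence DOY = {params[4]:.1f}")
```

Phenological dates such as heading or maturity are then read off the fitted curve (e.g., inflection points or threshold crossings), rather than from the noisy raw series.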
18
Weed Detection in Rice Fields Using Remote Sensing Technique: A Review. APPLIED SCIENCES-BASEL 2021. [DOI: 10.3390/app112210701] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Indexed: 12/14/2022]
Abstract
This paper reviewed weed problems in agriculture and how remote sensing techniques can detect weeds in rice fields. The comparison of weed detection between traditional practices and automated detection using remote sensing platforms is discussed. The ideal stage for controlling weeds in rice fields is highlighted, and the types of weeds usually found in paddy fields are listed. Weed detection using remote sensing techniques and the algorithms commonly used to differentiate weeds from crops are discussed. Because weed detection in rice fields using remote sensing platforms is still in its early stages, weed detection in other crops is also covered. Results show that machine learning (ML) and deep learning (DL) techniques have successfully produced high-accuracy maps for detecting weeds in crops using remote sensing platforms. This technology therefore has a positive impact on weed management in many aspects, especially from an economic perspective, and its implementation in agricultural development could be extended further.
19
Integrating Spectral and Textural Information for Monitoring the Growth of Pear Trees Using Optical Images from the UAV Platform. REMOTE SENSING 2021. [DOI: 10.3390/rs13091795] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Indexed: 11/17/2022]
Abstract
With recent developments in unmanned aerial vehicle (UAV) remote sensing, the growth condition of trees can be monitored with data of high temporal and spatial resolution. In this study, daily high-throughput RGB images of pear trees were captured from a UAV platform. A new index was generated by integrating the spectral and textural information using the improved adaptive feature weighting method (IAFWM). The inter-relationships of the air climatic variables and the soil's physical properties (temperature, humidity and conductivity) were first assessed using principal component analysis (PCA). Climatic variables were selected to independently build linear regression models with the new index once the cumulative variance explained reached 99.53%. Among the air climatic variables tested, humidity was the dominant influencing factor for the growth of the pear trees (R2 = 0.120, p = 0.205), while humidity (%) at 40 cm soil depth showed the strongest linear relationship among the soil climatic variables (R2 = 0.642, p < 0.001). The impact of the soil climatic variables was generally greater than that of the air variables, and R2 grew larger with increasing soil depth. The effects of fluctuations in the soil-climatic variables on pear tree growth were detected using the sliding window method (SWM); the maximum absolute coefficients for air temperature, soil temperature, soil humidity, and soil conductivity occurred on days of year (DOY) 221, 227, 228, and 226, respectively. Thus, the impact of fluctuations in these climatic variables on the growth of pear trees can last 14, 8, 7, and 9 days, respectively. The integrated new index is therefore highly recommended for exploring the long-term impact of climate on pear growth.
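The sliding window method used in this abstract amounts to computing a correlation between a climatic driver and a growth index within each window of a time series. A minimal sketch on synthetic data (the 7-day lag and all values are invented for illustration):

```python
import numpy as np

def sliding_corr(x, y, window):
    """Pearson r between x and y inside each sliding window of fixed length."""
    out = []
    for i in range(len(x) - window + 1):
        out.append(np.corrcoef(x[i:i + window], y[i:i + window])[0, 1])
    return np.array(out)

rng = np.random.default_rng(3)
days = np.arange(60)
soil_temp = 20 + 5 * np.sin(days / 9.0) + rng.normal(0, 0.3, 60)
growth = np.roll(soil_temp, 7) + rng.normal(0, 0.3, 60)   # lagged growth response

r = sliding_corr(soil_temp, growth, window=14)
peak = int(np.argmax(np.abs(r)))
print(f"strongest coupling in window starting day {peak}, |r| = {abs(r[peak]):.2f}")
```

Tracking where |r| peaks and how long it stays high is how a lasting impact window (e.g., the 14-, 8-, 7-, and 9-day durations above) would be read off.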
20
Wang T, Liu Y, Wang M, Fan Q, Tian H, Qiao X, Li Y. Applications of UAS in Crop Biomass Monitoring: A Review. FRONTIERS IN PLANT SCIENCE 2021; 12:616689. [PMID: 33897719 PMCID: PMC8062761 DOI: 10.3389/fpls.2021.616689] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Received: 10/13/2020] [Accepted: 03/18/2021] [Indexed: 06/12/2023]
Abstract
Biomass is an important indicator for evaluating crops. Rapid, accurate, and nondestructive monitoring of biomass is key to smart agriculture and precision agriculture. Traditional detection methods are based on destructive measurements. Although satellite remote sensing, manned airborne equipment, and vehicle-mounted equipment can collect measurements nondestructively, they are limited by low accuracy, poor flexibility, and high cost. As nondestructive remote sensing equipment with high precision, high flexibility, and low cost, unmanned aerial systems (UAS) have been widely used to monitor crop biomass. In this review, UAS platforms and sensors, biomass indices, and data analysis methods are presented. Recent improvements of UAS in monitoring crop biomass are introduced, and multisensor fusion, multi-index fusion, the consideration of features not directly related to biomass, the adoption of advanced algorithms and the use of low-cost sensors are reviewed to highlight the potential for monitoring crop biomass with UAS. Considering the progress made, we also suggest some directions for future research. It is expected that the challenges to wider UAS adoption will be overcome, which would support the realization of smart agriculture and precision agriculture.
Affiliation(s)
- Tianhai Wang
- College of Mechanical Engineering, Guangxi University, Nanning, China
- Yadong Liu
- College of Mechanical Engineering, Guangxi University, Nanning, China
- Minghui Wang
- College of Mechanical Engineering, Guangxi University, Nanning, China
- Qing Fan
- College of Civil Engineering and Architecture, Guangxi University, Nanning, China
- Hongkun Tian
- College of Mechanical Engineering, Guangxi University, Nanning, China
- Xi Qiao
- Guangdong Laboratory of Lingnan Modern Agriculture, Shenzhen, Genome Analysis Laboratory of the Ministry of Agriculture and Rural Area, Agricultural Genomics Institute at Shenzhen, Chinese Academy of Agricultural Sciences, Shenzhen, China
- Guangzhou Key Laboratory of Agricultural Products Quality & Safety Traceability Information Technology, Zhongkai University of Agriculture and Engineering, Guangzhou, China
- Yanzhou Li
- College of Mechanical Engineering, Guangxi University, Nanning, China
21
Assessing the Effect of Drought on Winter Wheat Growth Using Unmanned Aerial System (UAS)-Based Phenotyping. REMOTE SENSING 2021. [DOI: 10.3390/rs13061144] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Indexed: 11/29/2022]
Abstract
Drought significantly limits wheat productivity across temporal and spatial domains. Unmanned Aerial Systems (UAS) have become an indispensable tool to collect imagery at refined spatial and high temporal resolution. A 2-year field study was conducted in 2018 and 2019 to determine the temporal effects of drought on canopy growth of winter wheat. Weekly UAS data were collected using red, green, and blue (RGB) and multispectral (MS) sensors over a yield trial consisting of 22 winter wheat cultivars in both irrigated and dryland environments. Raw images were processed to compute canopy features such as canopy cover (CC) and canopy height (CH), and vegetation indices (VIs) such as the Normalized Difference Vegetation Index (NDVI), Excess Green Index (ExG), and Normalized Difference Red-edge Index (NDRE). The drought was more severe in 2018 than in 2019, and the effects of growth differences across years and irrigation levels were visible in the UAS measurements. CC, CH, and VIs measured during grain filling were positively correlated with grain yield (r = 0.4–0.7, p < 0.05) in the dryland in both years. Yield was positively correlated with VIs in 2018 (r = 0.45–0.55, p < 0.05) in the irrigated environment, but the correlations were non-significant in 2019 (r = 0.1 to −0.4), except for CH. The study shows that high-throughput UAS data can be used to monitor drought effects on wheat growth and productivity across temporal and spatial domains.
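The three indices named in this abstract have standard band-ratio definitions, which can be sketched directly (the per-pixel reflectance values below are illustrative, not from the study):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def ndre(nir, rededge):
    """Normalized Difference Red-edge Index: (NIR - RE) / (NIR + RE)."""
    return (nir - rededge) / (nir + rededge)

def excess_green(r, g, b):
    """ExG on chromatic coordinates: 2g - r - b with bands normalized by their sum."""
    total = r + g + b
    return 2 * g / total - r / total - b / total

# Illustrative reflectances for a healthy canopy pixel.
print(ndvi(0.45, 0.08), ndre(0.45, 0.30), excess_green(0.10, 0.30, 0.08))
```

Applied to whole orthomosaic arrays, these functions vectorize over numpy images, so plot-level means can be pulled out per cultivar and date.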
22
A Robust Vegetation Index Based on Different UAV RGB Images to Estimate SPAD Values of Naked Barley Leaves. REMOTE SENSING 2021. [DOI: 10.3390/rs13040686] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Indexed: 11/16/2022]
Abstract
Chlorophyll content in plant leaves is an essential indicator of the growth condition of naked barley crops and of the effect of fertilization management. Soil plant analysis development (SPAD) values strongly correlate with leaf chlorophyll contents. Unmanned aerial vehicles (UAVs) can provide an efficient way to retrieve SPAD values on a relatively large scale with high temporal resolution, but UAVs carrying high-cost multispectral or hyperspectral sensors may be a substantial economic burden for smallholder farmers. To overcome this shortcoming, we investigated the potential of a UAV carrying a commercial digital camera for estimating the SPAD values of naked barley leaves. We related 21 color-based vegetation indices (VIs), calculated from UAV images acquired at two flight heights (6.0 m and 50.0 m above ground level) in four different growth stages, to SPAD values. Our results indicated that vegetation extraction and masking of naked barley ears could improve the correlation between image-calculated VIs and SPAD values. The VIs 'L*', 'b*', 'G − B' and '2G − R − B' showed significant correlations with SPAD values of naked barley leaves at both flight heights. Validation of the regression model showed that 'G − B' could be regarded as the most robust vegetation index for predicting the SPAD values of naked barley leaves across different images and flight heights. Our study demonstrated that a UAV carrying a commercial camera has great potential for retrieving SPAD values of naked barley leaves under unstable photography conditions, and such a low-cost measurement system makes crop monitoring affordable for farmers.
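A simple color index such as 'G − B' is just a per-plot difference of mean green and blue digital numbers, regressed against ground SPAD readings. A minimal sketch on invented plot values (not the study's data):

```python
import numpy as np

# Mean green and blue digital numbers (0-255) per plot, with ground SPAD readings.
# All values below are synthetic, chosen only to illustrate the workflow.
g = np.array([142., 150., 133., 160., 128., 155.])
b = np.array([ 90.,  85., 100.,  78., 104.,  82.])
spad = np.array([38.1, 41.0, 33.5, 44.8, 31.2, 43.0])

gb = g - b                                  # the 'G - B' color index per plot
slope, intercept = np.polyfit(gb, spad, 1)  # simple linear calibration
pred = slope * gb + intercept
ss_res = np.sum((spad - pred) ** 2)
ss_tot = np.sum((spad - spad.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"SPAD ≈ {slope:.3f}*(G−B) + {intercept:.2f},  R2 = {r2:.2f}")
```

The robustness claim in the abstract corresponds to this calibration holding across images and flight heights, which would be checked on held-out acquisitions.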
23
Radiometric Correction of Multispectral UAS Images: Evaluating the Accuracy of the Parrot Sequoia Camera and Sunshine Sensor. REMOTE SENSING 2021. [DOI: 10.3390/rs13040577] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Indexed: 11/16/2022]
Abstract
Unmanned aerial systems (UAS) carrying commercially sold multispectral sensors equipped with a sunshine sensor, such as the Parrot Sequoia, enable mapping of vegetation at high spatial resolution with a large degree of flexibility in planning data collection. It is, however, a challenge to perform radiometric correction of the images to create reflectance maps (orthomosaics with surface reflectance) and to compute vegetation indices with sufficient accuracy to enable comparisons between data collected at different times and locations. Studies have compared different radiometric correction methods applied to the Sequoia camera, but there is no consensus about a standard method that provides consistent results for all spectral bands and for different flight conditions. In this study, we perform experiments to assess the accuracy of the Parrot Sequoia camera and sunshine sensor and to get an indication of whether the quality of the collected data is sufficient to create accurate reflectance maps. In addition, we study whether the atmosphere influences the images and suggest a workflow to collect and process images to create a reflectance map. The main findings are that the sensitivity of the camera is influenced by camera temperature and that the atmosphere influences the images. Hence, we suggest letting the camera warm up before image collection and capturing images of reflectance calibration panels at an elevation close to the maximum flying height to compensate for the influence of the atmosphere. The results also show a strong influence of the orientation of the sunshine sensor, which introduces noise and limits the use of the raw sunshine sensor data to compensate for differences in light conditions. To handle this noise, we fit smoothing functions to the sunshine sensor data before performing irradiance normalization of the images.
The developed workflow is evaluated against data from a handheld spectroradiometer, giving the highest correlation (R2 = 0.99) for the normalized difference vegetation index (NDVI). For the individual wavelength bands, R2 was 0.80–0.97 for the red-edge, near-infrared, and red bands.
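The smoothing-before-normalization step described in this abstract can be sketched with a Savitzky–Golay filter; the authors' exact smoothing function is not specified here, and the irradiance series below is synthetic:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 200)                       # normalized flight time
irradiance = 800 + 100 * np.sin(2 * np.pi * t)   # slow drift in sunlight (W/m^2)
noisy = irradiance + rng.normal(0, 40, t.size)   # orientation-induced sensor noise

# Smooth the sunshine-sensor record so only the slow illumination drift remains.
smooth = savgol_filter(noisy, window_length=31, polyorder=2)

# Each image's digital numbers would then be divided by the relative
# irradiance at its capture time (illustrative normalization factor).
scale = smooth / smooth.mean()
print(f"residual after smoothing: {np.std(smooth - irradiance):.1f} W/m^2")
```

The window length trades responsiveness against noise suppression; it would be tuned to the flight duration and sampling rate of the sunshine sensor.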
24
López-Vicente M, Kramer H, Keesstra S. Effectiveness of soil erosion barriers to reduce sediment connectivity at small basin scale in a fire-affected forest. JOURNAL OF ENVIRONMENTAL MANAGEMENT 2021; 278:111510. [PMID: 33120091 DOI: 10.1016/j.jenvman.2020.111510] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Received: 05/24/2020] [Revised: 08/30/2020] [Accepted: 10/11/2020] [Indexed: 06/11/2023]
Abstract
Forest fires and post-fire management practices (PFMP) cause changes in the hydrological response of a hillslope. This study evaluates the effect of log erosion barriers (LB) and Easy-Barriers® (EB) on the spatial patterns and values of structural sediment connectivity (SC) in a Mediterranean mountainous pine forest affected by an arson fire in August 2017. A drone flight was carried out in July 2019 (23 months after the fire and 11 months after the PFMP) to obtain a high-resolution orthomosaic and DEM (at 0.05 m). Two contrasting areas, with and without PFMP, were selected along the same hillslope, and 26 small basins were identified: 16 in the treated area (mean area, slope and vegetation recovery of 916 m2, 60% and 25%; with 94 LB and 39 EB) and 10 in the untreated area (1952 m2, 75% and 20%). The aggregated index of sediment connectivity (AIC) was chosen to compute SC in three temporal scenarios: before the fire, just after the fire, and after all PFMP were implemented, including the incipient vegetation recovery. Output normalization allowed comparison among the non-nested basins. After accounting for the intrinsic differences among the basins and areas, and the temporal changes of SC between the three scenarios, the contribution of the barriers was estimated at 27% of the total decrease of SC in the treated area (−8.5%); the remaining 73% was explained by the vegetation recovery. The effectiveness of the LB (11.3% on average) and EB (13.4%) did not diminish with increasing slope gradients. These percentages are notable considering the small area affected by the LB (2.8%) and EB (1.3%). Independent metrics (convergence index, flow width, flat areas and LS factor) also showed clear differences between the two areas (higher soil erosive intensity in the untreated area), in accordance with the AIC results.
Affiliation(s)
- Manuel López-Vicente
- Team Soil, Water and Land Use, Wageningen Environmental Research, Droevendaalsesteeg 3, Wageningen, 6708RC, Netherlands.
- Henk Kramer
- Team Earth Informatics, Wageningen Environmental Research, Droevendaalsesteeg 3, Wageningen, 6708RC, Netherlands.
- Saskia Keesstra
- Team Soil, Water and Land Use, Wageningen Environmental Research, Droevendaalsesteeg 3, Wageningen, 6708RC, Netherlands.
25
Guo Y, Yin G, Sun H, Wang H, Chen S, Senthilnath J, Wang J, Fu Y. Scaling Effects on Chlorophyll Content Estimations with RGB Camera Mounted on a UAV Platform Using Machine-Learning Methods. SENSORS 2020; 20:s20185130. [PMID: 32916808 PMCID: PMC7570550 DOI: 10.3390/s20185130] [Citation(s) in RCA: 20] [Impact Index Per Article: 5.0] [Received: 07/27/2020] [Revised: 09/02/2020] [Accepted: 09/04/2020] [Indexed: 01/07/2023]
Abstract
Timely monitoring and precise estimation of the leaf chlorophyll contents of maize are crucial for agricultural practices. Scale effects are important because the vegetation indices (VIs) calculated from imagery are crucial for quantitative remote sensing. In this study, scale effects were investigated by analyzing the linear relationships between VIs calculated from red–green–blue (RGB) images from unmanned aerial vehicles (UAVs) and ground leaf chlorophyll contents of maize measured using a SPAD-502. Scale impacts were assessed by applying different flight altitudes, and the highest coefficient of determination (R2) reached 0.85. We found that VIs from images acquired at a flight altitude of 50 m were better for estimating leaf chlorophyll contents using the DJI UAV platform with this specific camera (5472 × 3648 pixels). Moreover, three machine-learning (ML) methods, comprising backpropagation neural network (BP), support vector machine (SVM), and random forest (RF), were applied for grid-based chlorophyll content estimation based on the common VIs. The average root mean square errors (RMSE) of the chlorophyll content estimations were 3.85, 3.11, and 2.90 for BP, SVM, and RF, respectively; the corresponding mean absolute errors (MAE) were 2.947, 2.460, and 2.389. Thus, the ML methods achieved relatively high precision in chlorophyll content estimation using VIs; in particular, RF performed better than BP and SVM. Our findings suggest that ML methods combined with RGB images from this camera acquired at a flight altitude of 50 m (spatial resolution 0.018 m) can be effectively applied for estimating leaf chlorophyll content in agriculture.
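The BP/SVM/RF comparison by RMSE and MAE reported in this abstract can be sketched as below; this is a generic reconstruction on synthetic VI data using scikit-learn stand-ins (MLPRegressor for the backpropagation network), not the study's models or data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
vi = rng.uniform(0.1, 0.9, size=(120, 4))            # grid-cell VI features
chl = 30 + 40 * vi[:, 0] + rng.normal(0, 1.5, 120)   # SPAD-like target

X_tr, X_te, y_tr, y_te = train_test_split(vi, chl, test_size=0.3, random_state=0)
models = {"BP": MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
          "SVM": SVR(),
          "RF": RandomForestRegressor(n_estimators=200, random_state=0)}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.2f}  MAE={mean_absolute_error(y_te, pred):.2f}")
```

Ranking the three models by held-out RMSE/MAE mirrors how the study concluded that RF outperformed BP and SVM.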
Affiliation(s)
- Yahui Guo
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (G.Y.); (S.C.)
- Guodong Yin
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (G.Y.); (S.C.)
- Hongyong Sun
- The Center for Agricultural Resources Research, Institute of Genetics and Developmental Biology, The Chinese Academy of Sciences, 286 Huaizhong Road, Shijiazhuang 050021, China;
- Hanxi Wang
- State Environmental Protection Key Laboratory of Wetland Ecology and Vegetation Restoration/School of Environment, Northeast Normal University, Jingyue Street 2555, Changchun 130017, China;
- Shouzhi Chen
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (G.Y.); (S.C.)
- J. Senthilnath
- Institute for Infocomm Research, Agency for Science, Technology and Research (A*STAR), Singapore 138632, Singapore;
- Jingzhe Wang
- MNR Key Laboratory for Geo-Environmental Monitoring of Great Bay Area of the Ministry of Natural Resources & Guangdong Key Laboratory of Urban Informatics & Shenzhen Key Laboratory of Spatial Smart Sensing and Services, Shenzhen University, Shenzhen 518060, China;
- Yongshuo Fu
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (G.Y.); (S.C.)
- Correspondence:
|
26
|
Guo Y, Wang H, Wu Z, Wang S, Sun H, Senthilnath J, Wang J, Robin Bryant C, Fu Y. Modified Red Blue Vegetation Index for Chlorophyll Estimation and Yield Prediction of Maize from Visible Images Captured by UAV. SENSORS (BASEL, SWITZERLAND) 2020; 20:E5055. [PMID: 32899582 PMCID: PMC7570511 DOI: 10.3390/s20185055] [Citation(s) in RCA: 27] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/28/2020] [Revised: 08/30/2020] [Accepted: 09/02/2020] [Indexed: 11/22/2022]
Abstract
The vegetation index (VI) has been successfully used to monitor crop growth and predict yield. In this paper, a long-term observation campaign was conducted for maize yield prediction using an unmanned aerial vehicle (UAV), with chlorophyll contents estimated using SPAD-502. A new vegetation index, the modified red blue VI (MRBVI), was developed to monitor growth and predict maize yields by relating MRBVI to SPAD-502-based chlorophyll contents. The coefficients of determination (R2) were 0.462 and 0.570 for chlorophyll estimation and yield prediction using MRBVI, relatively better than the results from the seven other commonly used VIs. All VIs during the different growth stages of maize were calculated and compared directly with the measured chlorophyll contents, and MRBVI had the lowest relative error (RE) at 0.355. Further, machine-learning (ML) methods, the backpropagation neural network (BP), support vector machine (SVM), random forest (RF), and extreme learning machine (ELM), were adopted for predicting maize yields. The VIs calculated from each image captured during important phenological stages of maize were set as independent variables, and the corresponding yield of each plot as the dependent variable. The ML models were evaluated with the leave-one-out (LOO) method; the root mean square errors (RMSE) were 2.157, 1.099, 1.146, and 1.698 (g/hundred-grain weight) and the mean absolute errors (MAE) were 1.739, 0.886, 0.925, and 1.356 (g/hundred-grain weight) for BP, SVM, RF, and ELM, respectively. Thus, SVM predicted maize yields better than the other ML methods. It is therefore strongly suggested that MRBVI calculated from images acquired at different growth stages, integrated with advanced ML methods, be used for agricultural and ecological chlorophyll estimation and yield prediction.
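The leave-one-out evaluation described above can be sketched as follows: stage-wise VIs as predictors, per-plot hundred-grain weight as the target, and an SVM regressor scored under LOO. The MRBVI formula is not reproduced here, so the feature matrix and target values below are synthetic and purely illustrative.

```python
# Sketch: leave-one-out SVM regression for per-plot yield (synthetic data).
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (30, 4))  # 30 plots x 4 growth-stage VIs (assumed layout)
y = 25 + 8 * X.mean(axis=1) + rng.normal(0, 0.5, 30)  # hundred-grain weight (g)

# Scale features before the SVM; each LOO fold trains on 29 plots.
svr = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
pred = cross_val_predict(svr, X, y, cv=LeaveOneOut())
rmse = float(np.sqrt(np.mean((y - pred) ** 2)))
mae = float(np.mean(np.abs(y - pred)))
print(f"LOO RMSE: {rmse:.3f}  MAE: {mae:.3f}")
```

Swapping `SVR` for `RandomForestRegressor` or an MLP reproduces the BP/RF comparison under the same LOO protocol.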
Affiliation(s)
- Yahui Guo
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (Z.W.); (S.W.)
- Hanxi Wang
- State Environmental Protection Key Laboratory of Wetland Ecology and Vegetation Restoration/School of Environment, Northeast Normal University, Jingyue Street 2555, Changchun 130017, China;
- Zhaofei Wu
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (Z.W.); (S.W.)
- Shuxin Wang
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (Z.W.); (S.W.)
- Hongyong Sun
- The Center for Agricultural Resources Research, Institute of Genetics and Developmental Biology, The Chinese Academy of Sciences, 286 Huaizhong Road, Shijiazhuang 050021, China;
- J. Senthilnath
- Institute for Infocomm Research, Agency for Science, Technology and Research (A*STAR), Singapore 138632, Singapore;
- Jingzhe Wang
- MNR Key Laboratory for Geo-Environmental Monitoring of Great Bay Area of the Ministry of Natural Resources & Guangdong Key Laboratory of Urban Informatics & Shenzhen Key Laboratory of Spatial Smart Sensing and Services, Shenzhen University, Shenzhen 518060, China;
- Christopher Robin Bryant
- The School of Environmental Design and Rural Development, University of Guelph, Guelph, ON N1G 2W1, Canada;
- Yongshuo Fu
- Beijing Key Laboratory of Urban Hydrological Cycle and Sponge City Technology, College of Water Sciences, Beijing Normal University, Beijing 100875, China; (Y.G.); (Z.W.); (S.W.)
|
27
|
Deep TEC: Deep Transfer Learning with Ensemble Classifier for Road Extraction from UAV Imagery. REMOTE SENSING 2020. [DOI: 10.3390/rs12020245] [Citation(s) in RCA: 27] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Unmanned aerial vehicle (UAV) remote sensing has a wide range of applications, and in this paper we address one such problem: road extraction from UAV-captured RGB images. The key challenge is to solve road extraction using multiple UAV remote sensing scene datasets acquired with different sensors over different locations. We aim to extract knowledge from a dataset available in the literature and apply it to our own dataset. The paper presents a novel method, deep TEC (deep transfer learning with ensemble classifier), for road extraction from UAV imagery. Deep TEC operates in two stages: deep transfer learning and ensemble classification. In the first stage, deep learning methods, the conditional generative adversarial network, the cycle generative adversarial network, and the fully convolutional network, are pre-trained on a benchmark UAV road extraction dataset from the literature. With this extracted knowledge (the pre-trained models), road regions are then extracted from our UAV-acquired images. Finally, ensemble classification is carried out on the road-classified images. Deep TEC achieves an average quality of 71%, 10% higher than the next best standard deep learning method, and also scores higher on completeness, correctness, and F1 measures. The results show that deep TEC is efficient at extracting road networks in an urban region.
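The final ensemble stage can be illustrated with the simplest combination rule: a pixel-wise majority vote over the binary road masks produced by the three pre-trained networks (cGAN, CycleGAN, FCN). The paper's actual ensemble classifier may differ, and the masks below are random stand-ins for real model outputs.

```python
# Sketch: pixel-wise majority vote over three binary road masks (assumed rule).
import numpy as np

rng = np.random.default_rng(2)
h, w = 64, 64

# Three binary road masks, one per pre-trained model (synthetic).
masks = rng.integers(0, 2, size=(3, h, w))

# A pixel is labeled road if at least 2 of the 3 models agree.
ensemble = (masks.sum(axis=0) >= 2).astype(np.uint8)
print(ensemble.shape)
```

Real pipelines often weight each model's vote by its validation accuracy instead of counting votes equally.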
|
28
|
Geometric and Radiometric Consistency of Parrot Sequoia Multispectral Imagery for Precision Agriculture Applications. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9245314] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/29/2023]
Abstract
This paper examines the geometric and radiometric consistency of diverse, overlapping datasets acquired with the Parrot Sequoia camera. The multispectral imagery was acquired over agricultural fields in Northern Italy, with radiometric calibration images taken before each flight. Processing was performed in the Pix4Dmapper suite following a single-block approach: images acquired in different flight missions were processed in as many projects, where different block orientation strategies were adopted and compared. Results were assessed in terms of geometric and radiometric consistency in the overlapping areas. Geometric consistency was evaluated as point cloud distance using iterative closest point (ICP), while radiometric consistency was analyzed by computing differences between the reflectance maps and vegetation indices produced under the adopted processing strategies. For the normalized difference vegetation index (NDVI), a comparison with Sentinel-2 was also made. Results are presented for two (of several) overlapping blocks. Geometric consistency is good (root mean square error (RMSE) on the order of 0.1 m), except when direct georeferencing is used. Radiometric consistency presents larger problems, especially in some bands and in vegetation indices, with differences above 20%. The comparison with Sentinel-2 products shows a general overestimation by Sequoia data but similar spatial variation (Pearson's correlation coefficient of about 0.7, p-value < 2.2 × 10⁻¹⁶).
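The radiometric-consistency check described above can be sketched as: compute NDVI from two overlapping reflectance maps produced by different processing strategies and report the relative difference. The arrays below are synthetic stand-ins for Sequoia red and near-infrared reflectance; the simulated offset is illustrative only.

```python
# Sketch: NDVI difference between two processing strategies (synthetic data).
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-9)

rng = np.random.default_rng(3)
red_a = rng.uniform(0.05, 0.15, (100, 100))
nir_a = rng.uniform(0.3, 0.5, (100, 100))

# Second strategy: simulate a small radiometric calibration offset.
red_b = red_a * 1.1
nir_b = nir_a * 0.95

ndvi_a, ndvi_b = ndvi(nir_a, red_a), ndvi(nir_b, red_b)
rel_diff = np.abs(ndvi_a - ndvi_b) / np.abs(ndvi_a)
print(f"mean relative NDVI difference: {rel_diff.mean():.1%}")
```

The same per-pixel comparison, after resampling to a common grid, underlies the Sentinel-2 cross-check.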
|
29
|
Modeling Climate Change Impacts on Rice Growth and Yield under Global Warming of 1.5 and 2.0 °C in the Pearl River Delta, China. ATMOSPHERE 2019. [DOI: 10.3390/atmos10100567] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/07/2023]
Abstract
In this study, the potential impacts of climate change on rice growth and yield under 1.5 and 2.0 °C warming scenarios are simulated using the CERES-Rice model, based on high-quality agricultural, experimental, meteorological, and soil data in the Pearl River Delta, China. Future climate data are generated by four Global Climate Models (GCMs): the Community Atmosphere Model 4 (CAM4), the European Centre for Medium-Range Weather Forecasts-Hamburg 6 (ECHAM6), the Model for Interdisciplinary Research On Climate 5 (MIROC5), and the Norwegian Earth System Model 1 (NorESM1). The modeling results show major negative impacts of climate change on both rice growth and yields at all study sites. Specifically, the average flowering duration decreases by 2.8 days (3.9 days) and the time to maturity by 11.0 days (14.7 days) under the 1.5 °C (2.0 °C) warming scenario. Yields of early-maturing and late-maturing rice are reduced by 292.5 kg/ha (558.9 kg/ha) and 151.8 kg/ha (380.0 kg/ha), respectively, under the 1.5 °C (2.0 °C) scenario. Shifting planting dates eight days later for early-maturing rice and 15 days earlier for late-maturing rice is simulated to be an effective adaptation. The simulated optimum fertilizer amount is about 240 kg/ha, with both industrial fertilizer and organic matter applied.
|