1. Jin X, Han K, Zhao H, Wang Y, Chen Y, Yu J. Detection and coverage estimation of purple nutsedge in turf with image classification neural networks. Pest Management Science 2024; 80:3504-3515. [PMID: 38436512] [DOI: 10.1002/ps.8055]
Abstract
BACKGROUND Accurate detection of weeds and estimation of their coverage are crucial for implementing precision herbicide applications. Deep learning (DL) techniques are typically used for weed detection and coverage estimation by analyzing information at the pixel or individual-plant level, which requires a substantial amount of annotated data for training. This study aimed to evaluate the effectiveness of image-classification neural networks (NNs) for detecting and estimating weed coverage in bermudagrass turf. RESULTS Weed-detection NNs, including DenseNet, GoogLeNet, and ResNet, exhibited high overall accuracy and F1 scores (≥0.971) throughout the k-fold cross-validation, with DenseNet achieving the highest overall accuracy and F1 scores (0.977). Among the evaluated NNs, DenseNet also showed the highest overall accuracy and F1 scores (0.996) in the validation and testing data sets for estimating weed coverage. The inference speed of ResNet was similar to that of GoogLeNet but noticeably faster than that of DenseNet, making ResNet the most efficient and accurate deep convolutional neural network overall for weed detection and coverage estimation. CONCLUSION These results demonstrate that the developed NNs can effectively detect weeds and estimate their coverage in bermudagrass turf, allowing calculation of herbicide requirements for variable-rate herbicide applications. The proposed method can be employed in a machine vision-based autonomous site-specific spraying system of smart sprayers. © 2024 Society of Chemical Industry.
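The abstract reports k-fold cross-validated overall accuracy and F1 scores. A minimal sketch of that evaluation loop, with synthetic feature vectors and a logistic regression standing in for the paper's CNNs (both assumptions, not the authors' setup):

```python
# Hypothetical k-fold evaluation of a weed/no-weed image classifier.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))        # stand-in image feature vectors
y = rng.integers(0, 2, size=500)      # 0 = turf only, 1 = weed present

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for fold, (tr, te) in enumerate(skf.split(X, y)):
    clf = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    pred = clf.predict(X[te])
    print(f"fold {fold}: OA={accuracy_score(y[te], pred):.3f}, "
          f"F1={f1_score(y[te], pred):.3f}")
```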
Affiliation(s)
- Xiaojun Jin: College of Mechanical and Electronic Engineering, Nanjing Forestry University, Nanjing, China; Peking University Institute of Advanced Agricultural Sciences/Shandong Laboratory of Advanced Agricultural Sciences at Weifang, Weifang, China
- Kang Han: Peking University Institute of Advanced Agricultural Sciences/Shandong Laboratory of Advanced Agricultural Sciences at Weifang, Weifang, China
- Hua Zhao: School of Mechanical Engineering, Jiangsu Ocean University, Lianyungang, China
- Yan Wang: School of Mechanical Engineering, Jiangsu Ocean University, Lianyungang, China
- Yong Chen: College of Mechanical and Electronic Engineering, Nanjing Forestry University, Nanjing, China
- Jialin Yu: Peking University Institute of Advanced Agricultural Sciences/Shandong Laboratory of Advanced Agricultural Sciences at Weifang, Weifang, China

2. Seiche AT, Wittstruck L, Jarmer T. Weed Detection from Unmanned Aerial Vehicle Imagery Using Deep Learning: A Comparison between High-End and Low-Cost Multispectral Sensors. Sensors (Basel) 2024; 24:1544. [PMID: 38475081] [DOI: 10.3390/s24051544]
Abstract
In order to meet the increasing demand for crops under challenging climate conditions, efficient and sustainable cultivation strategies are becoming essential in agriculture. Targeted herbicide use reduces environmental pollution and effectively controls weeds as a major cause of yield reduction. The key requirement is a reliable weed detection system that is accessible to a wide range of end users. This research paper introduces a self-built, low-cost, multispectral camera system and evaluates it against the high-end MicaSense Altum system. Pixel-based weed and crop classification was performed on UAV datasets collected with both sensors in maize using a U-Net. The training and testing data were generated via an index-based thresholding approach followed by annotation. As a result, the F1-score for the weed class reached 82% on the Altum system and 76% on the low-cost system, with recall values of 75% and 68%, respectively. Misclassifications occurred on the low-cost system images for small weeds and overlaps, with minor oversegmentation. However, with a precision of 90%, the results show great potential for application in automated weed control. The proposed system thereby enables sustainable precision farming for the general public. In future research, its spectral properties, as well as its use on different crops with real-time on-board processing, should be further investigated.
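The training and testing data above were generated by index-based thresholding before annotation. A sketch of that bootstrap step, assuming an NDVI cut-off separates vegetation from soil; the band arrays and the 0.4 threshold are illustrative, not values from the paper:

```python
# Index-based thresholding to produce candidate vegetation labels.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index, safe against zero division."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

rng = np.random.default_rng(1)
nir = rng.random((256, 256))          # stand-in near-infrared band
red = rng.random((256, 256))          # stand-in red band

vegetation_mask = ndvi(nir, red) > 0.4   # candidate vegetation pixels
print("vegetation fraction:", vegetation_mask.mean())
```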
Affiliation(s)
- Anna Teresa Seiche: Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany
- Lucas Wittstruck: Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany
- Thomas Jarmer: Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany

3. Kwon SH, Ku KB, Le AT, Han GD, Park Y, Kim J, Tuan TT, Chung YS, Mansoor S. Enhancing citrus fruit yield investigations through flight height optimization with UAV imaging. Sci Rep 2024; 14:322. [PMID: 38172521] [PMCID: PMC10764763] [DOI: 10.1038/s41598-023-50921-8]
Abstract
Citrus fruit yield is essential for market stability, as it allows businesses to plan for production and distribution. However, yield estimation is a complex and time-consuming process that often requires a large number of field samples to ensure representativeness. To address this challenge, we investigated the optimal altitude for unmanned aerial vehicle (UAV) imaging to estimate the yield of Citrus unshiu fruit. We captured images from five altitudes (30 m, 50 m, 70 m, 90 m, and 110 m) and determined that a resolution of approximately 5 pixels/cm is necessary for reliable estimation of fruit size, based on the average diameter of C. unshiu fruit (46.7 mm). Additionally, we found that histogram equalization of the images improved fruit count estimation compared with untreated images: for images captured at 30 m, the untreated images yielded fruit counts of 73, 55, and 88, whereas the histogram-equalized images yielded 88, 71, and 105, against actual counts of 124, 88, and 141. A vegetation index such as IPCA gave estimates similar to histogram equalization, but the I1 estimates showed a gap to actual yields. Our results provide a valuable database for future UAV field investigations of citrus fruit yield, and flying platforms such as UAVs offer a step towards applying this kind of model over ever greater regions at low cost while generating accurate results.
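The pre-processing step credited with better counts is histogram equalization. A short sketch using OpenCV; since cv2.equalizeHist works on single-channel 8-bit images, the luminance channel is equalized in YCrCb space (the counting model itself is not shown, and the input frame is a placeholder):

```python
# Histogram-equalize the luminance of a UAV frame before fruit detection.
import cv2
import numpy as np

bgr = (np.random.default_rng(2).random((480, 640, 3)) * 255).astype(np.uint8)
ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])   # equalize Y (luminance) only
equalized = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```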
Affiliation(s)
- Soon-Hwa Kwon: Citrus Research Institute, National Institute of Horticultural and Herbal Science, Rural Development Administration, Jeju, 63607, Republic of Korea
- Ki Bon Ku: Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Anh Tuan Le: Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Gyung Deok Han: Department of Practical Arts Education, Cheongju National University of Education, Cheongju, 28690, Republic of Korea
- Yosup Park: Citrus Research Institute, National Institute of Horticultural and Herbal Science, Rural Development Administration, Jeju, 63607, Republic of Korea
- Jaehong Kim: Citrus Research Institute, National Institute of Horticultural and Herbal Science, Rural Development Administration, Jeju, 63607, Republic of Korea
- Thai Thanh Tuan: Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Yong Suk Chung: Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea
- Sheikh Mansoor: Department of Plant Resources and Environment, Jeju National University, Jeju, 63243, Republic of Korea

4. Marfo JS, Kyeremeh K, Asamoah P, Owusu-Bio MK, Marfo AFA. Exploring factors affecting the adoption and continuance usage of drone in healthcare: The role of the environment. PLOS Digital Health 2023; 2:e0000266. [PMID: 37934723] [PMCID: PMC10629621] [DOI: 10.1371/journal.pdig.0000266]
Abstract
Drone technologies and healthcare delivery have attracted scholarly attention over the years, and studies have acknowledged the positive impact of adopting and using drone technologies for healthcare delivery. We argue, however, that knowledge is lacking on the role of the environment in the adoption, usage, and continued usage of drone technologies. An examination of 330 health facilities that use drone services from Zipline Ghana showed that the environment inversely moderates the relationship between actual usage and intention to continue usage, suggesting that reducing the influence of environmental factors will increase the impact actual usage has on the continued usage of drone technology in healthcare delivery.
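The moderation claim above can be read as an interaction term in a regression on continuance intention. A hedged sketch with invented variable names and simulated data; the paper's survey items and estimator may differ:

```python
# Moderated regression: usage x environment interaction on continuance.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 330
df = pd.DataFrame({"usage": rng.normal(size=n), "environment": rng.normal(size=n)})
# A negative interaction coefficient means the environment inversely moderates
# the usage -> continuance effect, mirroring the abstract's finding.
df["continuance"] = (0.6 * df["usage"]
                     - 0.3 * df["usage"] * df["environment"]
                     + rng.normal(scale=0.5, size=n))

model = smf.ols("continuance ~ usage * environment", data=df).fit()
print(model.params)   # the usage:environment term carries the moderation effect
```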
Affiliation(s)
- John Serbe Marfo: Supply Chain and Information Systems Department, Kwame Nkrumah University of Science and Technology, Kumasi, Ghana
- Kwadwo Kyeremeh: Department of Accountancy, Sunyani Technical University, Sunyani, Ghana
- Pasty Asamoah: Supply Chain and Information Systems Department, Kwame Nkrumah University of Science and Technology, Kumasi, Ghana
- Matilda Kokui Owusu-Bio: Supply Chain and Information Systems Department, Kwame Nkrumah University of Science and Technology, Kumasi, Ghana

5. Xu B, Meng R, Chen G, Liang L, Lv Z, Zhou L, Sun R, Zhao F, Yang W. Improved weed mapping in corn fields by combining UAV-based spectral, textural, structural, and thermal measurements. Pest Management Science 2023; 79:2591-2602. [PMID: 36883563] [DOI: 10.1002/ps.7443]
Abstract
BACKGROUND Spatially explicit weed information is critical for controlling weed infestation and reducing corn yield losses. The development of unmanned aerial vehicle (UAV)-based remote sensing presents an unprecedented opportunity for efficient, timely weed mapping. Spectral, textural, and structural measurements have been used for weed mapping, whereas thermal measurements, for example canopy temperature (CT), have seldom been considered and used. In this study, we quantified the optimal combination of spectral, textural, structural, and CT measurements based on different machine-learning algorithms for weed mapping. RESULTS CT improved weed-mapping accuracies as complementary information for spectral, textural, and structural features (up to 5% and 0.051 improvements in overall accuracy [OA] and macro-F1, respectively). The fusion of textural, structural, and thermal features achieved the best performance in weed mapping (OA = 96.4%, macro-F1 = 0.964), followed by the fusion of structural and thermal features (OA = 93.6%, macro-F1 = 0.936). The Support Vector Machine-based model achieved the best performance, with improvements of 3.5% and 7.1% in OA and 0.036 and 0.071 in macro-F1, respectively, compared with the best Random Forest and Naïve Bayes Classifier models. CONCLUSION Thermal measurement can complement other types of remote-sensing measurements and improve weed-mapping accuracy within the data-fusion framework. Importantly, integrating textural, structural, and thermal features achieved the best performance for weed mapping. Our study provides a novel method for weed mapping using UAV-based multisource remote-sensing measurements, which is critical for ensuring crop production in precision agriculture. © 2023 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
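A minimal sketch of the feature-level fusion and SVM scoring the abstract describes: per-pixel feature groups are concatenated and the classifier is scored with OA and macro-F1. The features, class set, and sizes are synthetic stand-ins, not the study's data:

```python
# Fuse textural/structural/thermal features and score an SVM weed classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(4)
n = 600
textural = rng.normal(size=(n, 8))
structural = rng.normal(size=(n, 4))
thermal = rng.normal(size=(n, 1))       # canopy temperature as one extra band
y = rng.integers(0, 3, size=n)          # e.g. corn / weed / soil

X = np.hstack([textural, structural, thermal])   # feature-level fusion
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
pred = SVC(kernel="rbf").fit(Xtr, ytr).predict(Xte)
print("OA:", accuracy_score(yte, pred),
      "macro-F1:", f1_score(yte, pred, average="macro"))
```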
Affiliation(s)
- Binyuan Xu: College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Ran Meng: College of Resources and Environment, Huazhong Agricultural University, Wuhan, China; HIT Institute for Artificial Intelligence Co. Ltd, Harbin, China
- Gengshen Chen: National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), Hubei Hongshan Laboratory, Huazhong Agricultural University, Wuhan, China
- Linlin Liang: Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, China
- Zhengang Lv: College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Longfei Zhou: College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Rui Sun: College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Feng Zhao: Key Laboratory of Geographical Process Analysis & Simulation of Hubei Province, College of Urban and Environmental Sciences, Central China Normal University, Wuhan, China
- Wanneng Yang: National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), Hubei Hongshan Laboratory, Huazhong Agricultural University, Wuhan, China

6. Sapkota R, Stenger J, Ostlie M, Flores P. Towards reducing chemical usage for weed control in agriculture using UAS imagery analysis and computer vision techniques. Sci Rep 2023; 13:6548. [PMID: 37085558] [PMCID: PMC10121711] [DOI: 10.1038/s41598-023-33042-0]
Abstract
Currently, the most common method of controlling weeds in commercial agricultural production systems is to apply a uniform distribution of chemical herbicide through a sprayer without considering the spatial distribution of crops and weeds. This practice leads to excessive amounts of chemical herbicide being applied in a given field. The objective of this study was to perform site-specific weed control (SSWC) in a corn field by: (1) using an unmanned aerial system (UAS) to map the spatial distribution of weeds in the field; (2) creating a prescription map based on the weed distribution map; and (3) spraying the field using the prescription map and a commercial-size sprayer. In this study, we assumed that plants growing outside the corn rows are weeds that need to be controlled. The first step in implementing such an approach is identifying the corn rows. For that, we propose a Crop Row Identification algorithm, a computer-vision algorithm that identifies corn rows in UAS imagery. Once identified, the corn rows were removed from the imagery and the remaining vegetation fraction was classified as weeds. Based on that information, a grid-based weed prescription map was created and the weed control application was implemented through a commercial-size sprayer. The decision to spray herbicide on a particular grid cell was based on the presence of weeds in that cell: all grid cells containing at least one weed were sprayed, while weed-free cells were not. Using our SSWC approach, we saved 26.2% of the acreage from being sprayed with herbicide compared with the current method. This study presents a full workflow from UAS image collection to field weed-control implementation using a commercial-size sprayer, and it shows that some level of savings can potentially be obtained even in a situation with high weed infestation, which might provide an opportunity to reduce chemical usage in corn production systems.
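A sketch of the grid-based prescription logic described above: any grid cell containing at least one weed pixel is flagged for spraying. The grid size and weed mask are illustrative placeholders:

```python
# Build a spray/no-spray prescription grid from a binary weed mask.
import numpy as np

weed_mask = np.random.default_rng(5).random((1200, 1200)) > 0.999  # sparse weeds
cell = 100                                                         # pixels per grid cell

h, w = weed_mask.shape
cells = weed_mask[: h // cell * cell, : w // cell * cell].reshape(
    h // cell, cell, w // cell, cell)
spray = cells.any(axis=(1, 3))          # True -> spray this cell

print("area sprayed:", round(spray.mean() * 100, 1), "% of cells")
```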
Affiliation(s)
- Ranjan Sapkota: Center for Precision and Automated Agricultural Systems, Washington State University, 24106 N. Bunn Rd, Prosser, WA, 99350, USA; Agricultural and Biosystems Engineering, North Dakota State University, 1221 Albrecht Blvd, Fargo, ND, 58102, USA
- John Stenger: Agricultural and Biosystems Engineering, North Dakota State University, 1221 Albrecht Blvd, Fargo, ND, 58102, USA
- Michael Ostlie: NDSU Carrington Research Extension Center, Carrington, ND, 58421-0219, USA
- Paulo Flores: Agricultural and Biosystems Engineering, North Dakota State University, 1221 Albrecht Blvd, Fargo, ND, 58102, USA

7. Hassan SI, Alam MM, Illahi U, Mohd Suud M. A new deep learning-based technique for rice pest detection using remote sensing. PeerJ Comput Sci 2023; 9:e1167. [PMID: 37346729] [PMCID: PMC10280224] [DOI: 10.7717/peerj-cs.1167]
Abstract
Background Agriculture plays a vital role in a country's economy and human society. Rice production is a major focus of financial improvement, as rice is in demand worldwide. Protecting the rice field from pests during seedling and after production is a challenging research problem, and identifying a pest at the right time is crucial so that preventive measures appropriate to its stage can be taken. In this article, a new deep learning-based pest detection model is proposed. The proposed system can detect two types of rice pests (stem borer and Hispa) using an unmanned aerial vehicle (UAV). Methodology The image is captured in real time by a camera mounted on the UAV and then processed by filtering, labeling, and a segmentation-based color-thresholding technique that converts the image to grayscale for extracting the region of interest. This article provides a rice-pest dataset and a comparative analysis of existing pre-trained models. The approach recommended in this study, YO-CNN, builds on the results of previous models: a smaller network was regarded as preferable to a bigger one, and using additional layers has the advantage of preventing memorization, providing more precise results than existing techniques. Results The main contribution of the research is a new modified deep learning model named Yolo-convolutional neural network (YO-CNN), which obtains an accuracy of up to 0.980. It can be used to reduce rice wastage during production by monitoring pests regularly, and further for targeted spraying, which saves inputs (fertilizer, water, and pesticide) and reduces the adverse effects of their improper use on the environment and human beings.
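A sketch of the color-thresholding step the methodology names: segment a region of interest in HSV space, then convert it to grayscale for downstream processing. The green hue range is an assumption for illustration, not the paper's calibration:

```python
# HSV color thresholding to isolate a region of interest, then grayscale it.
import cv2
import numpy as np

frame = (np.random.default_rng(6).random((480, 640, 3)) * 255).astype(np.uint8)
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))   # keep green-ish pixels
roi = cv2.bitwise_and(frame, frame, mask=mask)
gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)            # grayscale ROI
```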
Affiliation(s)
- Syeda Iqra Hassan: Universiti Kuala Lumpur British Malaysian Institute, Kuala Lumpur, Malaysia; Department of Electrical Engineering, Ziauddin University, Karachi, Pakistan
- Muhammad Mansoor Alam: Faculty of Computing, Riphah International University, Islamabad, Pakistan; Malaysian Institute of Information Technology, University of Kuala Lumpur, Kuala Lumpur, Malaysia; Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, Australia; Faculty of Computing and Informatics, Multimedia University, Cyberjaya, Selangor, Malaysia
- Usman Illahi: Electrical Engineering Department, Faculty of Engineering and Technology, Gomal University Dera Ismail Khan, Dera Ismail Khan, Pakistan
- Mazliham Mohd Suud: Faculty of Computing and Informatics, Multimedia University, Cyberjaya, Selangor, Malaysia

8. Torres-Sánchez J, Mesas-Carrascosa FJ, Pérez-Porras F, López-Granados F. Detection of Ecballium elaterium in hedgerow olive orchards using a low-cost uncrewed aerial vehicle and open-source algorithms. Pest Management Science 2023; 79:645-654. [PMID: 36223137] [PMCID: PMC10092466] [DOI: 10.1002/ps.7233]
Abstract
BACKGROUND Ecballium elaterium (common name: squirting cucumber) is an emerging weed problem in hedgerow or superintensive olive groves under no tillage. It colonizes the inter-row area, infesting the natural or sown cover crops, and is considered a hard-to-control weed. Research in other woody crops has shown that E. elaterium has a patchy distribution, which makes it amenable to a site-specific control strategy addressed only to E. elaterium patches. Therefore, the aim of this work was to develop a methodology based on the analysis of imagery acquired with an uncrewed aerial vehicle (UAV) to detect and map E. elaterium infestations in hedgerow olive orchards. RESULTS The study was conducted in two superintensive olive orchards, and the images were taken using a UAV equipped with an RGB sensor. Flights were conducted on two dates: in May, when various weeds were infesting the orchard, and in September, when E. elaterium was the only infesting weed. UAV orthomosaics from the first scenario were classified using random forest models, and the orthomosaics from September, with E. elaterium as the only weed, were analyzed using an unsupervised algorithm. In both cases, the overall accuracies were over 0.85, and the producer's accuracies for E. elaterium ranged between 0.74 and 1.00. CONCLUSION These results allow the design of a site-specific, efficient herbicide control protocol, which represents a step forward in sustainable weed management. The development of these algorithms in free and open-source software fosters their application on small and medium farms. © 2022 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
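A sketch of the supervised scenario above: a random forest over orthomosaic pixel features, scored with overall accuracy and per-class producer's accuracy (which equals per-class recall). Features and labels are synthetic stand-ins:

```python
# Random forest pixel classification with producer's accuracy per class.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(7)
X = rng.normal(size=(800, 3))            # e.g. RGB values per pixel
y = rng.integers(0, 3, size=800)         # cover crop / E. elaterium / soil

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
pred = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr).predict(Xte)
print("OA:", accuracy_score(yte, pred))
print("producer's accuracy per class:", recall_score(yte, pred, average=None))
```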
Affiliation(s)
- Jorge Torres-Sánchez: imaPing Group, Department of Crop Protection, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), Córdoba, Spain
- Fernando Pérez-Porras: Department of Graphic Engineering and Geomatics, Campus de Rabanales, University of Cordoba, Córdoba, Spain
- Francisca López-Granados: imaPing Group, Department of Crop Protection, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), Córdoba, Spain

9. Tang Z, Jin Y, Brown PH, Park M. Estimation of tomato water status with photochemical reflectance index and machine learning: Assessment from proximal sensors and UAV imagery. Frontiers in Plant Science 2023; 14:1057733. [PMID: 37089640] [PMCID: PMC10117946] [DOI: 10.3389/fpls.2023.1057733]
Abstract
Tracking plant water status is a critical step towards adaptive precision irrigation management of processing tomatoes, one of the most important specialty crops in California. The photochemical reflectance index (PRI) from proximal sensors and high-resolution unmanned aerial vehicle (UAV) imagery provide an opportunity to monitor crop water status efficiently. Based on data from an experimental tomato field with intensive aerial and plant-based measurements, we developed random forest machine-learning regression models to estimate tomato stem water potential (ψstem) using observations from proximal sensors and 12-band UAV imagery, respectively, along with weather data. The proximal sensor-based model estimates agreed well with plant ψstem, with an R2 of 0.74 and a mean absolute error (MAE) of 0.63 bars. The model included PRI, normalized difference vegetation index, vapor pressure deficit, and air temperature, and tracked well the seasonal dynamics of ψstem across different plots. A separate model, built with multiple vegetation indices (VIs) from UAV imagery and weather variables, had an R2 of 0.81 and an MAE of 0.67 bars. The plant-level ψstem maps generated from UAV imagery closely represented the water-status differences among plots under different irrigation treatments and also tracked well the temporal change between flights. PRI was found to be the most important VI in both the proximal sensor-based and UAV-based models, providing critical information on tomato plant water status. This study demonstrates that machine-learning models can accurately estimate water status by integrating PRI, other VIs, and weather data, and thus facilitate data-driven irrigation management for processing tomatoes.
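A sketch combining PRI with a random forest regressor for stem water potential, as the abstract outlines. PRI uses its standard two-band form, PRI = (R531 - R570) / (R531 + R570); band values, weather features, and targets are synthetic stand-ins for the study's measurements:

```python
# PRI plus weather features feeding a random forest ψstem regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def pri(r531: np.ndarray, r570: np.ndarray) -> np.ndarray:
    return (r531 - r570) / np.clip(r531 + r570, 1e-6, None)

rng = np.random.default_rng(8)
n = 300
r531, r570 = rng.random(n), rng.random(n)
ndvi, vpd, tair = rng.random(n), rng.random(n) * 3, 15 + rng.random(n) * 20
X = np.column_stack([pri(r531, r570), ndvi, vpd, tair])
psi_stem = -10 + 4 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)  # bars

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, psi_stem)
print("R^2 on training data:", model.score(X, psi_stem))
```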
Affiliation(s)
- Zhehan Tang (corresponding author): Department of Land, Air and Water Resources, University of California, Davis, Davis, CA, United States
- Yufang Jin: Department of Land, Air and Water Resources, University of California, Davis, Davis, CA, United States
- Patrick H. Brown: Department of Plant Sciences, University of California, Davis, Davis, CA, United States
- Meerae Park: Department of Plant Sciences, University of California, Davis, Davis, CA, United States

10. Pipatsitee P, Tisarum R, Taota K, Samphumphuang T, Eiumnoh A, Singh HP, Cha-Um S. Effectiveness of vegetation indices and UAV-multispectral imageries in assessing the response of hybrid maize (Zea mays L.) to water deficit stress under field environment. Environmental Monitoring and Assessment 2022; 195:128. [PMID: 36402920] [DOI: 10.1007/s10661-022-10766-6]
Abstract
Unmanned aerial vehicles (UAVs) equipped with multi-sensors are one of the most innovative technologies for measuring plant health and predicting final yield in field conditions, especially in water deficit situations in rain-deprived regions. The objective of this investigation was to evaluate individual plant and canopy-level measurements using UAV imagery in three maize (Zea mays L.) genotypes, Suwan4452 (drought-tolerant), Pac339, and S7328 (drought-sensitive), at vegetative and reproductive stages under WW (well-watered) and WD (water deficit) conditions. At the vegetative stage, only the CWSI (crop water stress index) of Pac339 and S7328 under WD increased significantly, by 1.86- and 1.69-fold over WW, whereas the vegetation indices derived from UAV multi-sensors (EVI2 (Enhanced Vegetation Index 2), OSAVI (Optimized Soil-Adjusted Vegetation Index), GNDVI (Green Normalized Difference Vegetation Index), NDRE (Normalized Difference Red Edge Index), and NDVI (Normalized Difference Vegetation Index)) did not vary. At the reproductive stage, CWSI in the drought-sensitive genotype (S7328) under WD increased by 1.92-fold over WW. All the vegetation indices (EVI2, OSAVI, GNDVI, NDRE, and NDVI) of Pac339 and S7328 under WD decreased when compared with those of Suwan4452. NDVI derived from a handheld GreenSeeker® and NDVI from UAV data were closely related (R2 = 0.5924). An increase in leaf temperature (Tleaf) and a reduction in NDVI were observed in WD-stressed maize plants (R2 = 0.5829), leading to yield loss (R2 = 0.5198). In summary, a close correlation was observed between the physiological data of individual plants and canopy-level vegetation indices (collected using a UAV platform) in drought-sensitive maize genotypes under WD conditions, indicating the approach's effectiveness in classifying drought-tolerant genotypes.
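A sketch of the crop water stress index used above, in its common empirical form: canopy temperature normalized between a non-stressed (wet) and a fully stressed (dry) baseline. The baseline temperatures here are illustrative, not the study's calibration:

```python
# CWSI = (Tcanopy - Twet) / (Tdry - Twet), clipped to [0, 1].
import numpy as np

def cwsi(t_canopy: np.ndarray, t_wet: float, t_dry: float) -> np.ndarray:
    return np.clip((t_canopy - t_wet) / (t_dry - t_wet), 0.0, 1.0)

t_canopy = np.array([28.0, 31.5, 34.2])   # degrees C from a thermal sensor
print(cwsi(t_canopy, t_wet=26.0, t_dry=36.0))
```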
Affiliation(s)
- Piyanan Pipatsitee: National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Rujira Tisarum: National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Kanyarat Taota: National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Thapanee Samphumphuang: National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Apisit Eiumnoh: National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand
- Harminder Pal Singh: Department of Environment Studies, Faculty of Science, Panjab University, Chandigarh, 160014, India
- Suriyan Cha-Um: National Center for Genetic Engineering and Biotechnology (BIOTEC), National Science and Technology Development Agency (NSTDA), 113 Thailand Science Park, Paholyothin Road, Khlong Nueng, Khlong Luang, Pathum Thani, 12120, Thailand

11. Tao H, Xu S, Tian Y, Li Z, Ge Y, Zhang J, Wang Y, Zhou G, Deng X, Zhang Z, Ding Y, Jiang D, Guo Q, Jin S. Proximal and remote sensing in plant phenomics: 20 years of progress, challenges, and perspectives. Plant Communications 2022; 3:100344. [PMID: 35655429] [PMCID: PMC9700174] [DOI: 10.1016/j.xplc.2022.100344]
Abstract
Plant phenomics (PP) has been recognized as a bottleneck in studying the interactions of genomics and environment on plants, limiting the progress of smart breeding and precise cultivation. High-throughput plant phenotyping is challenging owing to the spatio-temporal dynamics of traits. Proximal and remote sensing (PRS) techniques are increasingly used for plant phenotyping because of their advantages in multi-dimensional data acquisition and analysis. Substantial progress of PRS applications in PP has been observed over the last two decades and is analyzed here from an interdisciplinary perspective based on 2972 publications. This progress covers most aspects of PRS application in PP, including patterns of global spatial distribution and temporal dynamics, specific PRS technologies, phenotypic research fields, working environments, species, and traits. Subsequently, we demonstrate how to link PRS to multi-omics studies, including how to achieve multi-dimensional PRS data acquisition and processing, how to systematically integrate all kinds of phenotypic information and derive phenotypic knowledge with biological significance, and how to link PP to multi-omics association analysis. Finally, we identify three future perspectives for PRS-based PP: (1) strengthening the spatial and temporal consistency of PRS data, (2) exploring novel phenotypic traits, and (3) facilitating multi-omics communication.
Affiliation(s)
- Haiyu Tao: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China
- Shan Xu: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China
- Yongchao Tian: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China
- Zhaofeng Li: The Key Laboratory of Oasis Eco-agriculture, Xinjiang Production and Construction Corps, Agriculture College, Shihezi University, Shihezi 832003, China
- Yan Ge: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China
- Jiaoping Zhang: State Key Laboratory of Crop Genetics and Germplasm Enhancement, National Center for Soybean Improvement, Key Laboratory for Biology and Genetic Improvement of Soybean (General, Ministry of Agriculture), Nanjing Agricultural University, Nanjing 210095, China
- Yu Wang: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China
- Guodong Zhou: Sanya Research Institute of Nanjing Agriculture University, Sanya 572024, China
- Xiong Deng: Key Laboratory of Plant Molecular Physiology, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
- Ze Zhang: The Key Laboratory of Oasis Eco-agriculture, Xinjiang Production and Construction Corps, Agriculture College, Shihezi University, Shihezi 832003, China
- Yanfeng Ding: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China; Hainan Yazhou Bay Seed Laboratory, Sanya 572025, China; Sanya Research Institute of Nanjing Agriculture University, Sanya 572024, China
- Dong Jiang: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China; Hainan Yazhou Bay Seed Laboratory, Sanya 572025, China; Sanya Research Institute of Nanjing Agriculture University, Sanya 572024, China
- Qinghua Guo: Institute of Ecology, College of Urban and Environmental Science, Peking University, Beijing 100871, China
- Shichao Jin: Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, National Engineering and Technology Center for Information Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Nanjing Agricultural University, No. 1 Weigang, Xuanwu District, Nanjing 210095, China; Hainan Yazhou Bay Seed Laboratory, Sanya 572025, China; Sanya Research Institute of Nanjing Agriculture University, Sanya 572024, China; Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, International Institute for Earth System Sciences, Nanjing University, Nanjing, Jiangsu 210023, China

12. Barrile V, Simonetti S, Citroni R, Fotia A, Bilotta G. Experimenting Agriculture 4.0 with Sensors: A Data Fusion Approach between Remote Sensing, UAVs and Self-Driving Tractors. Sensors (Basel) 2022; 22:7910. [PMID: 36298261] [PMCID: PMC9611850] [DOI: 10.3390/s22207910]
Abstract
Geomatics is important for Agriculture 4.0: it draws on different types of data (remote sensing from satellites, Unmanned Aerial Vehicles (UAVs), GNSS, photogrammetry, laser scanners, and others) and therefore uses data-fusion techniques that depend on the application. This work presents, on a study area, the integration of data acquired from remote sensing, UAVs, and autonomous driving machines through data-fusion techniques, all reprocessed and visualised in a GIS (Geographic Information System). We emphasize the importance of integrating different methodologies and data-fusion techniques, managing data of different natures acquired with different methods, to optimise vineyard cultivation and production. In particular, in this note we applied, focusing on a vineyard, geomatics methodologies developed in other works and integrated them here to contribute to Agriculture 4.0. More specifically, we used the NDVI (Normalized Difference Vegetation Index) applied to multispectral satellite images and drone images (suitably combined) to identify the vigour of the plants. We then used an autonomously guided vehicle (equipped with sensors and monitoring systems) which, by estimating the optimal path, allows us to optimise fertilisation, irrigation, and other operations through data-fusion techniques using various types of sensors. Everything is visualised in a GIS to improve the management of the field according to its potential, also using historical data on the environmental, climatic, and socioeconomic characteristics of the area. For this purpose, geomatics experiments of different types, carried out individually in other application cases, are coordinated and integrated here in order to provide research and application cues for Agriculture 4.0.
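A sketch of the NDVI step in the workflow above, applied to co-registered multispectral rasters. The "fusion" shown is just the simple rule of preferring the higher-resolution drone NDVI where it exists; the paper's actual data-fusion chain is richer than this:

```python
# Combine satellite and drone NDVI rasters, preferring drone coverage.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    return (nir - red) / np.clip(nir + red, 1e-6, None)

rng = np.random.default_rng(9)
sat_nir, sat_red = rng.random((100, 100)), rng.random((100, 100))
uav_nir, uav_red = rng.random((100, 100)), rng.random((100, 100))
uav_valid = rng.random((100, 100)) > 0.5            # drone coverage mask

fused = np.where(uav_valid, ndvi(uav_nir, uav_red), ndvi(sat_nir, sat_red))
print("mean vigour proxy (NDVI):", fused.mean())
```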
Affiliation(s)
- Vincenzo Barrile: DICEAM Department, University Mediterranea of Reggio Calabria, 89124 Reggio Calabria, Italy
- Silvia Simonetti: Department of Engineering, Università degli Studi di Messina, Piazza Pugliatti 1, 98122 Messina, Italy
- Rocco Citroni: Department of Electronic Engineering, University of Rome Tor Vergata, 00133 Roma, Italy
- Antonino Fotia: DICEAM Department, University Mediterranea of Reggio Calabria, 89124 Reggio Calabria, Italy
- Giuliana Bilotta: DICEAM Department, University Mediterranea of Reggio Calabria, 89124 Reggio Calabria, Italy

13. Illana Rico S, Martínez Gila DM, Cano Marchal P, Gómez Ortega J. Automatic Detection of Olive Tree Canopies for Groves with Thick Plant Cover on the Ground. Sensors (Basel) 2022; 22:6219. [PMID: 36015987] [PMCID: PMC9414240] [DOI: 10.3390/s22166219]
Abstract
Marking tree canopies is an unavoidable step in any study working with high-resolution aerial images taken by a UAV over a fruit tree crop such as olive trees, as the extraction of pixel features from these canopies is the first step in building the models whose predictions are compared with the ground truth obtained from other types of sensors. Marking these canopies manually is an arduous and tedious process, usually replaced by automatic methods that rarely work well for groves with thick plant cover on the ground. This paper develops a standard method for detecting olive tree canopies in high-resolution aerial images taken by a multispectral camera, regardless of the density of plant cover between canopies. The method is based on the relative spatial information between canopies: the planting pattern used by the grower is computed and extrapolated using Delaunay triangulation, and this knowledge is fused with that previously obtained from spectral information. It is shown that minimising a certain function provides an optimal fit of the parameters that define the marking of the trees, yielding promising results of 77.5% recall and 70.9% precision.
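A sketch of the spatial-information step described above: triangulate detected canopy centres with Delaunay and inspect edge lengths, from which a regular planting pattern could be estimated and extrapolated. The point layout is synthetic, not the paper's orchard data:

```python
# Delaunay triangulation of canopy centres to recover planting spacing.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(10)
gx, gy = np.meshgrid(np.arange(10) * 4.0, np.arange(8) * 1.5)  # rows 4.0 m apart, 1.5 m in-row
points = np.column_stack([gx.ravel(), gy.ravel()]) + rng.normal(scale=0.1, size=(80, 2))

tri = Delaunay(points)
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
        edges.add((a, b))
lengths = np.array([np.linalg.norm(points[a] - points[b]) for a, b in edges])
# the cluster of shortest edges corresponds to the in-row plant spacing
print("approx. in-row spacing (m):", round(float(np.percentile(lengths, 10)), 2))
```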
Affiliation(s)
- Sergio Illana Rico: Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain
- Diego Manuel Martínez Gila: Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain; Institute for Olive Orchards and Olive Oils, University of Jaén, 23071 Jaén, Spain
- Pablo Cano Marchal: Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain; Institute for Olive Orchards and Olive Oils, University of Jaén, 23071 Jaén, Spain
- Juan Gómez Ortega: Robotics, Automation and Computer Vision Group, Electronic and Automation Engineering Department, University of Jaén, 23071 Jaén, Spain; Institute for Olive Orchards and Olive Oils, University of Jaén, 23071 Jaén, Spain

14. Zhang Y, Zhao D, Liu H, Huang X, Deng J, Jia R, He X, Tahir MN, Lan Y. Research hotspots and frontiers in agricultural multispectral technology: Bibliometrics and scientometrics analysis of the Web of Science. Frontiers in Plant Science 2022; 13:955340. [PMID: 36035687] [PMCID: PMC9404299] [DOI: 10.3389/fpls.2022.955340]
Abstract
Multispectral technology has a wide range of applications in agriculture. By obtaining spectral information during crop production, key information such as growth, pests and diseases, and fertilizer and pesticide application can be determined quickly, accurately, and efficiently. This scientometric analysis based on the Web of Science aims to identify the research hotspots and areas of interest in the field of agricultural multispectral technology. Publications related to multispectral research in agriculture between 2002 and 2021 were selected as the research objects. CiteSpace, VOSviewer, and Microsoft Excel were used to provide a comprehensive review of agricultural multispectral research in terms of research areas, institutions, influential journals, and core authors. The analysis shows that the number of publications increased each year, with the largest increase in 2019. Remote sensing, imaging technology, environmental science, and ecology are the most popular research directions. The journal Remote Sensing is one of the most popular publishers, showing high publishing potential in agricultural multispectral research. The institution with the most research literature and citations is the USDA. In terms of the number of papers, Mtanga is the author with the most published articles in recent years. Keyword co-citation analysis shows that the main research areas of this topic focus on remote sensing, crop classification, and plant phenotypes, while literature co-citation analysis indicates that the main research directions concentrate on vegetation indices, satellite remote-sensing applications, and machine-learning modeling. There is still much room for the development of multispectral technology. Further development can be carried out in multi-device synergy, spectral fusion, airborne equipment improvement, and real-time image processing, which together will further expand the role of multispectral technology in agriculture and promote the development of the sector.
Affiliation(s)
- Yali Zhang: College of Engineering, South China Agricultural University, Guangzhou, China; National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China
- Dehua Zhao: College of Engineering, South China Agricultural University, Guangzhou, China; National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China
- Hanchao Liu: College of Engineering, South China Agricultural University, Guangzhou, China; National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China
- Xinrong Huang: College of Engineering, South China Agricultural University, Guangzhou, China; National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China
- Jizhong Deng: College of Engineering, South China Agricultural University, Guangzhou, China; National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China
- Ruichang Jia: College of Engineering, South China Agricultural University, Guangzhou, China; National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China
- Xiaoping He: Department of Information Consulting, Library, South China Agricultural University, Guangzhou, China
- Muhammad Naveed Tahir: Department of Agronomy, Pir Mehr Ali Shah-Arid Agriculture University, Rawalpindi, Pakistan
- Yubin Lan: National Center for International Collaboration Research on Precision Agricultural Aviation Pesticide Spraying Technology, Guangzhou, China; College of Electronic Engineering and College of Artificial Intelligence, South China Agricultural University, Guangzhou, China

15. Use of Oblique RGB Imagery and Apparent Surface Area of Plants for Early Estimation of Above-Ground Corn Biomass. Remote Sensing 2021. [DOI: 10.3390/rs13204032]
Abstract
Estimating above-ground biomass in the context of fertilization management requires the monitoring of crops at early stages. Conventional remote sensing techniques make use of vegetation indices such as the normalized difference vegetation index (NDVI), but they do not exploit the high spatial resolution (ground sampling distance < 5 mm) now achievable with the introduction of unmanned aerial vehicles (UAVs) in agriculture. The aim of this study was to compare image mosaics to single images for the estimation of corn biomass and the influence of viewing angles in this estimation. Nadir imagery was captured by a high spatial resolution camera mounted on a UAV to generate orthomosaics of corn plots at different growth stages (from V2 to V7). Nadir and oblique images (30° and 45° with respect to the vertical) were also acquired from a zip line platform and processed as single images. Image segmentation was performed using the difference color index Excess Green-Excess Red, allowing for the discrimination between vegetation and background pixels. The apparent surface area of plants was then extracted and compared to biomass measured in situ. An asymptotic total least squares regression was performed and showed a strong relationship between the apparent surface area of plants and both dry and fresh biomass. Mosaics tended to underestimate the apparent surface area in comparison to single images because of radiometric degradation. It is therefore conceivable to process only single images instead of investing time and effort in acquiring and processing data for orthomosaic generation. When comparing oblique photography, an angle of 30° yielded the best results in estimating corn biomass, with a low residual standard error of orthogonal distance (RSEOD = 0.031 for fresh biomass, RSEOD = 0.034 for dry biomass). Since oblique imagery provides more flexibility in data acquisition with fewer constraints on logistics, this approach might be an efficient way to monitor crop biomass at early stages.
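A sketch of the Excess Green minus Excess Red segmentation named above: chromatic coordinates are computed per pixel and vegetation is kept where ExG - ExR > 0, the index's usual zero threshold. The input image is a placeholder:

```python
# ExG - ExR vegetation/background segmentation of an RGB image.
import numpy as np

def exg_exr_mask(rgb: np.ndarray) -> np.ndarray:
    total = np.clip(rgb.sum(axis=-1, keepdims=True), 1e-6, None)
    r, g, b = np.moveaxis(rgb / total, -1, 0)   # chromatic coordinates
    exg = 2 * g - r - b                          # Excess Green
    exr = 1.4 * r - g                            # Excess Red
    return (exg - exr) > 0                       # True -> vegetation pixel

img = np.random.default_rng(11).random((240, 320, 3))
print("vegetation fraction:", exg_exr_mask(img).mean())
```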

16. Hu P, Chapman SC, Zheng B. Coupling of machine learning methods to improve estimation of ground coverage from unmanned aerial vehicle (UAV) imagery for high-throughput phenotyping of crops. Functional Plant Biology 2021; 48:766-779. [PMID: 33663681] [DOI: 10.1071/fp20309]
Abstract
Ground coverage (GC) allows monitoring of crop growth and development and is normally estimated as the ratio of vegetation to total pixels in nadir images captured by visible-spectrum (RGB) cameras. The accuracy of estimated GC can be significantly impacted by the effect of 'mixed pixels', which is related to the spatial resolution of the imagery as determined by flight altitude, camera resolution, and crop characteristics (fine vs coarse textures). In this study, a two-step machine learning method was developed to improve the accuracy of GC of wheat (Triticum aestivum L.) estimated from coarse-resolution RGB images captured by an unmanned aerial vehicle (UAV) at higher altitudes. The classification tree-based per-pixel segmentation (PPS) method was first used to segment fine-resolution reference images into vegetation and background pixels. The reference images and their segmented counterparts were degraded to the target coarse spatial resolution. These degraded images were then used to generate a training dataset for a regression tree-based model to establish the sub-pixel classification (SPC) method. The newly proposed method (i.e. PPS-SPC) was evaluated with six synthetic and four real UAV image sets (SISs and RISs, respectively) with different spatial resolutions. Overall, the results demonstrated that the PPS-SPC method obtained higher GC accuracy in both SISs and RISs compared with the PPS method, with root mean squared errors (RMSE) of less than 6% and relative RMSE (RRMSE) of less than 11% for SISs, and RMSE of less than 5% and RRMSE of less than 35% for RISs. The proposed PPS-SPC method can potentially be applied in plant breeding and precision agriculture to balance accuracy requirements against UAV flight height, given limited battery life and operation time.
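A compressed sketch of the two-step idea above: a fine-resolution vegetation mask is degraded to coarse pixels whose vegetation fractions become training targets for a sub-pixel regressor on the coarse imagery. All imagery here is synthetic, and a single band stands in for the full per-pixel feature set the paper uses:

```python
# PPS-SPC style pipeline: per-pixel mask -> degraded fractions -> regressor.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(12)
factor = 8                                      # fine-to-coarse scale factor
fine_green = rng.random((256, 256))             # fine-resolution green band
fine_mask = fine_green > 0.6                    # step 1: per-pixel segmentation (PPS)

def degrade(a: np.ndarray) -> np.ndarray:
    """Block-average a fine raster down to the coarse grid."""
    return a.reshape(32, factor, 32, factor).mean(axis=(1, 3))

coarse_band = degrade(fine_green)               # degraded coarse image
gc_fraction = degrade(fine_mask.astype(float))  # sub-pixel GC training target

X = coarse_band.reshape(-1, 1)
spc = DecisionTreeRegressor(max_depth=6).fit(X, gc_fraction.ravel())  # step 2 (SPC)
print("estimated field GC:", spc.predict(X).mean())
```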
Affiliation(s)
- Pengcheng Hu: CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia 4067, Qld, Australia
- Scott C Chapman: CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia 4067, Qld, Australia; School of Food and Agricultural Sciences, The University of Queensland, via Warrego Highway, Gatton 4343, Qld, Australia
- Bangyou Zheng (corresponding author): CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia 4067, Qld, Australia

17. A Strategy of Parallel Seed-Based Image Segmentation Algorithms for Handling Massive Image Tiles over the Spark Platform. Remote Sensing 2021. [DOI: 10.3390/rs13101969]
Abstract
The volume of remote sensing images continues to grow as image sources become more diversified and spatial and spectral resolution increase. Handling such large-volume datasets, which exceed available CPU memory, in a timely and efficient manner is becoming a challenge for single machines. A distributed cluster provides an effective solution with strong computing power, and an increasing number of big data technologies have been adopted to deal with large images using mature parallel technology. However, since most commercial big data platforms are not specifically developed for the remote sensing field, two main issues exist in processing large images over a distributed cluster. On the one hand, the quantity and categories of official algorithms for processing remote sensing images in big data platforms are limited compared with the large number of sequential algorithms. On the other hand, sequential algorithms employed directly to process large images in parallel over a distributed cluster may leave incomplete objects at the tile edges and generate large communication volumes at the shuffle stage. It is therefore necessary to explore the distributed strategy and adapt the sequential algorithms to the distributed cluster. In this research, we employed two seed-based image segmentation algorithms to construct a distributed strategy based on the Spark platform. The proposed strategy focuses on repairing incomplete objects by processing border areas and on reducing the communication volume to a reasonable size by limiting the auxiliary bands and the buffer size to a small range during the shuffle stage. We calculated the F-measure and execution time to evaluate accuracy and execution efficiency. The statistical data reveal that both segmentation algorithms maintained high accuracy, matching that of the reference image segmented sequentially. Moreover, the strategy generally took less execution time than configurations with significantly larger auxiliary bands and buffer sizes. The proposed strategy can repair incomplete objects, with execution time twice as fast as strategies that do not employ communication-volume reduction in the distributed cluster.
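A rough sketch (assumptions throughout) of the distributed pattern described above: tiles are read with a small pixel buffer, segmented independently, and only each tile's core is kept, so that objects crossing borders can be reconciled in a separate, much smaller pass. The threshold-and-label step merely stands in for a real seed-based segmentation, and the tiles are simulated arrays:

```python
# Buffered-tile segmentation distributed over Spark.
import numpy as np
from scipy import ndimage
from pyspark.sql import SparkSession

BUFFER = 8   # overlap in pixels; kept small to limit shuffle volume

def segment_core(item):
    tile_id, arr = item
    labels, _ = ndimage.label(arr > arr.mean())     # stand-in segmentation
    core = labels[BUFFER:-BUFFER, BUFFER:-BUFFER]   # drop the buffered border
    return tile_id, int(len(np.unique(core)) - 1)   # object count in the core

spark = SparkSession.builder.appName("tile-segmentation").getOrCreate()
rng = np.random.default_rng(13)
tiles = [(i, rng.random((128 + 2 * BUFFER,) * 2)) for i in range(4)]
print(spark.sparkContext.parallelize(tiles).map(segment_core).collect())
spark.stop()
```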

18. Can Commercial Low-Cost Drones and Open-Source GIS Technologies Be Suitable for Semi-Automatic Weed Mapping for Smart Farming? A Case Study in NE Italy. Remote Sensing 2021. [DOI: 10.3390/rs13101869]
Abstract
Weed management is a crucial issue in agriculture, with both in-field and off-field environmental impacts. Within Agriculture 4.0, the adoption of UASs combined with spatially explicit approaches may drastically reduce herbicide doses, increasing sustainability in weed management. However, Agriculture 4.0 technologies are barely adopted on small and medium-size farms. Recently, small, low-cost UASs, together with open-source software packages, have come to represent a low-cost, spatially explicit system to map weed distribution in crop fields. The general aim is to map weed distribution with a low-cost UAS and a replicable workflow based entirely on open GIS software and algorithms: OpenDroneMap, QGIS, SAGA, and OpenCV classification algorithms. Specific objectives are: (i) testing a low-cost UAS for weed mapping; (ii) assessing open-source packages for semi-automatic weed classification; and (iii) producing a sustainable management scenario via prescription maps. Results showed high performance along the whole process: in orthomosaic generation at very high spatial resolution (0.01 m/pixel), in weed detection (Matthews Correlation Coefficient: 0.67-0.74), and in the production of prescription maps, reducing herbicide treatment to only 3.47% of the entire field. This study reveals the feasibility of low-cost UASs combined with open-source software, enabling a spatially explicit approach to weed management on small and medium-size farmlands.
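A sketch of the reported evaluation metric: the Matthews Correlation Coefficient for a binary weed/non-weed map, computed from flattened mask pixels. The predictions are random placeholders with roughly 85% agreement:

```python
# MCC for a binary weed classification map.
import numpy as np
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(14)
y_true = rng.integers(0, 2, size=1000)                            # reference mask
y_pred = np.where(rng.random(1000) < 0.85, y_true, 1 - y_true)    # ~85% agreement
print("MCC:", matthews_corrcoef(y_true, y_pred))
```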
Collapse
|
19
|
Optimized Deep Learning Model as a Basis for Fast UAV Mapping of Weed Species in Winter Wheat Crops. REMOTE SENSING 2021. [DOI: 10.3390/rs13091704] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Weed maps should be available quickly, reliably, and with high detail to be useful for site-specific management in crop protection and to promote more sustainable agriculture by reducing pesticide use. Here, the optimization of a deep residual convolutional neural network (ResNet-18) for the classification of weed and crop plants in UAV imagery is proposed. The target was to reach sufficient performance on an embedded system while maintaining the same features of the ResNet-18 model as a basis for fast UAV mapping. This would enable online recognition and subsequent mapping of weeds during UAV flight operations. Optimization was achieved mainly by avoiding redundant computations that arise when a classification model is applied to overlapping tiles in a larger input image. The model was trained and tested with imagery obtained from a UAV flight campaign at low altitude over a winter wheat field, and classification was performed at species level for the weed species Matricaria chamomilla L., Papaver rhoeas L., Veronica hederifolia L., and Viola arvensis ssp. arvensis observed in that field. The ResNet-18 model with the optimized image-level prediction pipeline reached a performance of 2.2 frames per second with an NVIDIA Jetson AGX Xavier on the full-resolution UAV image, which would amount to an area output of about 1.78 ha h−1 for continuous field mapping. The overall accuracy for determining crop, soil, and weed species was 94%. There were some limitations in the detection of species unknown to the model. When shifting from 16-bit to 32-bit model precision, no improvement in classification accuracy was observed, but speed declined strongly, especially when a higher number of filters was used in the ResNet-18 model. Future work should be directed towards integrating the mapping process on UAV platforms, guiding UAVs autonomously for mapping purposes, and ensuring the transferability of the models to other crop fields.
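The reported throughput can be sanity-checked with simple arithmetic: at a fixed frame rate, area output equals frames per second times the ground footprint covered per frame. The footprint value below is hypothetical, back-calculated only to reproduce the reported 1.78 ha h−1 figure.

```python
# Back-of-envelope check of mapping throughput, assuming each processed
# frame contributes a fixed effective ground footprint (hypothetical value).
def area_output_ha_per_h(fps: float, footprint_m2: float) -> float:
    """Continuous mapping throughput in hectares per hour."""
    return fps * footprint_m2 * 3600 / 10_000

print(area_output_ha_per_h(fps=2.2, footprint_m2=2.25))  # ~1.78 ha/h
```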
Collapse
|
20
|
Effect of Time of Day and Sky Conditions on Different Vegetation Indices Calculated from Active and Passive Sensors and Images Taken from UAV. REMOTE SENSING 2021. [DOI: 10.3390/rs13091691] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/12/2023]
Abstract
Optical sensors have been widely reported to be useful tools to assess biomass, nutrition, and water status in several crops. However, the use of these sensors can be affected by the time of day and sky conditions. This study aimed to evaluate the effect of time of day and sky conditions (sunny versus overcast) on several vegetation indices (VIs) calculated from two active sensors (the Crop Circle ACS-470 and Greenseeker RT100), two passive sensors (a hyperspectral bidirectional passive spectrometer and the HandySpec Field sensor), and images taken from an unmanned aerial vehicle (UAV). The experimental work was conducted in a wheat crop in south-west Germany with eight nitrogen (N) application treatments. Optical sensor measurements were made throughout the vegetative growth period on different dates in 2019 at 9:00, 14:00, and 16:00 solar time to evaluate the effect of time of day, and on one sunny and one overcast day at 9:00 h only to evaluate the influence of sky conditions on the different vegetation indices. For most vegetation indices evaluated, there were significant differences between paired measurement times, regardless of the sensor and day of measurement. The smallest differences were found between measurements at 14:00 and 16:00 h; for the vehicle-carried and the handheld hyperspectral passive sensors these were lower than 2% and 4%, respectively, for the NIR/Red edge ratio, the red edge inflection point (REIP), and the water index. Differences were lower than 5% for the vehicle-carried active sensors Crop Circle ACS-470 (NIR/Red edge and NIR/Red ratios, and NDVI) and Greenseeker RT100 (NDVI). The most stable indices across measurement times were the NIR/Red edge ratio, the water index, and REIP, regardless of the sensor used. The largest differences between measurement times were found for the simple ratios NIR/Red and NIR/Green. For measurements made on a sunny and an overcast day, the most stable indices were likewise the NIR/Red edge ratio, the water index, and REIP. In practical terms, these results confirm that passive and active sensors can be used on-farm at any time of day from 9:00 to 16:00 h, provided optimized indices are chosen.
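For readers unfamiliar with the indices compared above, the following sketch computes NDVI, the NIR/Red edge ratio, and REIP using the standard linear-interpolation formulation (Guyot and Baret); the band set and reflectance values are illustrative assumptions, not the study's exact sensor bands.

```python
# Hedged sketch of the band-ratio indices discussed above; wavelengths follow
# common conventions and may differ from the sensors used in the study.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def nir_red_edge_ratio(nir, red_edge):
    return nir / red_edge

def reip(r670, r700, r740, r780):
    """Red edge inflection point via linear interpolation (nm)."""
    return 700 + 40 * ((r670 + r780) / 2 - r700) / (r740 - r700)

refl = dict(r670=0.05, r700=0.12, r740=0.35, r780=0.45)  # assumed reflectances
print(ndvi(refl["r780"], refl["r670"]), reip(**refl))
```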
Collapse
|
21
|
Friha O, Ferrag MA, Shu L, Maglaras L, Wang X. Internet of Things for the Future of Smart Agriculture: A Comprehensive Survey of Emerging Technologies. IEEE/CAA JOURNAL OF AUTOMATICA SINICA 2021; 8:718-752. [PMID: 0 DOI: 10.1109/jas.2021.1003925] [Citation(s) in RCA: 27] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/21/2023]
|
22
|
Reference Measurements in Developing UAV Systems for Detecting Pests, Weeds, and Diseases. REMOTE SENSING 2021. [DOI: 10.3390/rs13071238] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/08/2023]
Abstract
The development of UAV (unmanned aerial vehicle) imaging technologies for precision farming applications is rapid, and new studies are published frequently. Where measurements are based on aerial imaging, ground truth or reference data are needed in order to develop reliable applications. However, in several precision farming use cases, such as detecting pests, weeds, and diseases, the reference data can be subjective or relatively difficult to capture. Furthermore, the collection of reference data is usually laborious and time consuming, and it appears difficult to develop generalisable solutions for these areas. This review surveys previous research on detecting and mapping pests, weeds, and diseases using UAV imaging in the precision farming context, with emphasis on the reference measurement techniques applied. The majority of the reviewed studies relied on subjective visual observations of UAV images, and only a few applied in situ measurements. The conclusion of the review is that quantitative and repeatable reference-measurement solutions are lacking in the areas of mapping pests, weeds, and diseases, and that reported results should be interpreted in light of the references used. A future option could be the use of synthetic data as reference.
Collapse
|
23
|
Estimating Leaf Nitrogen Content in Corn Based on Information Fusion of Multiple-Sensor Imagery from UAV. REMOTE SENSING 2021. [DOI: 10.3390/rs13030340] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/25/2022]
Abstract
With the rapid development of unmanned aerial vehicle (UAV) and sensor technology, UAVs that can simultaneously carry different sensors have been increasingly used to monitor nitrogen status in crops due to their flexibility and adaptability. This study aimed to explore how to use the image information combined from two different sensors mounted on a UAV to evaluate leaf nitrogen content (LNC) in corn. Field experiments with corn were conducted using different nitrogen rates and cultivars at the National Precision Agriculture Research and Demonstration Base in China in 2017. Digital RGB and multispectral images were obtained synchronously by UAV at the V12, R1, and R3 growth stages of corn. A novel family of modified vegetation indices, named coverage adjusted spectral indices (CASIs), defined as CASI = VI/(1 + FVcover), where VI denotes the reference vegetation index and FVcover refers to the fraction of vegetation coverage, has been introduced to estimate LNC in corn. Typical VIs were extracted from multispectral images, which have the advantage of relatively higher spectral resolution, and FVcover was calculated from RGB images, which feature higher spatial resolution. Then, the PLS (partial least squares) method was employed to investigate the relationships between LNC and the optimal set of CASIs or VIs selected by the RFA (random frog algorithm) at different corn growth stages. The analysis indicated that, whether soil noise was removed or not, CASIs guaranteed a better estimation of LNC than VIs for all three growth stages of corn, and the use of CASIs at the R1 stage yielded the best R2 value of 0.59, with an RMSE (root mean square error) of 22.02% and an NRMSE (normalized root mean square error) of 8.37%. It was concluded that CASIs, based on the fusion of information acquired synchronously from lower-resolution multispectral and higher-resolution RGB images, have good potential for crop nitrogen monitoring by UAV. Furthermore, they could also serve as a useful means of assessing other physical and chemical parameters of crops in further applications.
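The CASI definition quoted above is simple enough to state in a few lines. The sketch below applies it to illustrative values; the choice of NDVI as the reference VI and the input numbers are assumptions for demonstration only.

```python
# Minimal illustration of the coverage adjusted spectral index,
# CASI = VI / (1 + FVcover), as defined in the abstract.
import numpy as np

def casi(vi: np.ndarray, fv_cover: np.ndarray) -> np.ndarray:
    """CASI from a reference VI (multispectral) and the fraction of
    vegetation coverage in [0, 1] (derived from RGB imagery)."""
    return vi / (1.0 + fv_cover)

vi = np.array([0.62, 0.55, 0.70])   # e.g., NDVI values (assumed)
fv = np.array([0.40, 0.25, 0.55])   # fraction of vegetation coverage (assumed)
print(casi(vi, fv))
```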
Collapse
|
24
|
Automatic Tree Crown Extraction from UAS Multispectral Imagery for the Detection of Bark Beetle Disturbance in Mixed Forests. REMOTE SENSING 2020. [DOI: 10.3390/rs12244081] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
Multispectral imaging using unmanned aerial systems (UAS) enables rapid and accurate detection of pest insect infestations, which are an increasing threat to midlatitude natural forests. Pest detection at the level of an individual tree is of particular importance in mixed forests, where it enables a sensible forest management approach. In this study, we propose a method for individual tree crown delineation (ITCD) followed by feature extraction to detect a bark beetle disturbance in a mixed urban forest using a photogrammetric point cloud (PPC) and a multispectral orthomosaic. An excess green index (ExG) threshold mask was applied before the ITCD to separate the targeted coniferous trees from deciduous trees and background. The individual crowns of conifer trees were automatically delineated as (i) a full tree crown, using marker-controlled watershed segmentation (MCWS) or the Dalponte2016 (DAL) and Li 2012 (LI) region-growing algorithms, or (ii) a buffer (BUFFER) around a treetop from the masked PPC. We statistically compared selected spectral and elevation features extracted from the automatically delineated crowns (ADCs) of each method to reference tree crowns (RTC) to distinguish between the forest disturbance classes and two tree species. Moreover, the effect of PPC density on ITCD accuracy and feature extraction was investigated. Application of the ExG threshold mask resulted in excellent separability of the targeted conifer trees and increased shape similarity of ADCs relative to RTC. The results revealed a strong effect of PPC density on treetop detection and ITCD. If the PPC density is sufficient (>10 points/m2), the ADCs produced by the DAL, MCWS, and LI methods are comparable, and the extracted feature statistics of ADCs do not differ significantly from those of RTC. The BUFFER method is less suitable for detecting a bark beetle disturbance in mixed forest because of the simplicity of its crown delineation, which caused significant differences in extracted feature statistics compared to RTC. The point density was therefore found to be more important than the algorithm used. We conclude that automatic ITCD methods may substitute for time-consuming manual tree crown delineation in tree-based bark beetle disturbance detection and sanitation of individual infested trees, using the suggested methodology and high-density (>20 points/m2, 10 points/m2 minimum) PPC.
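A minimal sketch of an excess green (ExG) threshold mask of the kind used above to separate conifers from background follows; the 0.1 threshold on normalized chromaticities is an assumed value, not the study's calibrated one.

```python
# Sketch of an ExG vegetation mask over normalized RGB chromaticities;
# the threshold is an assumption for illustration.
import numpy as np

def exg_mask(rgb: np.ndarray, thresh: float = 0.1) -> np.ndarray:
    """rgb: HxWx3 float array in [0, 1]. Returns a boolean vegetation mask."""
    s = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / s for i in range(3))
    exg = 2 * g - r - b
    return exg > thresh

image = np.random.rand(100, 100, 3)
print(exg_mask(image).mean())  # fraction of pixels flagged as vegetation
```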
Collapse
|
25
|
Remote Sensing in Agriculture—Accomplishments, Limitations, and Opportunities. REMOTE SENSING 2020. [DOI: 10.3390/rs12223783] [Citation(s) in RCA: 46] [Impact Index Per Article: 9.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/18/2022]
Abstract
Remote sensing (RS) technologies provide a diagnostic tool that can serve as an early warning system, allowing the agricultural community to intervene early and counter potential problems before they spread widely and negatively impact crop productivity. With recent advancements in sensor technologies, data management and data analytics, several RS options are currently available to the agricultural community. However, the agricultural sector has yet to implement RS technologies fully, due to knowledge gaps on their sufficiency, appropriateness and techno-economic feasibility. This study reviewed the literature from 2000 to 2019 on the application of RS technologies in production agriculture, ranging from field preparation and planting through in-season applications to harvesting, with the objective of contributing to the scientific understanding of the potential for RS technologies to support decision-making at different production stages. We found an increasing trend in the use of RS technologies in agricultural production over the past 20 years, with a sharp increase in applications of unmanned aerial systems (UASs) after 2015. The largest number of scientific papers related to UASs originated from Europe (34%), followed by the United States (20%) and China (11%). Most prior RS studies have focused on soil moisture and in-season crop health monitoring, and fewer on areas such as soil compaction, subsurface drainage, and crop grain quality monitoring. In summary, the literature highlights that RS technologies can support site-specific management decisions at various stages of crop production, helping to optimize crop production while addressing environmental quality, profitability, and sustainability.
Collapse
|
26
|
A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An Application in the ‘Cipolla Rossa di Tropea’ (Italy). REMOTE SENSING 2020. [DOI: 10.3390/rs12203424] [Citation(s) in RCA: 27] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/11/2022]
Abstract
Precision agriculture (PA) is a management strategy that analyzes the spatial and temporal variability of agricultural fields using information and communication technologies, with the aim of optimizing profitability, sustainability, and the protection of agro-ecological services. In the context of PA, this research evaluated the reliability of multispectral (MS) imagery collected at different spatial resolutions by an unmanned aerial vehicle (UAV) and by the PlanetScope and Sentinel-2 satellite platforms in monitoring onion crops over three different dates. The soil adjusted vegetation index (SAVI) was used to monitor the vigor of the study field. The vigor maps from the two satellite platforms were then compared with those derived from the UAV by statistical analysis in order to evaluate the contribution made by each platform to onion crop monitoring. In addition, the two coverage classes of the field, bare soil and onions, were spatially identified using geographical object-based image classification (GEOBIA), and their spectral contributions were analyzed by comparing the SAVI calculated considering only crop pixels (SAVI onions) and the SAVI calculated considering only bare-soil pixels (SAVI soil) with the SAVI from the three platforms. The results showed that satellite imagery, coherent and correlated with UAV images, can be useful to assess the general conditions of the field, while the UAV permits discrimination of localized, circumscribed areas that the lower resolution of the satellites misses where field conditions are inhomogeneous owing to abiotic or biotic stresses.
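The SAVI used for vigor mapping has a standard closed form (Huete, 1988), shown below with the usual soil-brightness factor L = 0.5; the abstract does not state the study's exact parameterisation, so treat this as the textbook formula rather than the authors' code.

```python
# Standard soil adjusted vegetation index, SAVI = (1+L)(NIR-Red)/(NIR+Red+L).
import numpy as np

def savi(nir: np.ndarray, red: np.ndarray, L: float = 0.5) -> np.ndarray:
    return (1 + L) * (nir - red) / (nir + red + L)

nir = np.array([0.45, 0.50])  # assumed reflectance values
red = np.array([0.08, 0.10])
print(savi(nir, red))
```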
Collapse
|
27
|
Spray Deposition on Weeds (Palmer Amaranth and Morningglory) from a Remotely Piloted Aerial Application System and Backpack Sprayer. DRONES 2020. [DOI: 10.3390/drones4030059] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
This study was designed to determine whether a remotely piloted aerial application system (RPAAS) could be used in lieu of a backpack sprayer for post-emergence herbicide application. To this end, a spray mixture of tap water and fluorescent dye was applied to Palmer amaranth and ivyleaf morningglory using an RPAAS at 18.7 and 37.4 L·ha−1 and a CO2-pressurized backpack sprayer at a 140 L·ha−1 spray application rate. Spray efficiency (the proportion of applied spray collected on an artificial sampler) for the RPAAS treatments was comparable to that for the backpack sprayer. Fluorescent spray droplet density was significantly higher on the adaxial leaf surface for the backpack sprayer treatment than for the RPAAS platforms. The percentage of spray droplets on the abaxial surface for the RPAAS aircraft at 37.4 L·ha−1 was 4-fold greater than that for the backpack sprayer at 140 L·ha−1. The increased spray deposition on the abaxial leaf surfaces was likely caused by rotor downwash and wind turbulence generated by the RPAAS, which caused leaf fluttering. This improved spray deposition may help increase the efficacy of contact herbicides. Test results indicated that RPAASs may be used for herbicide application in lieu of conventional backpack sprayers.
Collapse
|
28
|
RGB Image-Derived Indicators for Spatial Assessment of the Impact of Broadleaf Weeds on Wheat Biomass. REMOTE SENSING 2020. [DOI: 10.3390/rs12182982] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
Abstract
In precision agriculture, the development of proximal imaging systems embedded in autonomous vehicles allows new weed management strategies for site-specific plant application to be explored. Accurate monitoring of weeds while tracking wheat growth requires indirect measurements of leaf area index (LAI) and above-ground dry matter biomass (BM) at early growth stages. This article explores the potential of RGB images to assess crop-weed competition in a wheat (Triticum aestivum L.) crop by generating two new indicators: the weed pressure (WP) and the local wheat biomass production (δBMc). The fractional vegetation cover (FVC) of the crop and the weeds was automatically determined from the images with an SVM-RBF classifier using bag-of-visual-words vectors as inputs, based on a new vegetation index called the MetaIndex, defined as a vote of six indices widely used in the literature. Beyond a simple map of weed infestation, the WP map describes the crop-weed competition. The δBMc map, meanwhile, evaluates local wheat above-ground biomass production and signals potential stress. It is generated from the wheat FVC because the FVC is highly correlated with LAI (r2 = 0.99) and BM (r2 = 0.93) obtained by destructive methods. By combining these two indicators, we aim to determine whether wheat stress originates from weeds or not. This approach opens up new perspectives for monitoring weeds and their competition during crop growth with non-destructive, proximal sensing technologies at early stages of development.
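The MetaIndex is described as a vote over six widely used vegetation indices. The sketch below illustrates the voting idea with six common RGB indices and plausible thresholds; these specific index choices and thresholds are assumptions, since the abstract does not list them.

```python
# Hedged sketch of a "vote of indices" vegetation mask in the spirit of the
# MetaIndex; indices and thresholds are common literature choices, assumed.
import numpy as np

def vote_mask(r, g, b, min_votes=4):
    s = r + g + b + 1e-9
    rn, gn, bn = r / s, g / s, b / s
    indices = {
        "ExG":   2 * gn - rn - bn > 0.05,
        "ExGR":  (2 * gn - rn - bn) - (1.4 * rn - gn) > 0.0,
        "GLI":   (2 * g - r - b) / (2 * g + r + b + 1e-9) > 0.02,
        "VARI":  (g - r) / (g + r - b + 1e-9) > 0.0,
        "RGRI":  r / (g + 1e-9) < 1.0,
        "NGRDI": (g - r) / (g + r + 1e-9) > 0.0,
    }
    votes = np.sum(np.stack(list(indices.values())), axis=0)
    return votes >= min_votes  # a pixel is vegetation if most indices agree

rgb = np.random.rand(3, 64, 64)
print(vote_mask(*rgb).mean())
```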
Collapse
|
29
|
Detection of Italian Ryegrass in Wheat and Prediction of Competitive Interactions Using Remote-Sensing and Machine-Learning Techniques. REMOTE SENSING 2020. [DOI: 10.3390/rs12182977] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
Abstract
Italian ryegrass (Lolium perenne ssp. multiflorum (Lam) Husnot) is a troublesome weed species in wheat (Triticum aestivum) production in the United States, severely affecting grain yields. Spatial mapping of ryegrass infestation in wheat fields and early prediction of its impact on yield can assist management decision-making. In this study, unmanned aerial system (UAS)-based red, green and blue (RGB) imagery acquired at an early wheat growth stage at two different experimental sites was used to develop predictive models. Deep neural networks (DNNs) coupled with an extensive feature selection method were used to detect ryegrass in wheat and estimate ryegrass canopy coverage. Predictive models were developed by regressing early-season ryegrass canopy coverage (%) against end-of-season (at wheat maturity) biomass and seed yield of ryegrass, as well as biomass and grain yield reduction (%) of wheat. Italian ryegrass was detected with high accuracy (precision = 95.44 ± 4.27%, recall = 95.48 ± 5.05%, F-score = 95.56 ± 4.11%) using the best model, which included four features: hue, saturation, excess green index, and visible atmospherically resistant index. End-of-season ryegrass biomass was predicted with high accuracy (R2 = 0.87), whereas the other variables had moderate to high accuracy (R2 of 0.74 for ryegrass seed yield, 0.73 for wheat biomass reduction, and 0.69 for wheat grain yield reduction). The methodology demonstrated in this study shows great potential for mapping and quantifying ryegrass infestation and predicting its competitive response in wheat, allowing for timely management decisions.
Collapse
|
30
|
A Deep Learning Approach for Weed Detection in Lettuce Crops Using Multispectral Images. AGRIENGINEERING 2020. [DOI: 10.3390/agriengineering2030032] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
Abstract
Weed management is one of the most important aspects of crop productivity; knowing the amount and locations of weeds has been a problem that experts have faced for several decades. This paper presents three methods for weed estimation based on deep learning image processing in lettuce crops and compares them to visual estimations by experts. The first method is based on support vector machines (SVM) using histograms of oriented gradients (HOG) as the feature descriptor. The second method is based on YOLOv3 (you only look once, v3), taking advantage of its robust architecture for object detection, and the third is based on Mask R-CNN (region-based convolutional neural network), which provides an instance segmentation for each individual plant. These methods were complemented with an NDVI (normalized difference vegetation index) background subtractor for removing non-photosynthetic objects. According to the chosen metrics, the machine and deep learning methods had F1-scores of 88%, 94%, and 94%, respectively, for crop detection. Detected crops were then turned into a binary mask and combined with the NDVI background subtractor to detect weeds indirectly. Once the weed image was obtained, the coverage percentage of weeds was calculated by classical image processing methods. Finally, these performances were compared with the estimations of a set of weed experts through a Bland–Altman plot, intraclass correlation coefficients (ICCs) and Dunn's test to obtain statistical measurements between each machine-human estimation pair; we found that these methods improve accuracy in weed coverage estimation and minimize subjectivity in human-estimated data.
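The indirect weed-detection step described here (vegetation by NDVI threshold, minus the detected crop mask) reduces to a few lines of array logic; the NDVI threshold below is an assumption, not the paper's value.

```python
# Sketch of indirect weed-coverage estimation: vegetation detected by an
# NDVI threshold that is not covered by the crop mask is counted as weed.
import numpy as np

def weed_coverage(ndvi: np.ndarray, crop_mask: np.ndarray,
                  ndvi_thresh: float = 0.3) -> float:
    vegetation = ndvi > ndvi_thresh   # NDVI background subtraction
    weeds = vegetation & ~crop_mask   # vegetation that is not crop
    return 100.0 * weeds.mean()       # coverage in percent of the image

ndvi = np.random.uniform(-0.2, 0.9, (64, 64))
crop = np.zeros((64, 64), dtype=bool)
crop[20:40, 20:40] = True             # detected crop instances (assumed)
print(weed_coverage(ndvi, crop))
```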
Collapse
|
31
|
Librán-Embid F, Klaus F, Tscharntke T, Grass I. Unmanned aerial vehicles for biodiversity-friendly agricultural landscapes - A systematic review. THE SCIENCE OF THE TOTAL ENVIRONMENT 2020; 732:139204. [PMID: 32438190 DOI: 10.1016/j.scitotenv.2020.139204] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/29/2019] [Revised: 04/28/2020] [Accepted: 05/02/2020] [Indexed: 06/11/2023]
Abstract
The development of biodiversity-friendly agricultural landscapes is of major importance to meet the sustainable development challenges of our time. The emergence of unmanned aerial vehicles (UAVs), i.e. drones, has opened a new set of research and management opportunities to achieve this goal. This review summarizes UAV applications in agricultural landscapes, focusing on biodiversity conservation and agricultural land monitoring, based on a systematic review of the literature that yielded 550 studies. Additionally, the review proposes how to integrate UAV research in these fields and points to new potential applications that may contribute to biodiversity-friendly agricultural landscapes. UAV-based imagery can be used to identify and monitor plants, floral resources and animals, facilitating the detection of quality habitats with high prediction power. Through vegetation indices derived from their sensors, UAVs can estimate biomass, monitor crop plant health and stress, detect pest or pathogen infestations, monitor soil fertility and target patches of high weed or invasive plant pressure, allowing precise management practices and reduced agrochemical input. Thereby, UAVs are helping to design biodiversity-friendly agricultural landscapes and to mitigate yield-biodiversity trade-offs. In conclusion, UAV applications have become a major means of biodiversity conservation and biodiversity-friendly management in agriculture, while the latest developments, such as the miniaturization and decreasing costs of hyperspectral sensors, promise many new applications for the future.
Collapse
Affiliation(s)
| | - Felix Klaus
- Agroecology, University of Göttingen, D-37077 Göttingen, Germany
| | - Teja Tscharntke
- Agroecology, University of Göttingen, D-37077 Göttingen, Germany
| | - Ingo Grass
- Department of Ecology of Tropical Agricultural Systems, University of Hohenheim, D-70599 Stuttgart, Germany
| |
Collapse
|
32
|
Delineation of Crop Field Areas and Boundaries from UAS Imagery Using PBIA and GEOBIA with Random Forest Classification. REMOTE SENSING 2020. [DOI: 10.3390/rs12162640] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Unmanned aircraft systems (UAS) have proven to be cost- and time-effective remote-sensing platforms for precision agriculture applications. This study presents a method for the automatic delineation of field areas and boundaries that uses UAS multispectral orthomosaics acquired over seven vegetated fields with a variety of crops in Prince Edward Island (PEI). This information is needed by crop insurance agencies and growers for the accurate determination of crop insurance premiums. The field areas and boundaries were delineated by applying both a pixel-based and an object-based supervised random forest (RF) classifier to reflectance and vegetation index images, followed by a vectorization pipeline. Both methodologies performed exceptionally well, resulting in a mean area goodness of fit (AGoF) greater than 98% for the field areas and a mean boundary mean positional error (BMPE) lower than 0.8 m for the seven surveyed fields.
Collapse
|
33
|
Comparison of Object Detection and Patch-Based Classification Deep Learning Models on Mid- to Late-Season Weed Detection in UAV Imagery. REMOTE SENSING 2020. [DOI: 10.3390/rs12132136] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Mid- to late-season weeds that escape routine early-season weed management threaten agricultural production by creating a large number of seeds for several future growing seasons. Rapid and accurate detection of weed patches in the field is the first step of site-specific weed management. In this study, object detection-based convolutional neural network models were trained and evaluated on low-altitude unmanned aerial vehicle (UAV) imagery for mid- to late-season weed detection in soybean fields. Two object detection models, Faster RCNN and the Single Shot Detector (SSD), were evaluated and compared in terms of weed detection performance using mean Intersection over Union (IoU) and inference speed. The Faster RCNN model with 200 box proposals had weed detection performance similar to the SSD model in terms of precision, recall, F1 score, and IoU, as well as a similar inference time. The precision, recall, F1 score and IoU were 0.65, 0.68, 0.66 and 0.85 for Faster RCNN with 200 proposals, and 0.66, 0.68, 0.67 and 0.84 for SSD, respectively. However, the optimal confidence threshold of the SSD model was found to be much lower than that of the Faster RCNN model, which indicated that SSD may have lower generalization performance than Faster RCNN for mid- to late-season weed detection in soybean fields using UAV imagery. The performance of the object detection models was also compared with a patch-based CNN model. The Faster RCNN model yielded better weed detection performance than the patch-based CNN with and without overlap. The inference time of Faster RCNN was similar to that of the patch-based CNN without overlap, but significantly less than that of the patch-based CNN with overlap. Hence, Faster RCNN was found to be the best model in terms of weed detection performance and inference time among the models compared in this study. This work is important for understanding the potential of, and identifying the algorithms for, on-farm, near real-time weed detection and management.
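The IoU metric used to score both detectors has a standard definition over axis-aligned boxes, sketched below; this is the generic formula, not code from the study.

```python
# Plain IoU helper; boxes are (x1, y1, x2, y2) in pixel coordinates.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 = 0.1428...
```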
Collapse
|
34
|
Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery. AGRIENGINEERING 2020. [DOI: 10.3390/agriengineering2020024] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Abstract
In recent years, Unmanned Aerial Systems (UAS) have emerged as an innovative technology for providing spatio-temporal information about weed species in crop fields, a critical input for any site-specific weed management program. A multi-rotor UAS (Phantom 4) equipped with an RGB sensor was used to collect imagery in three bands (Red, Green, and Blue; 0.8 cm/pixel resolution) with the objectives of (a) mapping weeds in cotton and (b) determining the relationship between image-based weed coverage and ground-based weed densities. For weed mapping, three weed density levels (high, medium, and low) were established for a mix of different weed species, with three replications. To determine weed densities through ground truthing, five quadrats (1 m × 1 m) were laid out in each plot. The aerial imagery was preprocessed and subjected to the Hough transform to delineate cotton rows. Following the separation of inter-row vegetation from crop rows, a multi-level classification coupled with machine learning algorithms was used to distinguish intra-row weeds from cotton. Overall accuracies of 89.16%, 85.83%, and 83.33% and kappa values of 0.84, 0.79, and 0.75 were achieved for detecting weed occurrence in high, medium, and low density plots, respectively. Further, ground-truthed overall weed density values were well correlated (r2 = 0.80) with image-based weed coverage assessments. Among the weed species evaluated, Palmer amaranth (Amaranthus palmeri S. Watson) showed the highest correlation (r2 = 0.91), followed by red sprangletop (Leptochloa mucronata Michx) (r2 = 0.88). The results highlight the utility of UAS-borne RGB imagery for weed mapping and density estimation in cotton for precision weed management.
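A hedged OpenCV sketch of Hough-based crop-row delineation, as outlined in this abstract, follows; the input path and the Canny/Hough parameters are placeholders that would need tuning to 0.8 cm/pixel imagery.

```python
# Crop-row delineation sketch: vegetation mask -> edges -> Hough segments.
# "cotton_plot.png" is a hypothetical input; parameters are illustrative.
import cv2
import numpy as np

img = cv2.imread("cotton_plot.png")
exg = 2 * img[..., 1].astype(np.int16) - img[..., 2] - img[..., 0]
veg = (exg > 20).astype(np.uint8) * 255      # rough vegetation mask
edges = cv2.Canny(veg, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=200, maxLineGap=30)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:       # candidate crop-row segments
        cv2.line(img, (x1, y1), (x2, y2), (0, 0, 255), 2)
cv2.imwrite("rows_overlay.png", img)
```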
Collapse
|
35
|
Automated Identification of Crop Tree Crowns from UAV Multispectral Imagery by Means of Morphological Image Analysis. REMOTE SENSING 2020. [DOI: 10.3390/rs12050748] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Within the context of precision agriculture, goods insurance, public subsidies, fire damage assessment, etc., accurate knowledge of the plant population in crops represents valuable information. In this regard, the use of Unmanned Aerial Vehicles (UAVs) has proliferated as an alternative to traditional plant counting methods, which are laborious, time-consuming and prone to human error. Hence, a methodology for the automated detection, geolocation and counting of crop trees in intensive cultivation orchards from high-resolution multispectral images acquired by UAV-based aerial imaging is proposed. After image acquisition, the captures are processed by means of photogrammetry to yield a 3D point cloud-based representation of the study plot. To exploit the elevation information contained in it and eventually identify the plants, the cloud is deterministically interpolated and subsequently transformed into a greyscale image. This image is processed using mathematical morphology techniques in such a way that the absolute height of the trees with respect to their local surroundings is exploited to segment the tree pixel-regions by global statistical thresholding binarization. This approach makes the segmentation process robust against surfaces with elevation variations of any magnitude and against possible distracting artefacts with heights lower than expected. Finally, the segmented image is analysed by means of an ad hoc moment representation-based algorithm to estimate the location of the trees. The methodology was tested in an intensive olive orchard of 17.5 ha with a population of 3919 trees. Because of the plot's plant density and tree spacing pattern, typical of intensive plantations, many occurrences of intra-row tree aggregations were observed, increasing the complexity of the scenario under study. Nevertheless, a precision of 99.92%, a sensitivity of 99.67% and an F-score of 99.75% were achieved, correctly identifying and geolocating 3906 plants. The generated 3D point cloud reported root-mean-square errors (RMSE) in the X, Y and Z directions of 0.73 m, 0.39 m and 1.20 m, respectively. These results support the viability and robustness of this methodology as a phenotyping solution for automated plant counting and geolocation in olive orchards.
Collapse
|
36
|
Mapping Cynodon Dactylon Infesting Cover Crops with an Automatic Decision Tree-OBIA Procedure and UAV Imagery for Precision Viticulture. REMOTE SENSING 2019. [DOI: 10.3390/rs12010056] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
Abstract
The establishment and management of cover crops are common practices widely used in irrigated viticulture around the world, as they bring great benefits, not only protecting and improving the soil but also controlling vine vigor and improving yield quality, among others. However, these benefits are often reduced when cover crops are infested by Cynodon dactylon (bermudagrass), which impacts crop production through competition for water and nutrients and causes important economic losses for winegrowers. Discrimination of Cynodon dactylon in cover crops would therefore enable site-specific control and drastically mitigate damage to the vineyard. In this context, this research proposes a novel, automatic and robust image analysis algorithm for the quick and accurate mapping of Cynodon dactylon growing in vineyard cover crops. The algorithm was developed using aerial images taken with an Unmanned Aerial Vehicle (UAV) and combined decision tree (DT) and object-based image analysis (OBIA) approaches. The relevance of this work lies in dealing with the constraint imposed by the spectral similarity within these complex scenarios formed by vines, cover crops, Cynodon dactylon, and bare soil. The incorporation of height information from the Digital Surface Model and of several features selected by machine learning tools into the DT-OBIA algorithm overcame this spectral similarity limitation and allowed precise Cynodon dactylon maps to be produced. Another contribution of this work is the short time needed for the full process, from UAV flights to image analysis, which enables useful maps to be created on demand (within two days of the farmer's request) and is thus timely for controlling Cynodon dactylon within the herbicide application window. This combination of UAV imagery and a DT-OBIA algorithm would therefore allow winegrowers to apply site-specific control of Cynodon dactylon and maintain cover crop-based management systems and their consequent benefits in the vineyards, while also complying with the European legal framework for the sustainable use of agricultural inputs and the implementation of integrated crop management.
Collapse
|
37
|
Sentinel-2 Validation for Spatial Variability Assessment in Overhead Trellis System Viticulture Versus UAV and Agronomic Data. REMOTE SENSING 2019. [DOI: 10.3390/rs11212573] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
Abstract
Several remote sensing technologies have been tested in precision viticulture to characterize vineyard spatial variability, from traditional aircraft and satellite platforms to recent unmanned aerial vehicles (UAVs). Imagery processing is still a challenge due to the traditional row-based architecture, where the inter-row soil produces a high, or even dominant, share of mixed pixels. In this case, UAV images combined with filtering techniques make it possible to analyze pure canopy pixels, and they were used here to benchmark the effectiveness of Sentinel-2 (S2) on overhead training systems. At harvest time, UAV filtered and unfiltered images and ground sampling data were used to validate the correlation between S2 normalized difference vegetation indices (NDVIs) and vegetative and productive parameters in two vineyards (V1 and V2). Regarding the UAV vs. S2 NDVI comparison, in both vineyards satellite data showed a high correlation with both unfiltered and filtered UAV images (mean R2 of 0.80 for V1 and 0.60 for V2). Correlations between ground data and the NDVIs of the remote sensing platforms were strong for yield and biomass in both vineyards (R2 from 0.60 to 0.95). These results demonstrate the effectiveness of the spatial resolution provided by S2 for overhead trellis system viticulture, promoting precision viticulture also in areas that are currently managed without the support of innovative technologies.
Collapse
|
38
|
Dash JP, Watt MS, Paul TSH, Morgenroth J, Hartley R. Taking a closer look at invasive alien plant research: A review of the current state, opportunities, and future directions for UAVs. Methods Ecol Evol 2019. [DOI: 10.1111/2041-210x.13296] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/25/2022]
Affiliation(s)
- Jonathan P. Dash
- Scion Rotorua New Zealand
- School of Forestry University of Canterbury Christchurch New Zealand
| | | | | | - Justin Morgenroth
- School of Forestry University of Canterbury Christchurch New Zealand
| | | |
Collapse
|
39
|
Lambert JPT, Childs DZ, Freckleton RP. Testing the ability of unmanned aerial systems and machine learning to map weeds at subfield scales: a test with the weed Alopecurus myosuroides (Huds). PEST MANAGEMENT SCIENCE 2019; 75:2283-2294. [PMID: 30972939 PMCID: PMC6767585 DOI: 10.1002/ps.5444] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 11/28/2018] [Revised: 04/03/2019] [Accepted: 04/08/2019] [Indexed: 06/09/2023]
Abstract
BACKGROUND It is important to map agricultural weed populations to improve management and maintain future food security. Advances in data collection and statistical methodology have created new opportunities to aid in the mapping of weed populations. We set out to apply these new methodologies (unmanned aerial systems; UAS) and statistical techniques (convolutional neural networks; CNN) to the mapping of black-grass, a highly impactful weed in wheat fields in the UK. We tested this by undertaking extensive UAS and field-based mapping over the course of 2 years, in total collecting multispectral image data from 102 fields, of which 76 provided informative data. We used these data to construct a vegetation index (VI), which we used to train a custom CNN model from scratch. We undertook a suite of data engineering techniques, such as balancing and cleaning, to optimize performance of our metrics. We also investigated the transferability of the models from one field to another. RESULTS The results show that our data collection methodology and implementation of CNN outperform previous approaches in the literature. We show that data engineering to account for 'artefacts' in the image data increases our metrics significantly. We were not able to identify any traits shared between fields that result in high scores in our novel leave-one-field-out cross-validation (LOFO-CV) tests. CONCLUSION We conclude that this evaluation procedure is a better estimation of real-world predictive value when compared with past studies. We conclude that by engineering the image data set into discrete classes of data quality we increase the prediction accuracy from the baseline model by 5% to an area under the curve (AUC) of 0.825. We find that the temporal effects studied here have no effect on our ability to model weed densities. © 2019 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
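The leave-one-field-out evaluation can be expressed with scikit-learn's LeaveOneGroupOut splitter, as sketched below with a stand-in classifier; the paper's custom CNN and VI features are not reproduced here, and the synthetic data are placeholders.

```python
# Leave-one-field-out cross-validation sketch: each field is held out in
# turn, so scores estimate transfer to unseen fields.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneGroupOut

np.random.seed(0)
X = np.random.rand(600, 8)               # per-patch VI features (assumed)
y = np.random.randint(0, 2, 600)         # black-grass present / absent
fields = np.random.randint(0, 10, 600)   # field ID for each sample

aucs = []
for train, test in LeaveOneGroupOut().split(X, y, groups=fields):
    clf = RandomForestClassifier(n_estimators=100).fit(X[train], y[train])
    aucs.append(roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1]))
print(np.mean(aucs))
```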
Collapse
Affiliation(s)
- James PT Lambert
- Department of Animal & Plant ScienceUniversity of SheffieldSheffieldU.K.
| | - Dylan Z Childs
- Department of Animal & Plant ScienceUniversity of SheffieldSheffieldU.K.
| | - Rob P Freckleton
- Department of Animal & Plant ScienceUniversity of SheffieldSheffieldU.K.
| |
Collapse
|
40
|
Jiménez-Brenes FM, López-Granados F, Torres-Sánchez J, Peña JM, Ramírez P, Castillejo-González IL, de Castro AI. Automatic UAV-based detection of Cynodon dactylon for site-specific vineyard management. PLoS One 2019; 14:e0218132. [PMID: 31185068 PMCID: PMC6559662 DOI: 10.1371/journal.pone.0218132] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2019] [Accepted: 05/25/2019] [Indexed: 11/19/2022] Open
Abstract
The perennial and stoloniferous weed Cynodon dactylon (L.) Pers. (bermudagrass) is a serious problem in vineyards. The spectral similarity between bermudagrass and grapevines makes discrimination of the two species based solely on spectral information from a multi-band imaging sensor unfeasible. However, that challenge can be overcome by the use of object-based image analysis (OBIA) and ultra-high spatial resolution Unmanned Aerial Vehicle (UAV) images. This research aimed to map bermudagrass automatically, accurately, and rapidly, and to design maps for its management. Aerial images of two vineyards were captured using two multispectral cameras (RGB and RGNIR) attached to a UAV. First, spectral analysis was performed to select the optimum vegetation index (VI) for discriminating bermudagrass from bare soil. Then, the VI-based OBIA algorithm developed for each camera automatically mapped the grapevines, bermudagrass, and bare soil (accuracies greater than 97.7%). Finally, site-specific management maps were generated. Combining UAV imagery with a robust OBIA algorithm allowed the automatic mapping of bermudagrass. Analysis of the classified area made it possible to quantify grapevine growth and revealed the expansion of bermudagrass-infested areas. The generated bermudagrass maps could help farmers improve weed control through a well-programmed strategy. The developed OBIA algorithm therefore offers valuable geo-spatial information for designing site-specific bermudagrass management strategies, potentially enabling farmers to reduce herbicide use as well as optimize fuel, field operating time, and costs.
Collapse
Affiliation(s)
- Francisco Manuel Jiménez-Brenes
- Crop Protection Department, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), Córdoba, Spain
| | - Francisca López-Granados
- Crop Protection Department, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), Córdoba, Spain
| | - Jorge Torres-Sánchez
- Crop Protection Department, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), Córdoba, Spain
| | - José Manuel Peña
- Plant Protection Department, Institute of Agricultural Sciences (ICA), Spanish National Research Council (CSIC), Madrid, Spain
| | - Pilar Ramírez
- Crop Production Department, Andalusian Institute of Agricultural and Fisheries Research and Training (IFAPA), Cabra, Córdoba, Spain
| | | | - Ana Isabel de Castro
- Crop Protection Department, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), Córdoba, Spain
| |
Collapse
|
41
|
Methodological Ambiguity and Inconsistency Constrain Unmanned Aerial Vehicles as A Silver Bullet for Monitoring Ecological Restoration. REMOTE SENSING 2019. [DOI: 10.3390/rs11101180] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
The last decade has seen an exponential increase in the application of unmanned aerial vehicles (UAVs) to ecological monitoring research, though with little standardisation or comparability in methodological approaches and research aims. We reviewed the international peer-reviewed literature to explore the potential limitations on the feasibility of UAV use in monitoring ecological restoration, and examined how these might be mitigated to maximise the quality, reliability and comparability of UAV-generated data. We found little evidence of translational research applying UAV-based approaches to ecological restoration, with less than 7% of 2133 published UAV monitoring studies centred on ecological restoration. Of the 48 studies, more than 65% had been published in the three years preceding this study. Where studies utilised UAVs for rehabilitation or restoration applications, there was a strong propensity for single-sensor monitoring using commercially available RPAs fitted with modest-resolution RGB sensors. There was a strong positive correlation between the use of complex and expensive sensors (e.g., LiDAR, thermal cameras, hyperspectral sensors) and the complexity of the chosen image classification techniques (e.g., machine learning), suggesting that cost remains a primary constraint on the wide application of multiple or complex sensors in UAV-based research. We propose that if UAV-acquired data are to represent the future of ecological monitoring, research requires (a) consistency in the proven application of different platforms and sensors to the monitoring of target landforms, organisms and ecosystems, underpinned by clearly articulated monitoring goals and outcomes; (b) optimization of data analysis techniques and of the manner in which data are reported, undertaken in cross-disciplinary partnership with fields such as bioinformatics and machine learning; and (c) the development of sound, reasonable and multilaterally homogeneous regulatory and policy frameworks supporting the application of UAVs to the large-scale and potentially trans-disciplinary ecological applications of the future.
Collapse
|
42
|
Comparison of Unsupervised Algorithms for Vineyard Canopy Segmentation from UAV Multispectral Images. REMOTE SENSING 2019. [DOI: 10.3390/rs11091023] [Citation(s) in RCA: 30] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Technical resources are currently supporting and enhancing the ability of precision agriculture techniques in crop management. The accuracy of prescription maps is a key aspect of ensuring fast and targeted intervention. In this context, remote sensing acquisition by unmanned aerial vehicles (UAV) is one of the most advanced platforms for collecting imagery of the field. Besides imagery acquisition, canopy segmentation among soil, plants and shadows is another practical and technical step that must be fast and precise to ensure targeted intervention. In this paper, algorithms to be applied to UAV imagery are proposed according to the sensor used, which may be either visible-spectrum or multispectral. These algorithms, called HSV-based (Hue, Saturation, Value), DEM (Digital Elevation Model) and K-means, are unsupervised, i.e., they perform canopy segmentation without human support. They were tested and compared in three different scenarios obtained from two vineyards over two years, 2017 and 2018, for RGB (Red-Green-Blue) and NRG (Near Infrared-Red-Green) imagery. Particular attention is given to the unsupervised ability of these algorithms to identify vines under these different acquisition conditions. This ability is quantified by introducing over- and under-estimation indexes, which measure the algorithms' tendency to over-estimate or under-estimate vine canopies. For RGB imagery, the HSV-based algorithms consistently over-estimate vines and never under-estimate them; the K-means and DEM methods show a similar trend of under-estimation. For NRG imagery, HSV is the more stable algorithm and the DEM model slightly over-estimates the vines. The HSV-based and DEM algorithms have comparable computation times, while the computational demand of the K-means algorithm increases as the quality of the DEM decreases. The algorithms developed can isolate canopy vegetation data, which provide useful information about the current vineyard state and can be efficiently applied within crop management procedures in precision viticulture applications.
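Of the three unsupervised segmenters compared, the K-means variant is the simplest to sketch: cluster pixels in a colour space and pick the canopy cluster by a heuristic. The cluster count, the HSV colour space, and the "most saturated cluster" rule below are assumptions, not the authors' exact algorithm.

```python
# Minimal unsupervised canopy segmentation: cluster HSV pixels into three
# groups (soil, shadow, canopy) and keep the most saturated cluster.
# "vineyard_rgb.png" is a hypothetical input path.
import cv2
import numpy as np
from sklearn.cluster import KMeans

img = cv2.imread("vineyard_rgb.png")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).reshape(-1, 3).astype(float)
labels = KMeans(n_clusters=3, n_init=10).fit_predict(hsv)

means = [hsv[labels == k].mean(axis=0) for k in range(3)]
vine_k = int(np.argmax([m[1] for m in means]))  # crude rule: max saturation
mask = (labels == vine_k).reshape(img.shape[:2]).astype(np.uint8) * 255
cv2.imwrite("canopy_mask.png", mask)
```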
Collapse
|
43
|
del-Campo-Sanchez A, Ballesteros R, Hernandez-Lopez D, Ortega JF, Moreno MA, on behalf of Agroforestry and Cartography Precision Research Group. Quantifying the effect of Jacobiasca lybica pest on vineyards with UAVs by combining geometric and computer vision techniques. PLoS One 2019; 14:e0215521. [PMID: 31009493 PMCID: PMC6476504 DOI: 10.1371/journal.pone.0215521] [Citation(s) in RCA: 20] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2018] [Accepted: 04/03/2019] [Indexed: 11/18/2022] Open
Abstract
With increasing competitiveness in the vine market, coupled with the increasing need for sustainable use of resources, strategies for improving farm management are essential. One such effective strategy is the implementation of precision agriculture techniques. Using photogrammetric techniques, the digitalization of farms based on images acquired from unmanned aerial vehicles (UAVs) provides information that can assist in the improvement of farm management and decision-making processes. The objective of the present work is to quantify the impact of the pest Jacobiasca lybica on vineyards and to develop representative cartography of the severity of the infestation. To accomplish this, computer vision algorithms based on an ANN (artificial neural network) combined with geometric techniques were applied to geomatic products generated from consumer-grade cameras in the visible spectrum. The results showed that the combination of geometric and computer vision techniques with geomatic products generated from conventional RGB (red, green, blue) images improved image segmentation into affected vegetation, healthy vegetation and ground. Thus, the proposed methodology using low-cost cameras is a more cost-effective application of UAVs than using multispectral cameras. Moreover, the proposed method increases the accuracy of determining the impact of pests by eliminating soil effects.
Collapse
Affiliation(s)
- Ana del-Campo-Sanchez
- Agroforestry and Cartography Precision Research Group, Institute for Regional Development, University of Castilla—La Mancha, Albacete, Spain
| | - Rocio Ballesteros
- Agroforestry and Cartography Precision Research Group, Institute for Regional Development, University of Castilla—La Mancha, Albacete, Spain
| | - David Hernandez-Lopez
- Agroforestry and Cartography Precision Research Group, Institute for Regional Development, University of Castilla—La Mancha, Albacete, Spain
| | - J. Fernando Ortega
- Agroforestry and Cartography Precision Research Group, Institute for Regional Development, University of Castilla—La Mancha, Albacete, Spain
| | - Miguel A. Moreno
- Agroforestry and Cartography Precision Research Group, Institute for Regional Development, University of Castilla—La Mancha, Albacete, Spain
| | | |
Collapse
|
44
|
Comparison of Satellite and UAV-Based Multispectral Imagery for Vineyard Variability Assessment. REMOTE SENSING 2019. [DOI: 10.3390/rs11040436] [Citation(s) in RCA: 98] [Impact Index Per Article: 16.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
In agriculture, remotely sensed data play a crucial role in providing valuable information on crop and soil status for effective management. Several spectral indices have proven to be valuable tools in describing crop spatial and temporal variability. In this paper, a detailed analysis and comparison of vineyard multispectral imagery, provided by a decametric-resolution satellite and a low-altitude Unmanned Aerial Vehicle (UAV) platform, is presented. The effectiveness of Sentinel-2 imagery and of high-resolution UAV aerial images was evaluated by considering the well-known relation between the Normalised Difference Vegetation Index (NDVI) and crop vigour. After pre-processing, the UAV data were compared with the satellite imagery by computing three different NDVI indices to properly analyse the unbundled spectral contribution of the different elements in the vineyard environment, considering: (i) the whole cropland surface; (ii) only the vine canopies; and (iii) only the inter-row terrain. The results show that the raw decametric-resolution satellite imagery could not be used directly to reliably describe vineyard variability: the contribution of inter-row surfaces to the remotely sensed dataset may affect the NDVI computation, leading to biased crop descriptors. On the contrary, vigour maps computed from the UAV imagery, considering only the pixels representing crop canopies, proved to be more closely related to the in-field assessment than the satellite imagery. The proposed method may be extended to other crop typologies grown in rows or without intensive layout, where crop canopies do not cover the whole surface or where the presence of weeds is significant.
Collapse
|
45
|
Li B, Xu X, Han J, Zhang L, Bian C, Jin L, Liu J. The estimation of crop emergence in potatoes by UAV RGB imagery. PLANT METHODS 2019; 15:15. [PMID: 30792752 PMCID: PMC6371461 DOI: 10.1186/s13007-019-0399-7] [Citation(s) in RCA: 39] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/26/2018] [Accepted: 01/31/2019] [Indexed: 05/19/2023]
Abstract
BACKGROUND Crop emergence and canopy cover are important physiological traits for potato (Solanum tuberosum L.) cultivar evaluation and nutrient management. They play important roles in variety screening, field management and yield prediction. Traditional manual assessment of these traits is not only laborious but often subjective. RESULTS In this study, semi-automated image analysis software was developed to estimate crop emergence from high-resolution RGB ortho-images captured from an unmanned aerial vehicle (UAV). Potato plant objects were extracted from bare soil using the Excess Green Index and Otsu thresholding methods. Six morphological features were calculated from the images as variables for a Random Forest classifier estimating the number of potato plants at the emergence stage. The outputs were then used to estimate crop emergence in three field experiments designed to investigate the effects of cultivars, levels of potassium (K) fertiliser input, and new compound fertilisers on potato growth. The results indicated that RGB UAV image analysis can accurately estimate potato crop emergence rate in comparison to manual assessment, with a correlation coefficient (r2) of 0.96, and provides an efficient tool to evaluate emergence uniformity. CONCLUSIONS The proposed UAV image analysis method is a promising high-throughput phenotyping method for assessing potato crop development at the emergence stage. It can also facilitate future studies on optimizing fertiliser management and improving emergence consistency.
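The emergence pipeline outlined here (ExG, Otsu thresholding, morphological features into a Random Forest) can be sketched up to the counting step as follows; the input path and the minimum-area filter are assumptions, and the Random Forest stage is only indicated, not trained.

```python
# Sketch: ExG -> Otsu threshold -> connected components as plant candidates.
# "potato_ortho_tile.png" is a hypothetical input path.
import cv2
import numpy as np

img = cv2.imread("potato_ortho_tile.png").astype(np.int16)
exg = 2 * img[..., 1] - img[..., 2] - img[..., 0]
exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
_, binary = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

n, _, stats, _ = cv2.connectedComponentsWithStats(binary)
# Morphological features per component (area, extent, ...) would feed the
# Random Forest that resolves merged plants; here we only count components.
plants = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 50]
print(len(plants), "plant candidates")
```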
Collapse
Affiliation(s)
- Bo Li
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
- NIAB EMR, New Road, East Malling, Kent, ME19 4BD UK
| | - Xiangming Xu
- NIAB EMR, New Road, East Malling, Kent, ME19 4BD UK
| | - Jiwan Han
- Institute of Biological, Environmental and Rural Sciences (IBERS), Aberystwyth University, Penglais, Aberystwyth, Ceredigion, SY23 3FL UK
| | - Li Zhang
- NIAB EMR, New Road, East Malling, Kent, ME19 4BD UK
| | - Chunsong Bian
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
| | - Liping Jin
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
| | - Jiangang Liu
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
| |
Collapse
|
46
|
Detection of Helminthosporium Leaf Blotch Disease Based on UAV Imagery. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9030558] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Helminthosporium leaf blotch (HLB) is a serious disease of wheat, causing yield reductions globally. HLB is usually controlled by uniform chemical spraying, the approach adopted by most farmers. However, increased use of chemical controls has caused agronomic and environmental problems. To address these problems, an accurate site-specific spraying system must be applied, and disease detection over the whole field can provide the decision-support information for the spraying machines. The objective of this paper is to evaluate the potential of unmanned aerial vehicle (UAV) remote sensing for HLB detection. UAV imagery acquisition and ground investigation were conducted in Central China on April 22nd, 2017. Four disease categories (normal, light, medium, and heavy) were established based on severity. A convolutional neural network (CNN) was proposed for HLB disease classification, and experiments on data preprocessing, classification, and hyper-parameter tuning were conducted. The overall accuracy and standard error of the CNN method were 91.43% and 0.83%, outperforming the other methods in terms of accuracy and stability; for the detection of diseased samples in particular, the CNN significantly outperformed the others. Experimental results showed that HLB-infected and healthy areas can be precisely discriminated from UAV remote sensing data, indicating that UAV remote sensing can serve as an efficient tool for HLB disease detection.
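The abstract does not specify the network architecture, so the Keras sketch below is an assumption for illustration only: a small CNN classifying image patches into the four HLB severity classes. Patch size, layer widths, and hyper-parameters are all placeholder choices.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_hlb_classifier(input_shape=(64, 64, 3), n_classes=4):
    """Small illustrative CNN for normal/light/medium/heavy HLB classes."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),  # regularisation against overfitting on small field datasets
        layers.Dense(n_classes, activation="softmax"),
    ])

model = build_hlb_classifier()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_patches, train_labels, epochs=20, validation_split=0.2)
```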
Collapse
|
47
|
Roth L, Hund A, Aasen H. PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned aerial systems. PLANT METHODS 2018; 14:116. [PMID: 30598692 PMCID: PMC6302310 DOI: 10.1186/s13007-018-0376-6] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/17/2018] [Accepted: 11/30/2018] [Indexed: 05/19/2023]
Abstract
BACKGROUND Driven by huge improvements in automation, unmanned aerial systems (UAS) are increasingly used for field observations and high-throughput phenotyping. Today, the bottleneck no longer lies in the ability to fly a drone, but rather in the flight planning needed to capture images of sufficient quality. Proper flight preparation for photography with digital frame cameras should include concepts such as view, sharpness and exposure calculations. Additionally, when mapping areas with UASs, one has to consider ground control points (GCPs), viewing geometry and way-point flights. Unfortunately, none of the available flight planning tools covers all these aspects. RESULTS We give an overview of concepts related to flight preparation, present the newly developed open-source software PhenoFly Planning Tool, and evaluate other recent flight planning tools. We find that current flight planning and mapping tools focus strongly on vendor-specific solutions and mostly ignore basic photographic properties; our comparison shows, for example, that only two out of thirteen evaluated tools consider motion blur restrictions, and none of them depth-of-field limits. In contrast, PhenoFly Planning Tool complements recent sophisticated UAS and autopilot systems with an optical remote sensing workflow that respects photographic concepts. The tool can assist in selecting the right equipment, experimenting with different flight settings to test the performance of the resulting imagery, preparing the field and GCP setup, and generating a flight path that can be exported as waypoints and uploaded to a UAS. CONCLUSION By considering the introduced concepts, uncertainty in UAS-based remote sensing and high-throughput phenotyping may be considerably reduced. The presented software PhenoFly Planning Tool (https://shiny.usys.ethz.ch/PhenoFlyPlanningTool) helps users to comprehend and apply these concepts.
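To make the motion-blur point concrete, here is a back-of-the-envelope check in the spirit of the tool, using the standard ground sampling distance relation and exposure-time blur. The numeric values and the half-pixel blur budget are illustrative assumptions, not PhenoFly's defaults.

```python
def gsd_m(altitude_m, focal_length_mm, pixel_pitch_um):
    """GSD = flying height * pixel pitch / focal length (all in metres)."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

def motion_blur_m(speed_m_s, exposure_s):
    """Ground distance travelled by the platform during the exposure."""
    return speed_m_s * exposure_s

gsd = gsd_m(altitude_m=50, focal_length_mm=8, pixel_pitch_um=3.75)   # ~2.3 cm/px
blur = motion_blur_m(speed_m_s=5.0, exposure_s=1 / 1000)             # 0.5 cm
print(f"GSD: {gsd * 100:.2f} cm/px, motion blur: {blur * 100:.2f} cm")
if blur > 0.5 * gsd:  # assumed half-pixel blur budget
    print("Blur exceeds budget: fly slower or shorten the exposure.")
```

The same trade-off runs through the whole tool: shortening the exposure to limit blur forces a wider aperture or higher ISO, which in turn affects depth of field and noise.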
Collapse
Affiliation(s)
- Lukas Roth
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
| | - Andreas Hund
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
| | - Helge Aasen
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
| |
Collapse
|
48
|
Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV. REMOTE SENSING 2018. [DOI: 10.3390/rs10122007] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Monitoring the development of vegetation height through time provides a key indicator of crop health and overall condition. Traditional manual approaches for monitoring crop height are generally time consuming, labor intensive and impractical for large-scale operations. Dynamic crop heights collected through the season allow within-field problems to be identified at critical stages of the growth cycle, providing a mechanism for remedial action against end-of-season yield losses. With advances in unmanned aerial vehicle (UAV) technologies, routine monitoring of height is now feasible at any time throughout the growth cycle. To demonstrate this capability, five digital surface models (DSMs) were reconstructed from high-resolution RGB imagery collected over a maize field during a single growing season. The UAV retrievals were compared against LiDAR scans to evaluate the derived point clouds' capacity to capture ground surface variability and spatially variable crop height. Structure-from-motion (SfM) derived heights correlated strongly with LiDAR scan data in pixel-to-pixel comparisons of the intra-season bare-ground surface (R2 = 0.77-0.99, rRMSE = 0.44-0.85%), with reasonable agreement for canopy comparisons (R2 = 0.57-0.65, rRMSE = 37-50%). To examine the effect of resolution on retrieval accuracy and processing time, several ground sampling distances (GSDs) were also evaluated. Our results indicate that a 10 cm resolution retrieval delivers a reliable product that balances computational cost against spatial fidelity. Overall, the UAV retrievals accurately reproduced the observed spatial variability of crop heights within the maize field through the growing season and provide a valuable source of information for precision agricultural management in an operational context.
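A hedged sketch of the height-retrieval step: per-date canopy height is the SfM digital surface model minus a bare-ground terrain model, here assumed to come from an early-season flight before emergence. Array names are illustrative; in practice the rasters would be co-registered GeoTIFFs.

```python
import numpy as np

def canopy_height(dsm, bare_ground_dtm):
    """CHM = DSM - DTM, clipped at zero so soil noise never yields negative height."""
    return np.clip(np.asarray(dsm, float) - np.asarray(bare_ground_dtm, float),
                   0.0, None)

def rrmse(pred, ref):
    """Relative RMSE (%) of the kind reported for the LiDAR comparison."""
    pred, ref = np.asarray(pred, float), np.asarray(ref, float)
    rmse = np.sqrt(np.mean((pred - ref) ** 2))
    return 100.0 * rmse / np.mean(ref)

# chm_by_date = {date: canopy_height(dsm, dtm0) for date, dsm in dsm_series.items()}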
Collapse
|
49
|
Incorporating Surface Elevation Information in UAV Multispectral Images for Mapping Weed Patches. J Imaging 2018. [DOI: 10.3390/jimaging4110132] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
Accurate mapping of weed distribution within a field is a first step towards effective weed management. The aim of this work was to improve the mapping of milk thistle (Silybum marianum) weed patches in unmanned aerial vehicle (UAV) images using auxiliary layers of information, such as spatial texture and vegetation height estimated from the UAV digital surface model. UAV multispectral images acquired in the visible and near-infrared parts of the spectrum were the main source of data, together with texture estimated for the image bands using a local variance filter. The digital surface model was created with structure-from-motion algorithms from the UAV image stereopairs. From this layer, the terrain elevation was estimated using a focal minimum filter followed by a low-pass filter, and the plant height was computed by subtracting the terrain elevation from the digital surface model. Three classification algorithms (maximum likelihood, minimum distance and an object-based image classifier) were used to discriminate S. marianum from other vegetation using various combinations of inputs: image bands, texture and plant height. The resulting weed distribution maps were evaluated for accuracy against field-surveyed data. Both texture and plant height improved the classification of S. marianum, raising the overall accuracy from 70% to 87% in 2015 and from 82% to 95% in 2016. Since texture is easier to compute than plant height from a digital surface model, it may be the preferable input for future weed mapping applications.
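The two auxiliary layers described above translate into a few lines of standard image filtering. The sketch below assumes float raster arrays; the window sizes are illustrative, not the paper's values.

```python
import numpy as np
from scipy.ndimage import minimum_filter, uniform_filter

def plant_height(dsm, win=51):
    """Terrain = low-pass-filtered focal minimum of the DSM; height = DSM - terrain."""
    dsm = np.asarray(dsm, float)
    terrain = uniform_filter(minimum_filter(dsm, size=win), size=win)
    return np.clip(dsm - terrain, 0.0, None)

def local_variance(band, win=5):
    """Texture as local variance: E[x^2] - E[x]^2 over a moving window."""
    band = np.asarray(band, float)
    mean = uniform_filter(band, size=win)
    mean_sq = uniform_filter(band ** 2, size=win)
    return np.clip(mean_sq - mean ** 2, 0.0, None)  # clip tiny negative float error

# features = np.dstack([*bands,
#                       *(local_variance(b) for b in bands),
#                       plant_height(dsm)])
```

The focal-minimum window must be wider than the largest weed patch, so that at least one bare-ground pixel falls inside every window; otherwise vegetation leaks into the terrain estimate.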
Collapse
|
50
|
Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. REMOTE SENSING 2018. [DOI: 10.3390/rs10111690] [Citation(s) in RCA: 49] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
In recent years, weeds have been responsible for most agricultural yield losses. To deal with this threat, farmers resort to spraying their fields uniformly with herbicides. This method not only requires huge quantities of herbicides but also harms the environment and human health. One way to reduce the cost and environmental impact is to allocate the right doses of herbicide to the right place at the right time (precision agriculture). Unmanned aerial vehicles (UAVs) are becoming an attractive acquisition system for weed localization and management thanks to their ability to image an entire agricultural field at very high spatial resolution and low cost. However, despite significant advances in UAV acquisition systems, automatic weed detection remains challenging because of the strong similarity between weeds and crops. Deep learning has recently shown impressive results on complex classification problems, but it requires substantial training data, and creating large agricultural datasets with pixel-level expert annotations is extremely time consuming. In this paper, we propose a novel, fully automatic learning method using convolutional neural networks (CNNs) with unsupervised training dataset collection for weed detection in UAV images. The proposed method comprises three main phases. First, we automatically detect the crop rows and use them to identify inter-row weeds. In the second phase, these inter-row weeds are used to constitute the training dataset. Finally, we train CNNs on this dataset to build a model able to detect crops and weeds in the images. The results obtained are comparable to those of traditional supervised data labeling, with accuracy differences of 1.5% in the spinach field and 6% in the bean field.
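A simplified sketch of the unsupervised labelling idea: find crop rows from the column-wise vegetation profile (assuming the image has been rotated so rows run parallel to the columns), then treat vegetation between the rows as weed pseudo-labels. Note the paper itself uses line detection on the imagery; the peak-finding shortcut and all parameter values here are assumptions for brevity.

```python
import numpy as np
from scipy.signal import find_peaks

def inter_row_weed_mask(veg_mask, row_half_width=10):
    """Label vegetation lying outside detected crop-row bands as weed pseudo-labels."""
    profile = veg_mask.sum(axis=0)                       # vegetation count per column
    rows, _ = find_peaks(profile,
                         distance=3 * row_half_width,    # assumed minimum row spacing
                         height=0.3 * profile.max())     # assumed peak prominence
    in_row = np.zeros(veg_mask.shape[1], dtype=bool)
    for c in rows:
        in_row[max(0, c - row_half_width): c + row_half_width + 1] = True
    return veg_mask & ~in_row[np.newaxis, :]             # vegetation outside rows

# weed_px = inter_row_weed_mask(exg_map > exg_threshold)  # seeds the CNN training set
```

These pseudo-labels are noisy by construction (weeds growing inside the rows are missed), which is exactly why the reported accuracies trail supervised labelling by a small margin.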
Collapse
|