1
Seiche AT, Wittstruck L, Jarmer T. Weed Detection from Unmanned Aerial Vehicle Imagery Using Deep Learning: A Comparison between High-End and Low-Cost Multispectral Sensors. Sensors (Basel, Switzerland) 2024; 24:1544. [PMID: 38475081] [DOI: 10.3390/s24051544] [Received: 01/30/2024] [Revised: 02/13/2024] [Accepted: 02/14/2024] [Indexed: 03/14/2024]
Abstract
In order to meet the increasing demand for crops under challenging climate conditions, efficient and sustainable cultivation strategies are becoming essential in agriculture. Targeted herbicide use reduces environmental pollution and effectively controls weeds as a major cause of yield reduction. The key requirement is a reliable weed detection system that is accessible to a wide range of end users. This research paper introduces a self-built, low-cost, multispectral camera system and evaluates it against the high-end MicaSense Altum system. Pixel-based weed and crop classification was performed on UAV datasets collected with both sensors in maize using a U-Net. The training and testing data were generated via an index-based thresholding approach followed by annotation. As a result, the F1-score for the weed class reached 82% on the Altum system and 76% on the low-cost system, with recall values of 75% and 68%, respectively. Misclassifications occurred on the low-cost system images for small weeds and overlaps, with minor oversegmentation. However, with a precision of 90%, the results show great potential for application in automated weed control. The proposed system thereby enables sustainable precision farming for the general public. In future research, its spectral properties, as well as its use on different crops with real-time on-board processing, should be further investigated.
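The index-based thresholding used here to generate training masks can be sketched as follows. This is an illustrative NDVI-threshold example, not the authors' code; the 0.4 threshold and the toy reflectance values are assumptions.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def vegetation_mask(nir, red, threshold=0.4):
    """Binary vegetation mask via index-based thresholding.

    Pixels whose NDVI exceeds `threshold` are flagged as vegetation;
    the threshold value is illustrative, not taken from the paper.
    """
    return ndvi(nir, red) > threshold

# Toy 2x2 scene: one vegetated pixel (high NIR), three soil pixels.
nir = np.array([[0.8, 0.2], [0.25, 0.3]])
red = np.array([[0.1, 0.2], [0.22, 0.28]])
mask = vegetation_mask(nir, red)
```

A mask like this would then be refined by manual annotation into weed and crop classes before training the U-Net.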
Affiliation(s)
- Anna Teresa Seiche
- Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany
- Lucas Wittstruck
- Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany
- Thomas Jarmer
- Institute of Computer Science, Osnabrück University, 49090 Osnabrück, Germany
2
Xu B, Meng R, Chen G, Liang L, Lv Z, Zhou L, Sun R, Zhao F, Yang W. Improved weed mapping in corn fields by combining UAV-based spectral, textural, structural, and thermal measurements. Pest Management Science 2023; 79:2591-2602. [PMID: 36883563] [DOI: 10.1002/ps.7443] [Received: 11/27/2022] [Revised: 01/20/2023] [Accepted: 03/08/2023] [Indexed: 06/02/2023]
Abstract
BACKGROUND Spatially explicit weed information is critical for controlling weed infestation and reducing corn yield losses. The development of unmanned aerial vehicle (UAV)-based remote sensing presents an unprecedented opportunity for efficient, timely weed mapping. Spectral, textural, and structural measurements have been used for weed mapping, whereas thermal measurements, for example canopy temperature (CT), have seldom been considered or used. In this study, we quantified the optimal combination of spectral, textural, structural, and CT measurements based on different machine-learning algorithms for weed mapping. RESULTS CT improved weed-mapping accuracies as complementary information for spectral, textural, and structural features (up to 5% and 0.051 improvements in overall accuracy [OA] and Macro-F1, respectively). The fusion of textural, structural, and thermal features achieved the best performance in weed mapping (OA = 96.4%, Macro-F1 = 0.964), followed by the fusion of structural and thermal features (OA = 93.6%, Macro-F1 = 0.936). The Support Vector Machine-based model achieved the best performance in weed mapping, with improvements of 3.5% and 7.1% in OA and of 0.036 and 0.071 in Macro-F1, respectively, compared with the best Random Forest and Naïve Bayes Classifier models. CONCLUSION Thermal measurement can complement other types of remote-sensing measurements and improve weed-mapping accuracy within the data-fusion framework. Importantly, integrating textural, structural, and thermal features achieved the best performance for weed mapping. Our study provides a novel method for weed mapping using UAV-based multisource remote-sensing measurements, which is critical for ensuring crop production in precision agriculture. © 2023 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
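The feature-fusion idea behind this study, stacking per-pixel spectral, textural, structural, and thermal features and classifying them with an RBF-kernel SVM, can be sketched as below. The feature vectors and class separations are synthetic placeholders, not the paper's data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# Hypothetical per-pixel features: [spectral, textural, structural, thermal (deg C)].
crop = rng.normal([0.6, 0.3, 0.5, 24.0], 0.05, size=(n, 4))
weed = rng.normal([0.4, 0.6, 0.2, 27.0], 0.05, size=(n, 4))
X = np.vstack([crop, weed])
y = np.array([0] * n + [1] * n)          # 0 = crop, 1 = weed

# Scaling matters because the thermal feature lives on a different range.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
acc = clf.score(X, y)
```

In practice the features would come from co-registered UAV rasters, and accuracy would be assessed on held-out pixels rather than the training set.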
Affiliation(s)
- Binyuan Xu
- College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Ran Meng
- College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- HIT Institute for Artificial Intelligence Co. Ltd, Harbin, China
- Gengshen Chen
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), Hubei Hongshan Laboratory, Huazhong Agricultural University, Wuhan, China
- Linlin Liang
- Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, China
- Zhengang Lv
- College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Longfei Zhou
- College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Rui Sun
- College of Resources and Environment, Huazhong Agricultural University, Wuhan, China
- Feng Zhao
- Key Laboratory of Geographical Process Analysis & Simulation of Hubei Province, College of Urban and Environmental Sciences, Central China Normal University, Wuhan, China
- Wanneng Yang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research (Wuhan), Hubei Hongshan Laboratory, Huazhong Agricultural University, Wuhan, China
3
Sapkota R, Stenger J, Ostlie M, Flores P. Towards reducing chemical usage for weed control in agriculture using UAS imagery analysis and computer vision techniques. Sci Rep 2023; 13:6548. [PMID: 37085558] [PMCID: PMC10121711] [DOI: 10.1038/s41598-023-33042-0] [Received: 10/08/2022] [Accepted: 04/06/2023] [Indexed: 04/23/2023] Open Access
Abstract
Currently, the most common method of controlling weeds in commercial agricultural production systems is to apply a uniform distribution of chemical herbicide through a sprayer without considering the spatial distribution of crops and weeds. This kind of weed management practice leads to excessive amounts of chemical herbicide being applied in a given field. The objective of this study was to perform site-specific weed control (SSWC) in a corn field by: (1) using an unmanned aerial system (UAS) to map the spatial distribution of weeds in the field; (2) creating a prescription map based on the weed distribution map; and (3) spraying the field using the prescription map and a commercial-size sprayer. In this study, we assumed that plants growing outside the corn rows are weeds that need to be controlled. The first step in implementing such an approach is identifying the corn rows. For that, we propose a Crop Row Identification algorithm, a computer vision algorithm that identifies corn rows in UAS imagery. After being identified, the corn rows were removed from the imagery and the remaining vegetation fraction was classified as weeds. Based on that information, a grid-based weed prescription map was created and the weed control application was implemented with a commercial-size sprayer. The decision to spray herbicide on a particular grid cell was based on the presence of weeds in that cell: all grid cells containing at least one weed were sprayed, while cells free of weeds were not. Using our SSWC approach, we were able to save 26.2% of the acreage from being sprayed with herbicide compared to the current method. This study presents a full workflow from UAS image collection to field weed control implementation using a commercial-size sprayer, and it shows that some level of savings can potentially be obtained even in a situation with high weed infestation, which might provide an opportunity to reduce chemical usage in corn production systems.
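The grid-based spray decision described above, spray a cell if and only if it contains at least one weed pixel, reduces to a block-wise "any" over a weed mask. A minimal sketch, with an assumed 4-pixel grid size that stands in for the sprayer-section footprint:

```python
import numpy as np

def prescription_map(weed_mask, grid=4):
    """Grid-based spray decision: a cell is sprayed iff it contains
    at least one weed pixel. `grid` is the cell size in pixels
    (illustrative; real cells correspond to sprayer sections)."""
    h, w = weed_mask.shape
    cells = weed_mask[:h - h % grid, :w - w % grid]
    cells = cells.reshape(h // grid, grid, w // grid, grid)
    return cells.any(axis=(1, 3))

weed_mask = np.zeros((8, 8), dtype=bool)
weed_mask[1, 1] = True             # a single weed in the top-left cell
spray = prescription_map(weed_mask, grid=4)
saved = 1.0 - spray.mean()         # fraction of cells not sprayed
```

In this toy field only one of four cells is sprayed, the same accounting that yields the study's 26.2% acreage saving at field scale.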
Affiliation(s)
- Ranjan Sapkota
- Center for Precision and Automated Agricultural Systems, Washington State University, 24106 N. Bunn Rd, Prosser, WA 99350, USA
- Agricultural and Biosystems Engineering, North Dakota State University, 1221 Albrecht Blvd, Fargo, ND 58102, USA
- John Stenger
- Agricultural and Biosystems Engineering, North Dakota State University, 1221 Albrecht Blvd, Fargo, ND 58102, USA
- Michael Ostlie
- NDSU Carrington Research Extension Center, Carrington, ND 58421-0219, USA
- Paulo Flores
- Agricultural and Biosystems Engineering, North Dakota State University, 1221 Albrecht Blvd, Fargo, ND 58102, USA
4
Zhang W, Miao Z, Li N, He C, Sun T. Review of Current Robotic Approaches for Precision Weed Management. Current Robotics Reports 2022; 3:139-151. [PMID: 35891887] [PMCID: PMC9305686] [DOI: 10.1007/s43154-022-00086-5] [Accepted: 07/04/2022] [Indexed: 11/30/2022]
Abstract
Purpose of Review The goal of this review is to provide an overview of current robotic approaches to precision weed management. This includes an investigation into applications within this field during the past 5 years, identifying which major technical areas currently preclude more widespread use, and which key topics will drive future development and utilisation. Recent Findings Studies combining computer vision with traditional machine learning and deep learning are driving progress in weed detection and robotic approaches to mechanical weeding. Integrating key technologies for perception, decision-making, and control, autonomous weeding robots are emerging quickly. These effectively save effort while reducing environmental pollution caused by pesticide use. Summary This review assesses different weed detection methods and weeder robots used in precision weed management and summarises the trends in this area in recent years. The limitations of current systems are discussed, and ideas for future research directions are proposed.
Affiliation(s)
- Wen Zhang
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Zhonghua Miao
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Nan Li
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Chuangxin He
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Teng Sun
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
5
Garibaldi-Márquez F, Flores G, Mercado-Ravell DA, Ramírez-Pedraza A, Valentín-Coronado LM. Weed Classification from Natural Corn Field-Multi-Plant Images Based on Shallow and Deep Learning. Sensors (Basel, Switzerland) 2022; 22:3021. [PMID: 35459006] [PMCID: PMC9032669] [DOI: 10.3390/s22083021] [Received: 03/30/2022] [Revised: 04/09/2022] [Accepted: 04/11/2022] [Indexed: 06/14/2023]
Abstract
Crop and weed discrimination in natural field environments is still challenging for the implementation of automatic agricultural practices such as weed control. Some weed control methods have been proposed; however, they remain restricted because they are implemented under controlled conditions. The development of a sound weed control system begins with recognizing the crop and the different weed plants present in the field. In this work, a classification approach for Zea mays L. (crop), narrow-leaf weeds (NLW), and broadleaf weeds (BLW) from multi-plant images is presented. Moreover, a large image dataset was generated, with images captured under natural field conditions, in different locations, and at different growing stages of the plants. Regions of interest (ROI) were extracted using connected component analysis (CCA), whereas the classification of ROIs was based on Convolutional Neural Networks (CNN) and compared with a shallow learning approach. To measure the classification performance of both methods, accuracy, precision, recall, and F1-score metrics were used. The best alternative for the weed classification task at early growth stages and in natural corn field environments was the CNN-based approach, as indicated by its 97% accuracy.
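The ROI-extraction step named here, connected component analysis on a binary vegetation mask, can be sketched with `scipy.ndimage`. The tiny mask below is invented for illustration; in the paper the components would be cropped and passed to the CNN classifier.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary vegetation mask; connected-component analysis (CCA)
# extracts each plant region (ROI) that a classifier would then label.
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
], dtype=int)

labels, n_rois = ndimage.label(mask)        # 4-connectivity by default
rois = ndimage.find_objects(labels)         # one bounding slice per ROI
sizes = [int((labels[s] == i + 1).sum()) for i, s in enumerate(rois)]
```

Each slice in `rois` is a bounding box that can be used to crop the original RGB image into a per-plant patch.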
Affiliation(s)
- Francisco Garibaldi-Márquez
- Centro de Investigaciones en Óptica A.C., Loma del Bosque 115, Leon 37150, Guanajuato, Mexico
- Instituto Nacional de Investigaciones Forestales, Agrícolas y Pecuarias, Campo Experimental Pabellón, Pabellon de Arteaga 20671, Aguascalientes, Mexico
- Gerardo Flores
- Centro de Investigaciones en Óptica A.C., Loma del Bosque 115, Leon 37150, Guanajuato, Mexico
- Diego A. Mercado-Ravell
- Centro de Investigación en Matemáticas A.C., Lasec y Andador Galileo Galilei, Quantum Ciudad del Conocimiento, Zacatecas 98160, Zacatecas, Mexico
- Consejo Nacional de Ciencia y Tecnología, Ciudad de Mexico 03940, Mexico
- Alfonso Ramírez-Pedraza
- Centro de Investigaciones en Óptica A.C., Loma del Bosque 115, Leon 37150, Guanajuato, Mexico
- Consejo Nacional de Ciencia y Tecnología, Ciudad de Mexico 03940, Mexico
- Luis M. Valentín-Coronado
- Centro de Investigaciones en Óptica A.C., Loma del Bosque 115, Leon 37150, Guanajuato, Mexico
- Consejo Nacional de Ciencia y Tecnología, Ciudad de Mexico 03940, Mexico
6
Benos L, Tagarakis AC, Dolias G, Berruto R, Kateris D, Bochtis D. Machine Learning in Agriculture: A Comprehensive Updated Review. Sensors (Basel, Switzerland) 2021; 21:3758. [PMID: 34071553] [PMCID: PMC8198852] [DOI: 10.3390/s21113758] [Received: 04/06/2021] [Revised: 05/21/2021] [Accepted: 05/24/2021] [Indexed: 01/05/2023]
Abstract
The digital transformation of agriculture has evolved various aspects of management into artificial intelligence systems with the aim of creating value from the ever-increasing data originating from numerous sources. A subset of artificial intelligence, namely machine learning, has considerable potential to handle the numerous challenges in establishing knowledge-based farming systems. The present study aims to shed light on machine learning in agriculture by thoroughly reviewing the recent scholarly literature based on keyword combinations of "machine learning" with "crop management", "water management", "soil management", and "livestock management", in accordance with PRISMA guidelines. Only journal papers published within 2018-2020 were considered eligible. The results indicated that this topic pertains to different disciplines that favour convergence research at the international level. Furthermore, crop management was observed to be at the centre of attention. A plethora of machine learning algorithms were used, with those belonging to Artificial Neural Networks proving the most efficient. In addition, maize and wheat as well as cattle and sheep were the most investigated crops and animals, respectively. Finally, a variety of sensors, mounted on satellites and unmanned ground and aerial vehicles, have been utilized as a means of acquiring reliable input data for the analyses. It is anticipated that this study will constitute a beneficial guide for all stakeholders towards enhancing awareness of the potential advantages of using machine learning in agriculture and contributing to more systematic research on this topic.
Affiliation(s)
- Lefteris Benos
- Centre of Research and Technology-Hellas (CERTH), Institute for Bio-Economy and Agri-Technology (IBO), 6th km Charilaou-Thermi Rd, GR 57001 Thessaloniki, Greece
- Aristotelis C. Tagarakis
- Centre of Research and Technology-Hellas (CERTH), Institute for Bio-Economy and Agri-Technology (IBO), 6th km Charilaou-Thermi Rd, GR 57001 Thessaloniki, Greece
- Georgios Dolias
- Centre of Research and Technology-Hellas (CERTH), Institute for Bio-Economy and Agri-Technology (IBO), 6th km Charilaou-Thermi Rd, GR 57001 Thessaloniki, Greece
- Remigio Berruto
- Department of Agriculture, Forestry and Food Science (DISAFA), University of Turin, Largo Braccini 2, 10095 Grugliasco, Italy
- Dimitrios Kateris
- Centre of Research and Technology-Hellas (CERTH), Institute for Bio-Economy and Agri-Technology (IBO), 6th km Charilaou-Thermi Rd, GR 57001 Thessaloniki, Greece
- Dionysis Bochtis
- Centre of Research and Technology-Hellas (CERTH), Institute for Bio-Economy and Agri-Technology (IBO), 6th km Charilaou-Thermi Rd, GR 57001 Thessaloniki, Greece
- FarmB Digital Agriculture P.C., Doiranis 17, GR 54639 Thessaloniki, Greece
7
Can Commercial Low-Cost Drones and Open-Source GIS Technologies Be Suitable for Semi-Automatic Weed Mapping for Smart Farming? A Case Study in NE Italy. Remote Sensing 2021. [DOI: 10.3390/rs13101869] [Indexed: 11/17/2022]
Abstract
Weed management is a crucial issue in agriculture, with both in-field and off-field environmental impacts. Within Agriculture 4.0, the adoption of UASs combined with spatially explicit approaches may drastically reduce herbicide doses, increasing sustainability in weed management. However, Agriculture 4.0 technologies are barely adopted in small- and medium-size farms. Small, low-cost UASs, together with open-source software packages, may represent a low-cost spatially explicit system for mapping weed distribution in crop fields. The general aim is to map weed distribution with a low-cost UAS and a replicable workflow based entirely on open GIS software and algorithms: OpenDroneMap, QGIS, SAGA, and OpenCV classification algorithms. The specific objectives are: (i) testing a low-cost UAS for weed mapping; (ii) assessing open-source packages for semi-automatic weed classification; and (iii) producing a sustainable management scenario via prescription maps. Results showed high performance along the whole process: in orthomosaic generation at very high spatial resolution (0.01 m/pixel), in weed detection (Matthews Correlation Coefficient: 0.67-0.74), and in the production of prescription maps, reducing herbicide treatment to only 3.47% of the entire field. This study demonstrates the feasibility of low-cost UASs combined with open-source software, enabling a spatially explicit approach to weed management in small- and medium-size farmlands.
8
Comparison of Sentinel-2 and UAV Multispectral Data for Use in Precision Agriculture: An Application from Northern Greece. Drones 2021. [DOI: 10.3390/drones5020035] [Indexed: 12/22/2022]
Abstract
The scope of this work is to compare Sentinel-2 and unmanned aerial vehicle (UAV) imagery from northern Greece for use in precision agriculture by implementing statistical analysis and 2D visualization. Surveys took place on five dates, with the difference between the sensing dates of the two techniques ranging from 1 to 4 days. Using the acquired images, we first computed maps of the Normalized Difference Vegetation Index (NDVI), then the values of this index for fifteen points and four polygons (areas). The UAV images were not resampled, the aim being to compare both techniques at their native standards, as they are used by farmers. Similarities between the two techniques are reflected in the trend of the NDVI means for both the satellite and UAV data, considering the points and the polygons. The differences lie in (a) the mean NDVI values of the points and (b) the range of the NDVI values of the polygons, probably because of the difference in the spatial resolution of the two techniques. The correlation coefficient of the NDVI values, considering both points and polygons, ranges between 83.5% and 98.26%. In conclusion, both techniques provide important information for precision agriculture, depending on the spatial extent, resolution, and cost, as well as the requirements of the survey.
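The point-wise comparison reported here amounts to correlating NDVI samples from the two platforms at matched locations. A minimal sketch with synthetic values (the UAV series is modelled as the satellite series plus small resolution-driven noise; the numbers are invented, not the study's data):

```python
import numpy as np

# Hypothetical NDVI samples at the same 15 points from both platforms.
rng = np.random.default_rng(1)
ndvi_s2 = rng.uniform(0.2, 0.8, 15)              # Sentinel-2 samples
ndvi_uav = ndvi_s2 + rng.normal(0.0, 0.02, 15)   # UAV samples with noise

# Pearson correlation between the two NDVI series.
r = np.corrcoef(ndvi_s2, ndvi_uav)[0, 1]
```

With real data, `ndvi_s2` and `ndvi_uav` would be sampled from the respective NDVI rasters at the surveyed points and polygons.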
9
RGB Image-Derived Indicators for Spatial Assessment of the Impact of Broadleaf Weeds on Wheat Biomass. Remote Sensing 2020. [DOI: 10.3390/rs12182982] [Indexed: 12/26/2022]
Abstract
In precision agriculture, the development of proximal imaging systems embedded in autonomous vehicles makes it possible to explore new weed management strategies for site-specific plant treatment. Accurate monitoring of weeds while tracking wheat growth requires indirect measurements of leaf area index (LAI) and above-ground dry matter biomass (BM) at early growth stages. This article explores the potential of RGB images to assess crop-weed competition in a wheat (Triticum aestivum L.) crop by generating two new indicators, the weed pressure (WP) and the local wheat biomass production (δBMc). The fractional vegetation cover (FVC) of the crop and the weeds was automatically determined from the images with an SVM-RBF classifier using bag-of-visual-words vectors as inputs, based on a new vegetation index called MetaIndex, defined as a vote of six indices widely used in the literature. Beyond a simple map of weed infestation, the map of WP describes the crop-weed competition. The map of δBMc, meanwhile, evaluates the local wheat above-ground biomass production and indicates potential stress. It is generated from the wheat FVC because the latter is highly correlated with LAI (r2 = 0.99) and BM (r2 = 0.93) obtained by destructive methods. By combining these two indicators, we aim to determine whether wheat stress originates from weeds or not. This approach opens up new perspectives for monitoring weeds and their competition during crop growth with non-destructive, proximal sensing technologies in the early stages of development.
10
Advanced Machine Learning in Point Spectroscopy, RGB- and Hyperspectral-Imaging for Automatic Discriminations of Crops and Weeds: A Review. Smart Cities 2020. [DOI: 10.3390/smartcities3030039] [Indexed: 12/14/2022]
Abstract
Crop productivity is readily reduced by competition from weeds, so it is particularly important to control weeds early to prevent yield losses. Limited herbicide choices and the increasing costs of weed management are threatening the profitability of crops. Smart agriculture can use intelligent technology to accurately measure the distribution of weeds in the field and perform weed control tasks in selected areas, which can not only improve the effectiveness of pesticides but also increase the economic benefits of agricultural products. The most important requirement for an automatic system that removes weeds within crop rows is reliable sensing technology that achieves accurate differentiation of weeds and crops at specific locations in the field. In recent years, there have been many significant achievements in the differentiation of crops and weeds, related to the development of rapid, non-destructive sensors as well as methods for analyzing the data obtained. This paper presents a review of the use of three sensing methods, spectroscopy, color imaging, and hyperspectral imaging, in the discrimination of crops and weeds. Several machine learning algorithms have been employed for data analysis, such as the convolutional neural network (CNN), artificial neural network (ANN), and support vector machine (SVM). Successful applications include weed detection in grain crops (such as maize, wheat, and soybean), vegetable crops (such as tomato, lettuce, and radish), and fiber crops (such as cotton) with unsupervised or supervised learning. This review gives a brief introduction to the proposed sensing and machine learning methods, then provides an overview of instructive examples of these techniques for weed/crop discrimination. The discussion describes the recent progress made in the development of automated technology for accurate plant identification, as well as the challenges and future prospects. It is believed that this review will be of great value to those who study automatic plant care in crops using intelligent technology.
11
Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery. AgriEngineering 2020. [DOI: 10.3390/agriengineering2020024] [Indexed: 12/20/2022]
Abstract
In recent years, Unmanned Aerial Systems (UAS) have emerged as an innovative technology for providing spatio-temporal information about weed species in crop fields, a critical input for any site-specific weed management program. A multi-rotor UAS (Phantom 4) equipped with an RGB sensor was used to collect imagery in three bands (red, green, and blue; 0.8 cm/pixel resolution) with the objectives of (a) mapping weeds in cotton and (b) determining the relationship between image-based weed coverage and ground-based weed densities. For weed mapping, three weed density levels (high, medium, and low) were established for a mix of weed species, with three replications. To determine weed densities through ground truthing, five quadrats (1 m × 1 m) were laid out in each plot. The aerial imagery was preprocessed and subjected to the Hough transform to delineate cotton rows. Following the separation of inter-row vegetation from crop rows, a multi-level classification coupled with machine learning algorithms was used to distinguish intra-row weeds from cotton. Overall accuracies of 89.16%, 85.83%, and 83.33% and kappa values of 0.84, 0.79, and 0.75 were achieved for detecting weed occurrence in high-, medium-, and low-density plots, respectively. Further, ground-truth weed density values were fairly well correlated (r2 = 0.80) with image-based weed coverage assessments. Among the weed species evaluated, Palmer amaranth (Amaranthus palmeri S. Watson) showed the highest correlation (r2 = 0.91), followed by red sprangletop (Leptochloa mucronata Michx) (r2 = 0.88). The results highlight the utility of UAS-borne RGB imagery for weed mapping and density estimation in cotton for precision weed management.
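Row delineation of the kind used here can be sketched in its simplest form: for rows at a known (here, assumed vertical) orientation, the Hough transform degenerates to a 1-D column projection of the vegetation mask. This is an illustrative simplification, not the paper's pipeline, and the vote threshold is an assumption.

```python
import numpy as np

def detect_rows(veg_mask, min_votes=None):
    """Locate (assumed vertical) crop rows in a binary vegetation mask
    via a column projection, a degenerate case of the Hough transform
    for lines at a fixed orientation. Returns row-centre column indices."""
    votes = veg_mask.sum(axis=0)
    if min_votes is None:
        min_votes = veg_mask.shape[0] // 2   # assumed vote threshold
    cols = np.flatnonzero(votes >= min_votes)
    if cols.size == 0:
        return []
    # Collapse runs of adjacent columns into one row centre each.
    splits = np.flatnonzero(np.diff(cols) > 1) + 1
    return [int(run.mean()) for run in np.split(cols, splits)]

# Synthetic field: two vertical rows at columns 3 and 10, plus one weed pixel.
mask = np.zeros((20, 16), dtype=bool)
mask[:, 3] = True
mask[:, 10] = True
mask[5, 7] = True    # inter-row weed, below the vote threshold
rows = detect_rows(mask)
```

Pixels far from the detected row columns would then be treated as inter-row vegetation, i.e. candidate weeds.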
12
Sudars K, Jasko J, Namatevs I, Ozola L, Badaukis N. Dataset of annotated food crops and weed images for robotic computer vision control. Data Brief 2020; 31:105833. [PMID: 32577458] [PMCID: PMC7305380] [DOI: 10.1016/j.dib.2020.105833] [Received: 04/01/2020] [Revised: 06/03/2020] [Accepted: 06/03/2020] [Indexed: 12/02/2022] Open Access
Abstract
Weed management technologies that can identify weeds and distinguish them from crops need artificial intelligence solutions based on computer vision to enable the development of precisely targeted, autonomous robotic weed management systems. A prerequisite of such systems is robust and reliable object detection that can unambiguously distinguish weeds from food crops. An essential step towards precision agriculture is using annotated images to train convolutional neural networks to distinguish weeds from food crops, which can later be followed by mechanical weed removal or selective spraying of herbicides. In this data paper, we propose an open-access dataset with manually annotated images for weed detection. The dataset is composed of 1118 images in which 6 food crops and 8 weed species are identified, with 7853 annotations in total. Three RGB digital cameras were used for image capture: Intel RealSense D435, Canon EOS 800D, and Sony W800. The images were taken of food crops and weeds grown under controlled-environment and field conditions at different growth stages.
Affiliation(s)
- Kaspars Sudars
- Institute of Electronics and Computer Science, Dzērbenes str. 14, Riga LV-1006, Latvia
- Janis Jasko
- Institute for Plant Protection Research 'Agrihorts', Latvia University of Life Sciences and Technologies, P. Lejiņa str. 2, LV-3004 Jelgava, Latvia
- Ivars Namatevs
- Institute of Electronics and Computer Science, Dzērbenes str. 14, Riga LV-1006, Latvia
- Liva Ozola
- Institute of Electronics and Computer Science, Dzērbenes str. 14, Riga LV-1006, Latvia
- Niks Badaukis
- Institute for Plant Protection Research 'Agrihorts', Latvia University of Life Sciences and Technologies, P. Lejiņa str. 2, LV-3004 Jelgava, Latvia
13
UAV Detection of Sinapis arvensis Infestation in Alfalfa Plots Using Simple Vegetation Indices from Conventional Digital Cameras. AgriEngineering 2020. [DOI: 10.3390/agriengineering2020012] [Indexed: 11/16/2022]
Abstract
Unmanned Aerial Vehicles (UAVs) offer excellent survey capabilities at low cost to provide farmers with information about the type and distribution of weeds in their fields. In this study, the problem of detecting the infestation of a typical weed (charlock mustard) in an alfalfa crop has been addressed using conventional digital cameras installed on a lightweight UAV to compare RGB-based indices with the widely used Normalized Difference Vegetation Index (NDVI) index. The simple (R−B)/(R+B) and (R−B)/(R+B+G) vegetation indices allowed one to easily discern the yellow weed from the green crop. Moreover, they avoided the potential confusion of weeds with soil observed for the NDVI index. The small overestimation detected in the weed identification when the RGB indices were used could be easily reduced by using them in conjunction with NDVI. The proposed methodology may be used in the generation of weed cover maps for alfalfa, which may then be translated into site-specific herbicide treatment maps.
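The two RGB-only indices named in this abstract are simple per-pixel arithmetic. A minimal sketch, with invented reflectance values for a yellow flower and a green leaf used only to show the separation:

```python
import numpy as np

def rb_index(r, g, b, include_green=False, eps=1e-9):
    """RGB-only indices from the study: (R-B)/(R+B), and
    (R-B)/(R+B+G) when `include_green` is set. High values flag
    the yellow weed (strong red, weak blue); green crop and soil
    score lower."""
    r, g, b = (np.asarray(c, dtype=float) for c in (r, g, b))
    denom = r + b + (np.asarray(g, dtype=float) if include_green else 0.0) + eps
    return (r - b) / denom

# Toy pixels (invented): yellow flower (high R, low B) vs green crop leaf.
yellow = float(rb_index(0.9, 0.8, 0.1))
green = float(rb_index(0.2, 0.6, 0.2))
```

Thresholding such an index over the orthomosaic would yield the weed cover map that the study then combines with NDVI to suppress false positives from soil.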
14
Integration of Remote Sensing and GIS to Extract Plantation Rows from A Drone-Based Image Point Cloud Digital Surface Model. ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION 2020. [DOI: 10.3390/ijgi9030151] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Automated feature extraction from drone-based image point clouds (DIPC) is of paramount importance in precision agriculture (PA). PA relies on mechanized row seeding to attain maximum yield and best management practices. Automated plantation-row extraction is therefore essential for crop harvesting, pest management, and plant growth-rate prediction. Most existing research relies on red, green, and blue (RGB) image-based solutions to extract plantation rows from test sites with minimal background noise; DIPC-based DSM row-extraction solutions have rarely been tested. In this work, an automated method is designed to extract plantation rows from a DIPC-based DSM. The chosen plantation compartments exhibit three different levels of background noise in the UAV images, so the methodology was tested under varying background noise. The extraction results were quantified in terms of completeness, correctness, quality, and F1-score. The case study revealed the potential of the DIPC-based solution, with F1-scores of 0.94 for a compartment with minimal background noise, 0.91 for a highly noised compartment, and 0.85 for a compartment where the DIPC was compromised. The evaluation suggests that DSM-based solutions are robust compared with RGB image-based solutions for extracting plantation rows. Additionally, DSM-based solutions can be extended to assess plantation-row surface deformation caused by humans and machines, redefining the state of the art.
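The four evaluation metrics named in this abstract can be derived from true positive, false positive, and false negative counts. A small sketch using the standard definitions (the abstract does not spell them out, so these formulas are an assumption):

```python
def row_extraction_metrics(tp, fp, fn):
    """Standard completeness/correctness/quality/F1 from match counts."""
    completeness = tp / (tp + fn)   # recall analogue
    correctness = tp / (tp + fp)    # precision analogue
    quality = tp / (tp + fp + fn)
    f1 = 2 * completeness * correctness / (completeness + correctness)
    return completeness, correctness, quality, f1
```

Note that quality is always the strictest of the four, since both error types appear in its denominator.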
15
Sentinel-2 Validation for Spatial Variability Assessment in Overhead Trellis System Viticulture Versus UAV and Agronomic Data. REMOTE SENSING 2019. [DOI: 10.3390/rs11212573] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
Abstract
Several remote sensing technologies have been tested in precision viticulture to characterize vineyard spatial variability, from traditional aircraft and satellite platforms to recent unmanned aerial vehicles (UAVs). Imagery processing is still a challenge due to the traditional row-based architecture, where the inter-row soil produces a high to full presence of mixed pixels. In this case, UAV images combined with filtering techniques represent the solution for analyzing pure canopy pixels and were used to benchmark the effectiveness of Sentinel-2 (S2) performance in overhead training systems. At harvest time, UAV filtered and unfiltered images and ground sampling data were used to validate the correlation between the S2 normalized difference vegetation indices (NDVIs) and vegetative and productive parameters in two vineyards (V1 and V2). In the UAV vs. S2 NDVI comparison, satellite data in both vineyards showed a high correlation with both unfiltered and filtered UAV images (mean R2 = 0.80 for V1 and 0.60 for V2). The correlations between ground data and the NDVIs of both remote sensing platforms were strong for yield and biomass in both vineyards (R2 from 0.60 to 0.95). These results demonstrate the effectiveness of the spatial resolution provided by S2 for overhead trellis system viticulture, promoting precision viticulture also within areas currently managed without the support of innovative technologies.
16
Comparison of Satellite and UAV-Based Multispectral Imagery for Vineyard Variability Assessment. REMOTE SENSING 2019. [DOI: 10.3390/rs11040436] [Citation(s) in RCA: 98] [Impact Index Per Article: 19.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
In agriculture, remotely sensed data play a crucial role in providing valuable information on crop and soil status to perform effective management. Several spectral indices have proven to be valuable tools in describing crop spatial and temporal variability. In this paper, a detailed analysis and comparison of vineyard multispectral imagery, provided by decametric resolution satellite and low altitude Unmanned Aerial Vehicle (UAV) platforms, is presented. The effectiveness of Sentinel-2 imagery and of high-resolution UAV aerial images was evaluated by considering the well-known relation between the Normalised Difference Vegetation Index (NDVI) and crop vigour. After being pre-processed, the data from the UAV were compared with the satellite imagery by computing three different NDVI indices to properly analyse the unbundled spectral contribution of the different elements in the vineyard environment, considering: (i) the whole cropland surface; (ii) only the vine canopies; and (iii) only the inter-row terrain. The results show that the raw decametric-resolution satellite imagery could not be directly used to reliably describe vineyard variability. Indeed, the contribution of inter-row surfaces to the remotely sensed dataset may affect the NDVI computation, leading to biased crop descriptors. On the contrary, vigour maps computed from the UAV imagery, considering only the pixels representing crop canopies, proved to be more closely related to the in-field assessment than the satellite imagery. The proposed method may be extended to other crop typologies grown in rows or without intensive layout, where crop canopies do not extend to the whole surface or where the presence of weeds is significant.
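The canopy-only NDVI variant described in this abstract can be sketched with a boolean canopy mask (an illustrative simplification; the paper's actual filtering technique is not reproduced here):

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index per pixel."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)  # small guard against 0/0

def masked_mean_ndvi(nir, red, canopy_mask):
    """Mean NDVI over canopy pixels only, excluding inter-row terrain."""
    return float(ndvi(nir, red)[canopy_mask].mean())
```

Comparing the masked mean against the whole-surface mean makes the inter-row bias discussed above directly visible.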
17
Olsen A, Konovalov DA, Philippa B, Ridd P, Wood JC, Johns J, Banks W, Girgenti B, Kenny O, Whinney J, Calvert B, Azghadi MR, White RD. DeepWeeds: A Multiclass Weed Species Image Dataset for Deep Learning. Sci Rep 2019; 9:2058. [PMID: 30765729 PMCID: PMC6375952 DOI: 10.1038/s41598-018-38343-3] [Citation(s) in RCA: 64] [Impact Index Per Article: 12.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/21/2018] [Accepted: 12/18/2018] [Indexed: 11/13/2022] Open
Abstract
Robotic weed control has seen increased research of late, owing to its potential for boosting productivity in agriculture. The majority of works focus on developing robotics for croplands, ignoring the weed management problems facing rangeland stock farmers. Perhaps the greatest obstacle to the widespread uptake of robotic weed control is the robust classification of weed species in their natural environment. The unparalleled successes of deep learning make it an ideal candidate for recognising various weed species in the complex rangeland environment. This work contributes the first large, public, multiclass image dataset of weed species from the Australian rangelands, allowing for the development of robust classification methods to make robotic weed control viable. The DeepWeeds dataset consists of 17,509 labelled images of eight nationally significant weed species native to eight locations across northern Australia. This paper presents a baseline for classification performance on the dataset using the benchmark deep learning models Inception-v3 and ResNet-50, which achieved average classification accuracies of 95.1% and 95.7%, respectively. We also demonstrate the real-time performance of the ResNet-50 architecture, with an average inference time of 53.4 ms per image. These strong results bode well for future field implementation of robotic weed control methods in the Australian rangelands.
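A per-image average inference time such as the 53.4 ms reported here is typically measured by timing repeated forward passes. A hedged sketch with a placeholder model function (hypothetical, not the paper's benchmarking code):

```python
import time

def mean_inference_ms(model_fn, images):
    """Average wall-clock milliseconds per image for an inference function."""
    start = time.perf_counter()
    for image in images:
        model_fn(image)  # placeholder for the real forward pass
    return (time.perf_counter() - start) * 1000.0 / len(images)
```

In practice a few warm-up passes are run first, since the initial inferences on a GPU include one-off setup costs.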
Affiliation(s)
- Alex Olsen
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Dmitry A Konovalov
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Bronson Philippa
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Peter Ridd
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Jake C Wood
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Jamie Johns
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Wesley Banks
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Benjamin Girgenti
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Owen Kenny
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- James Whinney
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Brendan Calvert
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Mostafa Rahimi Azghadi
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
- Ronald D White
- College of Science and Engineering, James Cook University, Townsville, QLD, 4811, Australia
18
Ghosal S, Zheng B, Chapman SC, Potgieter AB, Jordan DR, Wang X, Singh AK, Singh A, Hirafuji M, Ninomiya S, Ganapathysubramanian B, Sarkar S, Guo W. A Weakly Supervised Deep Learning Framework for Sorghum Head Detection and Counting. PLANT PHENOMICS (WASHINGTON, D.C.) 2019; 2019:1525874. [PMID: 33313521 PMCID: PMC7706102 DOI: 10.34133/2019/1525874] [Citation(s) in RCA: 60] [Impact Index Per Article: 12.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/26/2018] [Accepted: 05/30/2019] [Indexed: 05/19/2023]
Abstract
The yield of cereal crops such as sorghum (Sorghum bicolor L. Moench) depends on the distribution of crop-heads in varying branching arrangements. Therefore, counting the head number per unit area is critical for plant breeders to correlate with the genotypic variation in a specific breeding field. However, measuring such phenotypic traits manually is an extremely labor-intensive process that suffers from low efficiency and human error; moreover, it is almost infeasible for large-scale breeding plantations or experiments. Machine learning-based approaches, such as deep convolutional neural network (CNN) object detectors, are promising tools for efficient object detection and counting. However, a significant limitation of such deep learning-based approaches is that they typically require a massive amount of hand-labeled images for training, which is still a tedious process. Here, we propose an active-learning-inspired weakly supervised deep learning framework for sorghum head detection and counting from UAV-based images. We demonstrate that it is possible to significantly reduce human labeling effort without compromising final model performance (R2 between human count and machine count = 0.88) by using a semitrained CNN model (i.e., trained with limited labeled data) to perform synthetic annotation. In addition, we also visualize key features that the network learns. This improves trustworthiness by enabling users to better understand and trust the decisions that the trained deep learning model makes.
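The R2 agreement between human and machine counts can be computed as a coefficient of determination. The sketch below uses that definition, though the paper could equally be reporting a squared Pearson correlation (the abstract does not say which):

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination between manual and machine head counts."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot
```

Unlike squared correlation, this definition can go negative when predictions are worse than the mean count, which makes it the stricter of the two measures.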
Affiliation(s)
- Sambuddha Ghosal
- Department of Mechanical Engineering, Iowa State University, Ames, IA, USA
- Department of Computer Science, Iowa State University, Ames, IA, USA
- Bangyou Zheng
- CSIRO Agriculture and Food, St. Lucia, QLD, Australia
- Scott C. Chapman
- CSIRO Agriculture and Food, St. Lucia, QLD, Australia
- School of Agriculture and Food Sciences, The University of Queensland, Gatton, QLD 4343, Australia
- Andries B. Potgieter
- Queensland Alliance for Agriculture and Food Innovation (QAAFI), The University of Queensland, Gatton, QLD, Australia
- David R. Jordan
- Queensland Alliance for Agriculture and Food Innovation (QAAFI), The University of Queensland, Warwick, QLD, Australia
- Xuemin Wang
- Queensland Alliance for Agriculture and Food Innovation (QAAFI), The University of Queensland, Warwick, QLD, Australia
- Arti Singh
- Department of Agronomy, Iowa State University, Ames, IA, USA
- Masayuki Hirafuji
- International Field Phenomics Research Laboratory, Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- Seishi Ninomiya
- International Field Phenomics Research Laboratory, Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- Soumik Sarkar
- Department of Mechanical Engineering, Iowa State University, Ames, IA, USA
- Wei Guo
- International Field Phenomics Research Laboratory, Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
19
Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. REMOTE SENSING 2018. [DOI: 10.3390/rs10111690] [Citation(s) in RCA: 49] [Impact Index Per Article: 8.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
In recent years, weeds have been responsible for most agricultural yield losses. To deal with this threat, farmers resort to spraying their fields uniformly with herbicides. This method not only requires huge quantities of herbicides but also impacts the environment and human health. One way to reduce the cost and environmental impact is to allocate the right doses of herbicide to the right place and at the right time (precision agriculture). Nowadays, unmanned aerial vehicles (UAVs) are becoming an interesting acquisition system for weed localization and management due to their ability to obtain images of the entire agricultural field with a very high spatial resolution and at a low cost. However, despite significant advances in UAV acquisition systems, the automatic detection of weeds remains a challenging problem because of their strong similarity to the crops. Recently, deep learning approaches have shown impressive results in various complex classification problems. However, these approaches need a certain amount of training data, and creating large agricultural datasets with pixel-level annotations by an expert is an extremely time-consuming task. In this paper, we propose a novel fully automatic learning method using convolutional neural networks (CNNs) with an unsupervised training dataset collection for weed detection from UAV images. The proposed method comprises three main phases. First, we automatically detect the crop rows and use them to identify the inter-row weeds. In the second phase, the inter-row weeds are used to constitute the training dataset. Finally, we train CNNs on this dataset to build a model able to detect the crop and the weeds in the images. The results obtained are comparable to those of traditional supervised training data labeling, with differences in accuracy of 1.5% in the spinach field and 6% in the bean field.
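The unsupervised labeling phase described in this abstract, in which vegetation between detected crop rows is treated as weed, can be sketched with boolean masks (illustrative only; the row-detection phase itself is not reproduced):

```python
import numpy as np

def label_interrow_weeds(vegetation_mask, row_mask):
    """Vegetation pixels falling outside the detected crop-row regions
    become weed-labelled samples for the training dataset."""
    return vegetation_mask & ~row_mask
```

The resulting mask supplies weed examples without any manual annotation, which is the key cost saving claimed by the method.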