1
Liu Y, Zeng F, Diao H, Zhu J, Ji D, Liao X, Zhao Z. YOLOv8 Model for Weed Detection in Wheat Fields Based on a Visual Converter and Multi-Scale Feature Fusion. Sensors (Basel) 2024; 24:4379. [PMID: 39001158] [PMCID: PMC11244458] [DOI: 10.3390/s24134379]
Abstract
Accurate weed detection is essential for precise weed control in wheat fields, but weeds and wheat occlude each other and vary widely in size, making weeds difficult to detect accurately. To achieve precise identification of weeds, wheat weed datasets were constructed and a wheat field weed detection model, YOLOv8-MBM, based on an improved YOLOv8s, was proposed. In this study, a lightweight vision transformer (MobileViTv3) was introduced into the C2f module to enhance detection accuracy by integrating input, local (CNN), and global (ViT) features. Secondly, a bidirectional feature pyramid network (BiFPN) was introduced to enhance multi-scale feature fusion. Furthermore, to address the weak generalization and slow convergence of the CIoU loss function on detection tasks, the MPDIoU bounding box regression loss was used in its place to speed up convergence and further improve detection performance. Finally, model performance was tested on the wheat weed datasets. The experiments show that the proposed YOLOv8-MBM outperforms Fast R-CNN, YOLOv3, YOLOv4-tiny, YOLOv5s, YOLOv7, YOLOv9, and other mainstream models in detection performance. The accuracy of the improved model reaches 92.7%. Compared with the original YOLOv8s model, precision, recall, mAP1, and mAP2 increase by 10.6%, 8.9%, 9.7%, and 9.3%, respectively. In summary, the YOLOv8-MBM model meets the requirements for accurate weed detection in wheat fields.
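The MPDIoU loss used in this paper penalizes the squared distances between corresponding top-left and bottom-right corners of the predicted and ground-truth boxes, normalized by the squared diagonal of the input image. A minimal sketch of that idea (not the authors' implementation; the box format and variable names are assumptions):

```python
def mpdiou_loss(pred, target, img_w, img_h):
    """MPDIoU loss for axis-aligned boxes in (x1, y1, x2, y2) format.

    MPDIoU = IoU - d1^2/d^2 - d2^2/d^2, where d1 and d2 are the
    distances between corresponding top-left and bottom-right corners
    and d^2 = img_w^2 + img_h^2. The loss is 1 - MPDIoU.
    """
    px1, py1, px2, py2 = pred
    tx1, ty1, tx2, ty2 = target

    # Intersection area of the two boxes
    iw = max(0.0, min(px2, tx2) - max(px1, tx1))
    ih = max(0.0, min(py2, ty2) - max(py1, ty1))
    inter = iw * ih

    # Union area and plain IoU
    union = ((px2 - px1) * (py2 - py1)
             + (tx2 - tx1) * (ty2 - ty1) - inter)
    iou = inter / union if union > 0 else 0.0

    # Corner-distance penalties, normalized by the image diagonal
    diag_sq = img_w ** 2 + img_h ** 2
    d1_sq = (px1 - tx1) ** 2 + (py1 - ty1) ** 2
    d2_sq = (px2 - tx2) ** 2 + (py2 - ty2) ** 2

    mpdiou = iou - d1_sq / diag_sq - d2_sq / diag_sq
    return 1.0 - mpdiou
```

Unlike plain IoU loss, the corner terms still give a useful gradient when the boxes do not overlap, which is what speeds up convergence.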
Affiliation(s)
- Yinzeng Liu
- Mechanical and Electronic Engineering College, Shandong Agriculture and Engineering University, Jinan 250100, China
- Fandi Zeng
- Mechanical and Electronic Engineering College, Shandong Agriculture and Engineering University, Jinan 250100, China
- Hongwei Diao
- Mechanical and Electronic Engineering College, Shandong Agriculture and Engineering University, Jinan 250100, China
- Junke Zhu
- School of Agricultural Engineering and Food Science, Shandong University of Technology, Zibo 255000, China
- Dong Ji
- Mechanical and Electronic Engineering College, Shandong Agriculture and Engineering University, Jinan 250100, China
- Xijie Liao
- Mechanical and Electronic Engineering College, Shandong Agriculture and Engineering University, Jinan 250100, China
- Zhihuan Zhao
- Mechanical and Electronic Engineering College, Shandong Agriculture and Engineering University, Jinan 250100, China
2
Ambuj, Machavaram R. Neuromorphic computing spiking neural network edge detection model for content based image retrieval. Network (Bristol, England) 2024:1-31. [PMID: 38708841] [DOI: 10.1080/0954898x.2024.2348018]
Abstract
In contemporary times, content-based image retrieval (CBIR) techniques have gained widespread acceptance as a means for end-users to discern and extract specific image content from vast repositories. However, a substantial majority of CBIR studies continue to rely on linear methodologies such as gradient-based and derivative-based edge detection. This research explores the integration of bio-inspired Spiking Neural Network (SNN) based edge detection into CBIR. We introduce an innovative, computationally efficient SNN-based approach designed explicitly for CBIR applications, outperforming existing SNN models by reducing computational overhead by 2.5 times. The proposed SNN-based edge detection approach is incorporated into three distinct CBIR techniques, each employing a conventional edge detection method: Sobel, Canny, or image derivatives. Rigorous experimentation and evaluation are carried out on the Corel-10k dataset, a widely recognized benchmark in image analysis, and a crop weed dataset. Importantly, our findings underscore the enhanced performance of CBIR methodologies integrating the proposed SNN-based edge detection approach, with an average increase in mean precision exceeding 3%. This study demonstrates the utility of the proposed methodology in optimizing feature extraction, establishing its role in advancing edge-centric CBIR approaches.
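The paper's SNN model is not reproduced here, but the general idea (each pixel's "neuron" integrates local intensity contrast as a membrane potential and emits a spike, i.e. an edge, once a threshold is crossed) can be caricatured in a few lines. A toy sketch only, not the authors' method; the threshold and the two-neighbor potential are arbitrary assumptions:

```python
def snn_edge_map(img, threshold=60):
    """Toy spiking-style edge detector on a 2D grayscale image.

    Each 'neuron' accumulates the absolute intensity difference to its
    right and lower neighbors (its membrane potential) and spikes (1)
    when the potential crosses the threshold, else stays silent (0).
    """
    h, w = len(img), len(img[0])
    spikes = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            potential = (abs(img[y][x] - img[y][x + 1])
                         + abs(img[y][x] - img[y + 1][x]))
            if potential >= threshold:
                spikes[y][x] = 1  # neuron fires: edge detected here
    return spikes
```

The appeal over Sobel/Canny in this setting is that spikes are binary events, so downstream feature extraction can be sparse and cheap.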
Affiliation(s)
- Ambuj
- Agricultural and Food Engineering Department, Indian Institute of Technology Kharagpur, Kharagpur, West Bengal, India
- Rajendra Machavaram
- Agricultural and Food Engineering Department, Indian Institute of Technology Kharagpur, Kharagpur, West Bengal, India
3
Genze N, Vahl WK, Groth J, Wirth M, Grieb M, Grimm DG. Manually annotated and curated Dataset of diverse Weed Species in Maize and Sorghum for Computer Vision. Sci Data 2024; 11:109. [PMID: 38263173] [PMCID: PMC10805845] [DOI: 10.1038/s41597-024-02945-6]
Abstract
Sustainable weed management strategies are critical to feeding the world's population while preserving ecosystems and biodiversity. Therefore, site-specific weed control strategies based on automation are needed to reduce the additional time and effort required for weeding. Machine vision-based methods appear to be a promising approach for weed detection, but require high quality data on the species in a specific agricultural area. Here we present a dataset, the Moving Fields Weed Dataset (MFWD), which captures the growth of 28 weed species commonly found in sorghum and maize fields in Germany. A total of 94,321 images were acquired in a fully automated, high-throughput phenotyping facility to track over 5,000 individual plants at high spatial and temporal resolution. A rich set of manually curated ground truth information is also provided, which can be used not only for plant species classification, object detection and instance segmentation tasks, but also for multiple object tracking.
Grants
- G2/N/19/13 Bayerisches Staatsministerium für Ernährung, Landwirtschaft und Forsten (Bavarian Ministry of Food, Agriculture and Forestry)
Affiliation(s)
- Nikita Genze
- Technical University of Munich, TUM Campus Straubing for Biotechnology and Sustainability, Bioinformatics, Schulgasse 22, 94315, Straubing, Germany
- Weihenstephan-Triesdorf University of Applied Sciences, Bioinformatics, Petersgasse 18, 94315, Straubing, Germany
- Wouter K Vahl
- Institute for Crop Science and Plant Breeding, Bavarian State Research Center for Agriculture, Am Gereuth 6, 85354, Freising, Germany
- Jennifer Groth
- Institute for Crop Science and Plant Breeding, Bavarian State Research Center for Agriculture, Am Gereuth 6, 85354, Freising, Germany
- Maximilian Wirth
- Technical University of Munich, TUM Campus Straubing for Biotechnology and Sustainability, Bioinformatics, Schulgasse 22, 94315, Straubing, Germany
- Weihenstephan-Triesdorf University of Applied Sciences, Bioinformatics, Petersgasse 18, 94315, Straubing, Germany
- Michael Grieb
- Technology and Support Centre in the Centre of Excellence for Renewable Resources (TFZ), Schulgasse 18, 94315, Straubing, Germany
- Dominik G Grimm
- Technical University of Munich, TUM Campus Straubing for Biotechnology and Sustainability, Bioinformatics, Schulgasse 22, 94315, Straubing, Germany
- Weihenstephan-Triesdorf University of Applied Sciences, Bioinformatics, Petersgasse 18, 94315, Straubing, Germany
- Technical University of Munich, TUM School of Computation, Information and Technology (CIT), Boltzmannstr. 3, 85748, Garching, Germany
4
Yordanov M, d'Andrimont R, Martinez-Sanchez L, Lemoine G, Fasbender D, van der Velde M. Crop Identification Using Deep Learning on LUCAS Crop Cover Photos. Sensors (Basel) 2023; 23:6298. [PMID: 37514593] [PMCID: PMC10383911] [DOI: 10.3390/s23146298]
Abstract
Massive, high-quality in situ data are essential for Earth-observation-based agricultural monitoring. However, field surveying requires considerable organizational effort and expense. Using computer vision to recognize crop types on geo-tagged photos could be a game changer, allowing timely and accurate crop-specific information to be provided. This study presents the first use of the largest multi-year set of labelled close-up in situ photos systematically collected across the European Union by the Land Use Cover Area frame Survey (LUCAS). Benefiting from this unique in situ dataset, the study benchmarks computer vision models for recognizing major crops on close-up photos distributed spatially and through time between 2006 and 2018, in a context relevant to practical agricultural policy. The methodology uses crop calendars from various sources to ascertain the mature stage of the crop, an extensive hyper-parameterization of MobileNet from random parameter initialization, and techniques from information theory to carry out more accurate post-processing filtering of results. The work produced a dataset of 169,460 images of mature crops for 12 classes, of which 15,876 were manually selected as a clean sample free of foreign objects or unfavorable conditions. The best-performing model achieved a macro F1 (M-F1) of 0.75 on an imbalanced test dataset of 8642 photos. Using metrics from information theory, namely the equivalence reference probability, resulted in an increase of 6%. The most unfavorable conditions for taking such images, across all crop classes, were found to be too early or too late in the season. The proposed methodology shows that an M-F1 of 0.82 can be achieved for labelling the 12 major European crops using minimal auxiliary data beyond the images themselves.
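Macro F1, the headline metric above, is the unweighted mean of per-class F1 scores, so rare crop classes count as much as common ones on the imbalanced test set. A minimal sketch of the computation (illustrative only, not the authors' evaluation code):

```python
def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores over all observed classes."""
    classes = set(y_true) | set(y_pred)
    f1s = []
    for c in classes:
        # Per-class confusion counts
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        # Harmonic mean of precision and recall (0 when both are 0)
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        f1s.append(f1)
    return sum(f1s) / len(f1s)
```

Because every class contributes equally, a model that ignores minority crops is penalized even if its overall accuracy looks high.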
Affiliation(s)
- Guido Lemoine
- European Commission, Joint Research Centre (JRC), 21027 Ispra, Italy
- Dominique Fasbender
- European Commission, Joint Research Centre (JRC), 21027 Ispra, Italy
- Walloon Institute of Evaluation, Foresight and Statistics (IWEPS), 5001 Namur, Belgium
5
Ripanda A, Luanda A, Sule KS, Mtabazi GS, Makangara JJ. Galinsoga parviflora (Cav.): A comprehensive review on ethnomedicinal, phytochemical and pharmacological studies. Heliyon 2023; 9:e13517. [PMID: 36846665] [PMCID: PMC9946856] [DOI: 10.1016/j.heliyon.2023.e13517]
Abstract
Galinsoga parviflora (Cav.) is a member of the Asteraceae family traditionally used for the treatment of various ailments such as malaria, flu, cold, colorectal cancer, liver problems, and inflammation. The medicinal properties of G. parviflora are attributed to various secondary metabolites, including flavonoids, saponins, terpenoids, and tannins. The literature survey revealed that G. parviflora possesses several pharmacological properties, including antibacterial, antifungal, antioxidant, and antidiabetic activity. This review systematically discusses the potential of G. parviflora for managing medical conditions. The information was collected from online databases including Google Scholar, ScienceDirect, Springer, Web of Science, Plants of the World Online, and PubMed. Ethnomedicinal uses, phytochemistry, and pharmacological activities are discussed extensively, and potential benefits, challenges, and future opportunities are presented.
6
Wang P, Tang Y, Luo F, Wang L, Li C, Niu Q, Li H. Weed25: A deep learning dataset for weed identification. Frontiers in Plant Science 2022; 13:1053329. [PMID: 36531369] [PMCID: PMC9748680] [DOI: 10.3389/fpls.2022.1053329]
Abstract
Weed suppression is an important factor affecting crop yields. Precise identification of weed species will contribute to automatic weeding by enabling application of the proper herbicides and determination of hoeing position and depth for specific plants, while reducing crop injury. However, the lack of datasets of weeds in the field has limited the application of deep learning techniques in weed management. This paper presents a dataset of weeds in fields, Weed25, which contains 14,035 images of 25 different weed species. Both monocot and dicot weed image resources are included, and weed images at different growth stages were also recorded. Several common deep learning detection models (YOLOv3, YOLOv5, and Faster R-CNN) were applied for weed identification model training using this dataset. The results showed that the average detection accuracy under the same training parameters was 91.8%, 92.4%, and 92.15%, respectively. This indicates that Weed25 could be an effective training resource for further development of in-field real-time weed identification models. The dataset is available at https://pan.baidu.com/s/1rnUoDm7IxxmX1n1LmtXNXw; the password is rn5h.
Affiliation(s)
- Pei Wang
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- Key Laboratory of Modern Agricultural Equipment and Technology (Jiangsu University), Ministry of Education, School of Agricultural Engineering, Jiangsu University, Zhenjiang, China
- Interdisciplinary Research Center for Agriculture Green Development in Yangtze River Basin, Southwest University, Chongqing, China
- Yin Tang
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- Fan Luo
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- Lihong Wang
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- Chengsong Li
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- Qi Niu
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- Hui Li
- Key Laboratory of Agricultural Equipment for Hilly and Mountain Areas, College of Engineering and Technology, Southwest University, Chongqing, China
- National Citrus Engineering Research Center, Chinese Academy of Agricultural Sciences and Southwest University, Chongqing, China
7
Yağanoğlu M. Hepatitis C virus data analysis and prediction using machine learning. Data Knowl Eng 2022. [DOI: 10.1016/j.datak.2022.102087]
8
Xu K, Jiang Z, Liu Q, Xie Q, Zhu Y, Cao W, Ni J. Multi-modal and multi-view image dataset for weeds detection in wheat field. Frontiers in Plant Science 2022; 13:936748. [PMID: 36072331] [PMCID: PMC9443486] [DOI: 10.3389/fpls.2022.936748]
Affiliation(s)
- Ke Xu
- College of Agriculture, Nanjing Agricultural University, Nanjing, China
- National Engineering and Technology Center for Information Agriculture, Nanjing, China
- Engineering Research Center of Smart Agriculture, Ministry of Education, Nanjing, China
- Collaborative Innovation Center for Modern Crop Production Co-sponsored by Province and Ministry, Nanjing, China
- Zhijian Jiang
- College of Artificial Intelligence, Nanjing Agricultural University, Nanjing, China
- Qihang Liu
- College of Artificial Intelligence, Nanjing Agricultural University, Nanjing, China
- Qi Xie
- College of Agriculture, Nanjing Agricultural University, Nanjing, China
- National Engineering and Technology Center for Information Agriculture, Nanjing, China
- Engineering Research Center of Smart Agriculture, Ministry of Education, Nanjing, China
- Collaborative Innovation Center for Modern Crop Production Co-sponsored by Province and Ministry, Nanjing, China
- Yan Zhu
- College of Agriculture, Nanjing Agricultural University, Nanjing, China
- National Engineering and Technology Center for Information Agriculture, Nanjing, China
- Engineering Research Center of Smart Agriculture, Ministry of Education, Nanjing, China
- Collaborative Innovation Center for Modern Crop Production Co-sponsored by Province and Ministry, Nanjing, China
- Weixing Cao
- College of Agriculture, Nanjing Agricultural University, Nanjing, China
- National Engineering and Technology Center for Information Agriculture, Nanjing, China
- Engineering Research Center of Smart Agriculture, Ministry of Education, Nanjing, China
- Collaborative Innovation Center for Modern Crop Production Co-sponsored by Province and Ministry, Nanjing, China
- Jun Ni
- College of Agriculture, Nanjing Agricultural University, Nanjing, China
- National Engineering and Technology Center for Information Agriculture, Nanjing, China
- Engineering Research Center of Smart Agriculture, Ministry of Education, Nanjing, China
- Collaborative Innovation Center for Modern Crop Production Co-sponsored by Province and Ministry, Nanjing, China
9
Zhang W, Miao Z, Li N, He C, Sun T. Review of Current Robotic Approaches for Precision Weed Management. Current Robotics Reports 2022; 3:139-151. [PMID: 35891887] [PMCID: PMC9305686] [DOI: 10.1007/s43154-022-00086-5]
Abstract
Purpose of Review: The goal of this review is to provide an overview of current robotic approaches to precision weed management. This includes an investigation into applications within this field during the past 5 years, identifying which major technical areas currently preclude more widespread use, and which key topics will drive future development and utilisation.
Recent Findings: Studies combining computer vision with traditional machine learning and deep learning are driving progress in weed detection and robotic approaches to mechanical weeding. Integrating key technologies for perception, decision-making, and control, autonomous weeding robots are emerging quickly. These effectively save effort while reducing environmental pollution caused by pesticide use.
Summary: This review assesses different weed detection methods and weeding robots used in precision weed management and summarises the trends in this area in recent years. The limitations of current systems are discussed, and ideas for future research directions are proposed.
Affiliation(s)
- Wen Zhang
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Zhonghua Miao
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Nan Li
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Chuangxin He
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
- Teng Sun
- Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai University, Shangda Street No. 99, Baoshan District, Shanghai, China
10
Deep Neural Networks to Detect Weeds from Crops in Agricultural Environments in Real-Time: A Review. Remote Sensing 2021. [DOI: 10.3390/rs13214486]
Abstract
Automation, including machine learning technologies, is becoming increasingly crucial in agriculture to increase productivity. Machine vision is one of the most widely used branches of machine learning wherever advanced automation and control are required. The trend has shifted from classical image processing and machine learning techniques to modern artificial intelligence (AI) and deep learning (DL) methods. Based on large training datasets and pre-trained models, DL-based methods have proven more accurate than earlier traditional techniques. Machine vision has wide applications in agriculture, including the detection of weeds and pests in crops. Variation in lighting conditions, failures of transfer learning, and object occlusion constitute key challenges in this domain. Recently, DL has gained much attention due to its advantages in object detection, classification, and feature extraction. DL algorithms can automatically extract information from large amounts of data to model complex problems and are, therefore, suitable for detecting and classifying weeds and crops. We present a systematic review of AI-based systems to detect weeds, emphasizing recent trends in DL. Various DL methods are discussed to clarify their overall potential, usefulness, and performance. This study indicates that several limitations obstruct the widespread adoption of AI/DL in commercial applications, and recommendations for overcoming these challenges are summarized.
11
Danilevicz MF, Bayer PE, Nestor BJ, Bennamoun M, Edwards D. Resources for image-based high-throughput phenotyping in crops and data sharing challenges. Plant Physiology 2021; 187:699-715. [PMID: 34608963] [PMCID: PMC8561249] [DOI: 10.1093/plphys/kiab301]
Abstract
High-throughput phenotyping (HTP) platforms are capable of monitoring the phenotypic variation of plants through multiple types of sensors, such as red, green, and blue (RGB) cameras, hyperspectral sensors, and computed tomography, which can be associated with environmental and genotypic data. Because of the wide range of information provided, HTP datasets represent a valuable asset to characterize crop phenotypes. As HTP becomes widely employed with more tools and data being released, it is important that researchers are aware of these resources and how they can be applied to accelerate crop improvement. Researchers may exploit these datasets either for phenotype comparison or employ them as a benchmark to assess tool performance and to support the development of tools that are better at generalizing between different crops and environments. In this review, we describe the use of image-based HTP for yield prediction, root phenotyping, development of climate-resilient crops, detecting pathogen and pest infestation, and quantitative trait measurement. We emphasize the need for researchers to share phenotypic data, and offer a comprehensive list of available datasets to assist crop breeders and tool developers to leverage these resources in order to accelerate crop breeding.
Affiliation(s)
- Monica F. Danilevicz
- School of Biological Sciences and Institute of Agriculture, University of Western Australia, Perth, Western Australia 6009, Australia
- Philipp E. Bayer
- School of Biological Sciences and Institute of Agriculture, University of Western Australia, Perth, Western Australia 6009, Australia
- Benjamin J. Nestor
- School of Biological Sciences and Institute of Agriculture, University of Western Australia, Perth, Western Australia 6009, Australia
- Mohammed Bennamoun
- Department of Computer Science and Software Engineering, University of Western Australia, Perth, Western Australia 6009, Australia
- David Edwards
- School of Biological Sciences and Institute of Agriculture, University of Western Australia, Perth, Western Australia 6009, Australia
12
Roslim MHM, Juraimi AS, Che’Ya NN, Sulaiman N, Manaf MNHA, Ramli Z, Motmainna M. Using Remote Sensing and an Unmanned Aerial System for Weed Management in Agricultural Crops: A Review. Agronomy 2021; 11:1809. [DOI: 10.3390/agronomy11091809]
Abstract
Weeds are unwanted plants that reduce crop yields by competing for water, nutrients, light, space, and carbon dioxide, and they need to be controlled to meet future food production requirements. The integration of drones, artificial intelligence, and various sensors, including hyperspectral, multispectral, and RGB (red-green-blue), offers the possibility of better outcomes in managing weed problems. Most of the major and minor challenges caused by weed infestation can be addressed by implementing remote sensing systems in various agricultural tasks. Remote sensing is a multi-disciplinary science that draws on spectroscopy, optics, computing, photography, satellite technology, electronics, communication, and several other fields. Future challenges, including food security, sustainability, supply and demand, climate change, and herbicide resistance, can also be tackled by these technologies based on machine learning approaches. This review provides an overview of the potential and practical use of unmanned aerial vehicle and remote sensing techniques in weed management practices and discusses how they address future challenges.
13
Wu Z, Chen Y, Zhao B, Kang X, Ding Y. Review of Weed Detection Methods Based on Computer Vision. Sensors (Basel) 2021; 21:3647. [PMID: 34073867] [PMCID: PMC8197187] [DOI: 10.3390/s21113647]
Abstract
Weeds are one of the most important factors affecting agricultural production. The waste and pollution of the farmland ecological environment caused by full-coverage chemical herbicide spraying are becoming increasingly evident. With the continuous improvement in the agricultural production level, it is important to accurately distinguish crops from weeds and to spray precisely, only on weeds. However, precise spraying depends on accurately identifying and locating weeds and crops. In recent years, some scholars have used various computer vision methods to achieve this purpose. This review covers two classes of solution to the weed detection problem: traditional image-processing methods and deep learning-based methods. It provides an overview of weed detection methods in recent years, analyzes the advantages and disadvantages of existing methods, and introduces several related plant leaf and weed datasets as well as weeding machinery. Lastly, the problems and difficulties of existing weed detection methods are analyzed, and future research directions are discussed.
Affiliation(s)
- Zhangnan Wu
- Department of Information Science, Xi’an University of Technology, Xi’an 710048, China
- Yajun Chen
- Department of Information Science, Xi’an University of Technology, Xi’an 710048, China
- Bo Zhao
- Chinese Academy of Agricultural Mechanization Sciences, Beijing 100083, China
- Xiaobing Kang
- Department of Information Science, Xi’an University of Technology, Xi’an 710048, China
- Yuanyuan Ding
- Department of Information Science, Xi’an University of Technology, Xi’an 710048, China
14
Khaki S, Pham H, Han Y, Kuhl A, Kent W, Wang L. DeepCorn: A semi-supervised deep learning method for high-throughput image-based corn kernel counting and yield estimation. Knowl Based Syst 2021. [DOI: 10.1016/j.knosys.2021.106874]