1
Yang S, Zhou G, Feng Y, Zhang J, Jia Z. SRNet-YOLO: A model for detecting tiny and very tiny pests in cotton fields based on super-resolution reconstruction. Frontiers in Plant Science 2024; 15:1416940. [PMID: 39184581] [PMCID: PMC11341441] [DOI: 10.3389/fpls.2024.1416940] [Citation(s) in RCA: 0] [Received: 04/13/2024] [Accepted: 07/18/2024] [Indexed: 08/27/2024]
Abstract
Introduction Effective pest management is important during the natural growth phases of cotton in the field. Cotton fields are infested with "tiny pests" (smaller than 32×32 pixels) and "very tiny pests" (smaller than 16×16 pixels) during growth, which common object detection models struggle to detect accurately, undermining sound agricultural decisions. Methods In this study, we propose SRNet-YOLO, a framework for detecting "tiny pests" and "very tiny pests" in wild cotton fields. SRNet-YOLO comprises a YOLOv8 feature extraction module, a feature-map super-resolution reconstruction module (FM-SR), and a fusion mechanism based on BiFormer attention (BiFormerAF). Specifically, the FM-SR module operates at the feature-map level to recover important detail: it reconstructs the P5-layer feature map to the size of the P3 layer. The BiFormerAF module then fuses this reconstructed layer with the P3 layer, which greatly improves detection performance and compensates for features that may be lost during reconstruction. Additionally, to validate our method on "tiny pests" and "very tiny pests" in cotton fields, we built a large dataset, Cotton-Yellow-Sticky-2023, of pests collected on yellow sticky traps. Results Comprehensive experiments demonstrate that the proposed framework achieves exceptional performance. Our method reached 78.2% mAP on the "tiny pests" test set, surpassing leading detection models such as YOLOv3, YOLOv5, YOLOv7 and YOLOv8 by 6.9%, 7.2%, 5.7% and 4.1%, respectively. On "very tiny pests" it reached 57% mAP, 32.2% higher than YOLOv8. To verify generalizability, we also evaluated the model on the (low-resolution) Yellow Sticky Traps dataset, where it still achieved the highest mAP at 92.8%.
Discussion These experimental results indicate that our model not only helps solve the problem of tiny pests in cotton fields, but also generalizes well and can be used to detect tiny pests in other crops.
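The FM-SR/BiFormerAF pipeline described in this abstract can be sketched at the shape level: reconstruct the coarse P5 feature map to P3 resolution, then fuse the two maps. The sketch below is illustrative only; it uses nearest-neighbour upsampling and a simple sigmoid gate in place of the paper's learned super-resolution and BiFormer attention, and all array sizes are assumptions.

```python
import numpy as np

def upsample_nearest(fmap, factor):
    """Nearest-neighbour upsampling of a (C, H, W) feature map
    (a stand-in for the paper's learned FM-SR reconstruction)."""
    return fmap.repeat(factor, axis=1).repeat(factor, axis=2)

def gated_fuse(p3, p5_up):
    """Fuse two same-sized maps with a sigmoid gate derived from their sum
    (a toy substitute for the BiFormer-attention fusion)."""
    gate = 1.0 / (1.0 + np.exp(-(p3 + p5_up).mean(axis=0, keepdims=True)))
    return gate * p3 + (1.0 - gate) * p5_up

p3 = np.random.rand(8, 32, 32)   # P3 map: fine resolution (stride 8)
p5 = np.random.rand(8, 8, 8)     # P5 map: coarse resolution (stride 32)
p5_up = upsample_nearest(p5, 4)  # "reconstruct" P5 to P3 size
fused = gated_fuse(p3, p5_up)
print(fused.shape)               # (8, 32, 32)
```

The point of the shape logic is that after reconstruction the semantically rich P5 features can be combined element-wise with the spatially fine P3 features, which is what makes sub-32-pixel objects detectable at all.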
Affiliation(s)
- Sen Yang
  - School of Computer Science and Technology, Xinjiang University, Urumqi, China
  - The Key Laboratory of Signal Detection and Processing, Xinjiang Uygur Autonomous Region, Xinjiang University, Urumqi, China
- Gang Zhou
  - School of Computer Science and Technology, Xinjiang University, Urumqi, China
  - The Key Laboratory of Signal Detection and Processing, Xinjiang Uygur Autonomous Region, Xinjiang University, Urumqi, China
- Yuwei Feng
  - School of Computer Science and Technology, Xinjiang University, Urumqi, China
  - The Key Laboratory of Signal Detection and Processing, Xinjiang Uygur Autonomous Region, Xinjiang University, Urumqi, China
- Jiang Zhang
  - School of Computer Science and Technology, Xinjiang University, Urumqi, China
  - The Key Laboratory of Signal Detection and Processing, Xinjiang Uygur Autonomous Region, Xinjiang University, Urumqi, China
- Zhenhong Jia
  - School of Computer Science and Technology, Xinjiang University, Urumqi, China
  - The Key Laboratory of Signal Detection and Processing, Xinjiang Uygur Autonomous Region, Xinjiang University, Urumqi, China
2
Sheard JK, Adriaens T, Bowler DE, Büermann A, Callaghan CT, Camprasse ECM, Chowdhury S, Engel T, Finch EA, von Gönner J, Hsing PY, Mikula P, Rachel Oh RY, Peters B, Phartyal SS, Pocock MJO, Wäldchen J, Bonn A. Emerging technologies in citizen science and potential for insect monitoring. Philos Trans R Soc Lond B Biol Sci 2024; 379:20230106. [PMID: 38705194] [PMCID: PMC11070260] [DOI: 10.1098/rstb.2023.0106] [Citation(s) in RCA: 2] [Received: 10/29/2023] [Accepted: 03/29/2024] [Indexed: 05/07/2024]
Abstract
Emerging technologies are increasingly employed in environmental citizen science projects. This integration offers benefits and opportunities for scientists and participants alike. Citizen science can support large-scale, long-term monitoring of species occurrences, behaviour and interactions. At the same time, technologies can foster participant engagement, regardless of pre-existing taxonomic expertise or experience, and permit new types of data to be collected. Yet, technologies may also create challenges by potentially increasing financial costs, necessitating technological expertise or demanding training of participants. Technology could also reduce people's direct involvement and engagement with nature. In this perspective, we discuss how current technologies have spurred an increase in citizen science projects and how the implementation of emerging technologies in citizen science may enhance scientific impact and public engagement. We show how technology can act as (i) a facilitator of current citizen science and monitoring efforts, (ii) an enabler of new research opportunities, and (iii) a transformer of science, policy and public participation, but could also become (iv) an inhibitor of participation, equity and scientific rigour. Technology is developing fast and promises to provide many exciting opportunities for citizen science and insect monitoring, but while we seize these opportunities, we must remain vigilant against potential risks. This article is part of the theme issue 'Towards a toolkit for global insect biodiversity monitoring'.
Affiliation(s)
- Julie Koch Sheard
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - Institute of Biodiversity, Friedrich Schiller University Jena, Dornburger Straße 159, 07743 Jena, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Tim Adriaens
  - Research Institute for Nature and Forest (INBO), Havenlaan 88 bus 73, 1000 Brussels, Belgium
- Diana E. Bowler
  - UK Centre for Ecology & Hydrology, Wallingford, Oxfordshire, OX10 8BB, UK
- Andrea Büermann
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Corey T. Callaghan
  - Department of Wildlife Ecology and Conservation, Fort Lauderdale Research and Education Center, University of Florida, FL 33314, USA
- Elodie C. M. Camprasse
  - School of Life and Environmental Sciences, Deakin University, Melbourne Burwood Campus, 221 Burwood Highway, Burwood, Victoria 3125, Australia
- Shawan Chowdhury
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - Institute of Biodiversity, Friedrich Schiller University Jena, Dornburger Straße 159, 07743 Jena, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Thore Engel
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - Institute of Biodiversity, Friedrich Schiller University Jena, Dornburger Straße 159, 07743 Jena, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Elizabeth A. Finch
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - Institute of Biodiversity, Friedrich Schiller University Jena, Dornburger Straße 159, 07743 Jena, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Julia von Gönner
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - Institute of Biodiversity, Friedrich Schiller University Jena, Dornburger Straße 159, 07743 Jena, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Pen-Yuan Hsing
  - Faculty of Life Sciences, University of Bristol, 12a Priory Road, Bristol BS8 1TU, UK
- Peter Mikula
  - TUM School of Life Sciences, Ecoclimatology, Technical University of Munich, Hans-Carl-von-Carlowitz-Platz 2, 85354 Freising, Germany
  - Institute for Advanced Study, Technical University of Munich, Lichtenbergstraße 2a, 85748 Garching, Germany
  - Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcká 129, 16500 Prague, Czech Republic
- Rui Ying Rachel Oh
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Birte Peters
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
- Shyam S. Phartyal
  - School of Ecology and Environment Studies, Nalanda University, Rajgir 803116, India
- Jana Wäldchen
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
  - Department of Biogeochemical Integration, Max Planck Institute for Biogeochemistry, Hans-Knöll-Straße 10, 07745 Jena, Germany
- Aletta Bonn
  - Department of Ecosystem Services, Helmholtz Centre for Environmental Research - UFZ, Permoserstraße 15, 04318 Leipzig, Germany
  - Institute of Biodiversity, Friedrich Schiller University Jena, Dornburger Straße 159, 07743 Jena, Germany
  - German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Puschstraße 4, 04103 Leipzig, Germany
3
Kalfas I, De Ketelaere B, Bunkens K, Saeys W. Towards automatic insect monitoring on witloof chicory fields using sticky plate image analysis. Ecol Inform 2023; 75:102037. [PMID: 37397435] [PMCID: PMC10295114] [DOI: 10.1016/j.ecoinf.2023.102037] [Citation(s) in RCA: 0] [Received: 09/16/2022] [Revised: 02/20/2023] [Accepted: 02/20/2023] [Indexed: 03/06/2023]
Abstract
Context Sticky trap catches of agricultural pests can be used for early hotspot detection, identification, and estimation of pest presence in greenhouses or in the field. However, producing and analyzing catch results manually requires substantial time and effort. As a result, much research has gone into creating efficient techniques for remotely monitoring possible infestations. A considerable number of these studies use Artificial Intelligence (AI) to analyze the acquired data and focus on performance metrics for various model architectures. Less emphasis, however, has been devoted to testing the trained models to investigate how well they would perform under practical, in-field conditions. Objective In this study, we showcase an automatic and reliable computational method for monitoring insects in witloof chicory fields, while shifting the focus to the challenges of compiling and using a realistic insect image dataset containing insects at common taxonomy levels. Methods To achieve this, we collected, imaged, and annotated 731 sticky plates - containing 74,616 bounding boxes - to train a YOLOv5 object detection model, concentrating on two pest insects (chicory leaf-miners and woolly aphids) and two of their predatory counterparts (ichneumon wasps and grass flies). To better understand the object detection model's actual field performance, it was validated in a practical manner by splitting our image data at the sticky-plate level. Results and conclusions According to the experimental findings, the average mAP score over all dataset classes was 0.76. For the pest species and their corresponding predators, high mAP values of 0.73 and 0.86 were obtained, respectively. Additionally, the model accurately forecasted the presence of pests when presented with unseen sticky plate images from the test set.
Significance These findings demonstrate the feasibility of AI-powered pest monitoring in the field for real-world applications and open opportunities for implementing pest monitoring in witloof chicory fields with minimal human intervention.
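Splitting "at the sticky-plate level" means every image of one physical plate lands in the same fold, so near-duplicate views of the same plate cannot leak between train and test. A minimal sketch of such a group-wise split; the field names (plate_id, img) are hypothetical illustrations, not from the paper:

```python
import random

def split_by_plate(annotations, test_frac=0.2, seed=0):
    """Split image annotations so that all images of one sticky plate
    end up in the same fold, avoiding train/test leakage."""
    plates = sorted({a["plate_id"] for a in annotations})
    rng = random.Random(seed)
    rng.shuffle(plates)
    n_test = max(1, int(len(plates) * test_frac))
    test_plates = set(plates[:n_test])
    train = [a for a in annotations if a["plate_id"] not in test_plates]
    test = [a for a in annotations if a["plate_id"] in test_plates]
    return train, test

# Toy data: 10 plates, 3 images each.
anns = [{"plate_id": p, "img": f"{p}_{i}.jpg"} for p in range(10) for i in range(3)]
train, test = split_by_plate(anns)
train_plates = {a["plate_id"] for a in train}
test_plates = {a["plate_id"] for a in test}
print(len(train), len(test), train_plates & test_plates)  # 24 6 set()
```

A random split at the image level would likely put views of the same plate in both folds and inflate the measured mAP; the plate-level split is what makes the reported 0.76 a realistic field estimate.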
Affiliation(s)
- Ioannis Kalfas
  - KU Leuven, Department of Biosystems, MeBioS, Faculty of Bioscience Engineering, Kasteelpark Arenberg 30, Box 2456, Leuven B-3001, Belgium
- Bart De Ketelaere
  - KU Leuven, Department of Biosystems, MeBioS, Faculty of Bioscience Engineering, Kasteelpark Arenberg 30, Box 2456, Leuven B-3001, Belgium
- Wouter Saeys
  - KU Leuven, Department of Biosystems, MeBioS, Faculty of Bioscience Engineering, Kasteelpark Arenberg 30, Box 2456, Leuven B-3001, Belgium
4
Zhang W, Xia X, Zhou G, Du J, Chen T, Zhang Z, Ma X. Research on the identification and detection of field pests in the complex background based on the rotation detection algorithm. Frontiers in Plant Science 2022; 13:1011499. [PMID: 36582640] [PMCID: PMC9792778] [DOI: 10.3389/fpls.2022.1011499] [Citation(s) in RCA: 0] [Received: 08/04/2022] [Accepted: 11/15/2022] [Indexed: 06/17/2023]
Abstract
As a large agricultural country with a large population, China has a significant annual demand for food. Crop yields are affected by various natural disasters every year, and insect pests are among the most important factors. The key to solving this problem is to detect and identify pests, and to provide timely feedback, at the initial stage of infestation. In this paper, using pest images captured by pest detection lamps against complex natural backgrounds, with categories labeled by agricultural experts, we construct the pest rotation detection dataset (PRD21) covering different natural environments. A comparative study of image recognition is carried out across different object detection algorithms. The experiments show that the best rotation detection algorithm improves mean Average Precision by 18.5% over the best horizontal detection algorithm, reaching 78.5%. Regarding Recall, the best rotation detection algorithm achieves 94.7%, which is 7.4% higher than horizontal detection. In terms of detection speed, rotation detection takes only 0.163 s per image, and the model is 66.54 MB, so it can be embedded in mobile devices for fast detection. These experiments demonstrate that rotation detection performs well on pest detection and recognition, can bring new application value and ideas, provides new methods for plant protection, and can help improve grain yield.
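Why rotated boxes help: many pests are thin and elongated, so a tilted, tight rotated rectangle covers far less background than the horizontal box that must enclose it. A toy illustration of that geometry (not the paper's algorithm); the box dimensions are made up:

```python
import math

def rotated_rect_corners(cx, cy, w, h, angle_deg):
    """Corner points of a rotated rectangle, the box format used in
    rotation detection (vs. the axis-aligned (x1, y1, x2, y2) format)."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    corners = []
    for dx, dy in ((-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)):
        corners.append((cx + dx * cos_a - dy * sin_a, cy + dx * sin_a + dy * cos_a))
    return corners

def aabb_area(pts):
    """Area of the axis-aligned bounding box enclosing the points."""
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

# A thin pest tilted 45 degrees: the rotated box is tight (area 100*2 = 200),
# while the enclosing horizontal box is mostly background.
pts = rotated_rect_corners(0, 0, 100, 2, 45)
print(100 * 2, aabb_area(pts))  # rotated area 200 vs. a far larger horizontal box
```

The wasted background in the horizontal box dilutes the IoU signal during training and matching, which is consistent with the mAP and Recall gains the abstract reports for rotation detection.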
Affiliation(s)
- Wei Zhang
  - Institute of Physical Science and Information Technology, Anhui University, Hefei, China
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
- Xulu Xia
  - Institute of Physical Science and Information Technology, Anhui University, Hefei, China
- Guotao Zhou
  - Technology Research and Development Center, Henan Yunfei Technology Development Co. LTD, Henan, China
- Jianming Du
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
- Tianjiao Chen
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
- Zhengyong Zhang
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
- Xiangyang Ma
  - Harvesting and Processing Department, Liaoning Provincial Institute of Agricultural Mechanization, Shenyang, China
5
Wen C, Chen H, Ma Z, Zhang T, Yang C, Su H, Chen H. Pest-YOLO: A model for large-scale multi-class dense and tiny pest detection and counting. Frontiers in Plant Science 2022; 13:973985. [PMID: 36570910] [PMCID: PMC9783619] [DOI: 10.3389/fpls.2022.973985] [Citation(s) in RCA: 7] [Received: 06/20/2022] [Accepted: 09/27/2022] [Indexed: 06/17/2023]
Abstract
Frequent outbreaks of agricultural pests can severely reduce crop production and restrict agricultural output, so automatic monitoring and precise recognition of crop pests have high practical value in agricultural production. In recent years, pest recognition and detection have improved rapidly with the development of deep learning-based methods. Although progress has been made in deep learning-based pest detection and identification, many problems remain when such systems are applied in field environments. This work presents Pest-YOLO, a detector for multi-category, dense, and tiny pests. First, the idea of focal loss is introduced into the loss function, using weight redistribution to increase attention on hard samples. This relieves the hard-sample problems arising from the uneven distribution of pest populations in a dataset and the low-discrimination features of small pests. Next, a non-Intersection-over-Union bounding box selection and suppression algorithm, the confluence strategy, is used. The confluence strategy largely eliminates the detection errors and omissions caused by occlusion, adhesion, and missing labels among tiny, dense pest individuals. The proposed Pest-YOLO model is verified on a large-scale pest image dataset, Pest24, which includes more than 20k images with over 190k pests labeled by agricultural experts and categorized into 24 classes. Experimental results show that Pest-YOLO obtains 69.59% mAP and 77.71% mRecall on the 24-class pest dataset, which is 5.32% and 28.12% higher, respectively, than the benchmark YOLOv4 model. Meanwhile, our proposed model is superior to several other state-of-the-art methods, including the SSD, RetinaNet, Faster RCNN, YOLOv3, YOLOv4, YOLOv5s, YOLOv5m, YOLOX, DETR, TOOD, YOLOv3-W, and AF-RCNN detectors.
The code of the proposed algorithm is available at: https://github.com/chr-secrect/Pest-YOLO.
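The focal-loss idea mentioned in this abstract (down-weighting easy examples so training concentrates on hard ones) follows the well-known formulation of Lin et al. (2017). A minimal scalar sketch of that formula, not the authors' exact multi-class implementation:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).
    The (1 - p_t)^gamma factor shrinks the loss of well-classified
    (easy) examples, shifting gradient weight to hard samples such as
    rare or low-contrast pest classes."""
    p_t = p if y == 1 else 1.0 - p
    a_t = alpha if y == 1 else 1.0 - alpha
    return -a_t * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy positive (p = 0.9) contributes far less than a hard one (p = 0.1).
easy = focal_loss(0.9, 1)
hard = focal_loss(0.1, 1)
print(easy < hard)  # True
```

With gamma = 0 this reduces to weighted cross-entropy; raising gamma increases how aggressively easy samples are suppressed, which is the "weight redistribution" the abstract refers to.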
Affiliation(s)
- Changji Wen
  - College of Information and Technology, Jilin Agricultural University, Changchun, China
  - Institute for the Smart Agriculture, Jilin Agricultural University, Changchun, China
- Hongrui Chen
  - College of Information and Technology, Jilin Agricultural University, Changchun, China
- Zhenyu Ma
  - College of Information and Technology, Jilin Agricultural University, Changchun, China
- Tian Zhang
  - College of Information and Technology, Jilin Agricultural University, Changchun, China
- Ce Yang
  - College of Food, Agricultural and Natural Resource Sciences, University of Minnesota, Twin Cities, MN, United States
- Hengqiang Su
  - College of Information and Technology, Jilin Agricultural University, Changchun, China
  - Institute for the Smart Agriculture, Jilin Agricultural University, Changchun, China
- Hongbing Chen
  - College of Information and Technology, Jilin Agricultural University, Changchun, China
  - Institute for the Smart Agriculture, Jilin Agricultural University, Changchun, China
6
Rustia DJA, Chiu LY, Lu CY, Wu YF, Chen SK, Chung JY, Hsu JC, Lin TT. Towards intelligent and integrated pest management through an AIoT-based monitoring system. Pest Management Science 2022; 78:4288-4302. [PMID: 35716088] [DOI: 10.1002/ps.7048] [Citation(s) in RCA: 1] [Received: 01/10/2022] [Revised: 06/15/2022] [Accepted: 06/15/2022] [Indexed: 06/15/2023]
Abstract
BACKGROUND The main bottleneck in facilitating integrated pest management (IPM) is the unavailability of reliable and immediate crop damage data. Without sufficient insect pest and plant disease information, farm managers are unable to make proper decisions to prevent crop damage. This work presents how an integrated system was able to drive farm managers towards sustainable and data-driven IPM. RESULTS A system called the Intelligent and Integrated Pest and Disease Management (I2PDM) system was developed. Edge computing devices were built to automatically detect and recognize major greenhouse insect pests, such as thrips (Frankliniella intonsa, Thrips hawaiiensis, and Thrips tabaci) and whiteflies (Bemisia argentifolii and Trialeurodes vaporariorum), to measure environmental conditions including temperature, humidity, and light intensity, and to send the data to a remote server. The system has been installed in greenhouses producing tomatoes and orchids, gathering long-term spatiotemporal insect pest counts and environmental data for as long as 1368 days. The findings demonstrate that the proposed system supported the farm managers in performing IPM-related tasks, with significant yearly reductions in insect pest counts of up to 50.7% observed on the farms. CONCLUSION Novel and efficient strategies can be achieved using an intelligent IPM system, opening IPM to potential benefits that cannot easily be realized with a traditional IPM program. This is the first work to report the development of an intelligent strategic model for IPM based on actual, automatically collected long-term data. The work presented herein can encourage farm managers, researchers, experts, and industries to work together in implementing sustainable and data-driven IPM. © 2022 Society of Chemical Industry.
Affiliation(s)
- Lin-Ya Chiu
  - Department of Biomechatronics Engineering, National Taiwan University, Taipei, Taiwan, ROC
- Chen-Yi Lu
  - Department of Biomechatronics Engineering, National Taiwan University, Taipei, Taiwan, ROC
- Ya-Fang Wu
  - Tainan District Agricultural Research and Extension Station, Tainan, Taiwan, ROC
- Sheng-Kuan Chen
  - Tainan District Agricultural Research and Extension Station, Tainan, Taiwan, ROC
- Jui-Yung Chung
  - Tainan District Agricultural Research and Extension Station, Tainan, Taiwan, ROC
- Ju-Chun Hsu
  - Department of Entomology, National Taiwan University, Taipei, Taiwan, ROC
- Ta-Te Lin
  - Department of Biomechatronics Engineering, National Taiwan University, Taipei, Taiwan, ROC
7
She J, Zhan W, Hong S, Min C, Dong T, Huang H, He Z. A method for automatic real-time detection and counting of fruit fly pests in orchards by trap bottles via convolutional neural network with attention mechanism added. Ecol Inform 2022. [DOI: 10.1016/j.ecoinf.2022.101690] [Citation(s) in RCA: 5] [Indexed: 11/17/2022]
8
Li W, Yang Z, Lv J, Zheng T, Li M, Sun C. Detection of Small-Sized Insects in Sticky Trapping Images Using Spectral Residual Model and Machine Learning. Frontiers in Plant Science 2022; 13:915543. [PMID: 35837447] [PMCID: PMC9274131] [DOI: 10.3389/fpls.2022.915543] [Citation(s) in RCA: 0] [Received: 04/08/2022] [Accepted: 05/24/2022] [Indexed: 06/15/2023]
Abstract
One fundamental component of integrated pest management (IPM) is field monitoring: growers use information gathered from scouting to choose appropriate control tactics. Whitefly (Bemisia tabaci) and thrips (Frankliniella occidentalis) are the two most prominent pests in greenhouses of northern China. Traditionally, growers estimate the populations of these pests by counting insects caught on sticky traps, a challenging and extremely time-consuming task. To alleviate this situation, this study proposes an automated detection approach to meet the need for continuous monitoring of pests under greenhouse conditions. Candidate targets were first located using a spectral residual model, and different color features were then extracted. Finally, whiteflies and thrips were identified using a support vector machine classifier with an accuracy of 93.9 and 89.9%, a true positive rate of 93.1 and 80.1%, and a false positive rate of 9.9 and 12.3%, respectively. Identification performance was further tested via comparison between manual and automatic counting, with coefficients of determination, R2, of 0.9785 and 0.9582. The results show that the proposed method provides performance comparable to previous handcrafted-feature-based methods and, unlike deep learning-based methods, does not require high-performance hardware. This study demonstrates the potential of developing a vision-based identification system to facilitate rapid gathering of information on the numbers of small-sized pests in greenhouse agriculture and to make reliable estimates of overall population density.
Affiliation(s)
- Wenyong Li
  - National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Zhankui Yang
  - National Engineering Research Center for Information Technology in Agriculture, Beijing, China
  - College of Computer Science and Technology, Beijing University of Technology, Beijing, China
- Jiawei Lv
  - National Engineering Research Center for Information Technology in Agriculture, Beijing, China
  - College of Information Science and Technology, Zhongkai University of Agriculture and Engineering, Guangzhou, China
- Tengfei Zheng
  - National Engineering Research Center for Information Technology in Agriculture, Beijing, China
  - College of Information, Shanghai Ocean University, Shanghai, China
- Ming Li
  - National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Chuanheng Sun
  - National Engineering Research Center for Information Technology in Agriculture, Beijing, China
10
Wang R, Li R, Chen T, Zhang J, Xie C, Qiu K, Chen P, Du J, Chen H, Shao F, Hu H, Liu H. An automatic system for pest recognition and forecasting. Pest Management Science 2022; 78:711-721. [PMID: 34672074] [DOI: 10.1002/ps.6684] [Citation(s) in RCA: 1] [Received: 06/12/2021] [Revised: 10/14/2021] [Accepted: 10/21/2021] [Indexed: 06/13/2023]
Abstract
BACKGROUND Pests cause significant damage to agricultural crops and reduce crop yields. Use of manual methods of pest forecasting for integrated pest management is labor-intensive and time-consuming. Here, we present an automatic system for monitoring pests in large fields, with the aim of replacing manual forecasting. The system comprises an automatic detection and counting system and a human-computer data statistical fitting system. Image data sets of the target pests from large fields are first input into the system. The number of pests in the image is then counted both manually and using the automatic system. Finally, a mapping relationship between counts obtained using the automated system and by agricultural experts is established using the statistical fitting system. RESULTS Trends in the pest-count curves produced using the manual and automated counting methods were very similar. To sample the number of pests for manual statistics, plants were shaken to transfer the pests from the plant to a plate. Hence, pests hiding within plant crevices were also sampled and included in the count, whereas the automatic method counted only the pests visible in the images. Therefore, the computer index threshold was much lower than the manual index threshold. However, the proposed system correctly reflected trends in pest numbers obtained using computer vision. CONCLUSION The experimental results demonstrate that our automatic pest-monitoring system can generate pest grades and can replace manual forecasting methods in large fields. © 2021 Society of Chemical Industry.
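The "statistical fitting system" above maps automated counts (which see only visible pests) onto expert manual counts (which include pests hidden in crevices). A minimal least-squares sketch of such a mapping; the numbers are made up for illustration and the paper does not publish its fitted coefficients:

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y ~ a*x + b: the simplest form of
    mapping machine-vision pest counts onto expert manual counts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Automated counts miss hidden pests, so manual counts come out roughly
# as a scaled-up version of the machine counts (illustrative data).
auto = [2, 5, 9, 14, 20]
manual = [5, 11, 19, 29, 41]
a, b = fit_linear(auto, manual)
print(round(a, 2), round(b, 2))  # 2.0 1.0
```

Once fitted, the mapping lets the system translate its lower computer-index readings into the manual-index scale used for pest grading, which is why the two thresholds in the abstract can differ while the trends still agree.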
Affiliation(s)
- Rujing Wang
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
  - University of Science and Technology of China, Hefei, China
- Rui Li
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
- Tianjiao Chen
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
  - University of Science and Technology of China, Hefei, China
- Jie Zhang
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
- Chengjun Xie
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
- Kun Qiu
  - Plant Protection Station of Anhui Province, Hefei, China
- Peng Chen
  - School of Internet, Anhui University, Hefei, China
- Jianming Du
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
- Hongbo Chen
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
  - University of Science and Technology of China, Hefei, China
- FangRong Shao
  - Anhui State Farms Longkang Farm Co., Ltd, Bengbu, China
- Haiying Hu
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
- Haiyun Liu
  - Institute of Intelligent Machines, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei, China
  - Intelligent Agriculture Engineering Laboratory of Anhui Province, Hefei, China
  - University of Science and Technology of China, Hefei, China
11
|
Classification and detection of insects from field images using deep learning for smart pest management: A systematic review. ECOL INFORM 2021. [DOI: 10.1016/j.ecoinf.2021.101460] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
12
Özcan GE, Tabak HŞ. Evaluation of electronic pheromone trap capture conditions for Ips sexdentatus with climatic and temporal factors. Environmental Monitoring and Assessment 2021; 193:625. [PMID: 34480221] [DOI: 10.1007/s10661-021-09402-6]
Abstract
Controlling forest pests to maintain the sustainability of forests and ecosystem balance is one of the concerns of modern forestry. In the evaluation of damage risks associated with forest pests, pheromone traps attract attention by providing early warnings. As these traps are developed in line with modern technology, more reliable data are obtained; these data are important in the identification and planning of pest management. In this study, a pheromone trap with an electronic control unit was tested under field conditions. The capture of adult Ips sexdentatus under natural conditions during the 103-day flying period was evaluated; 97.2% of the beetles captured in the trap were the target species. A comparison of the number of beetles recorded by the trap with manual counts revealed that the trap worked with an error margin of approximately 4%. However, no statistically significant difference was noted between these two counting methods. During the study, 59% of the total beetles were captured between May 27 and June 25. The average temperature during the capture period was 20.09 °C, the average humidity was 66%, and the average wind speed was 2.9 m/s. Of the captures, 73.9% occurred in the temperature range of 15-24.9 °C, 61.1% in the humidity range of 61-90%, 89.6% at wind speeds of 0.3-5.4 m/s, and 77.3% within the period from sunrise to sunset. When these four parameters were evaluated together, the most strongly associated parameter was daylight, followed by temperature, wind speed, and humidity.
Affiliation(s)
- Gonca Ece Özcan: Faculty of Forestry, Department of Forestry Engineering, Kastamonu University, 37150 Kastamonu, Turkey
- Hakan Şükrü Tabak: Institute of Science, Forest Engineering Program, Kastamonu University, 37150 Kastamonu, Turkey
13
Abeywardhana D, Dangalle C, Nugaliyadde A, Mallawarachchi Y. Deep learning approach to classify Tiger beetles of Sri Lanka. Ecol Inform 2021. [DOI: 10.1016/j.ecoinf.2021.101286]
14
Hong SJ, Nam I, Kim SY, Kim E, Lee CH, Ahn S, Park IK, Kim G. Automatic Pest Counting from Pheromone Trap Images Using Deep Learning Object Detectors for Matsucoccus thunbergianae Monitoring. Insects 2021; 12:342. [PMID: 33921492] [PMCID: PMC8068825] [DOI: 10.3390/insects12040342]
Abstract
The black pine bast scale, M. thunbergianae, is a major insect pest of black pine and causes serious environmental and economic losses in forests. It is therefore essential to monitor the occurrence and population of M. thunbergianae, and monitoring with pheromone traps is commonly employed. Because counting the insects in these pheromone traps by hand is labor-intensive and time-consuming, this study proposes automated deep learning counting algorithms that use pheromone trap images. Pheromone traps collected in the field were photographed in the laboratory, and the images were used for training, validation, and testing of the detection models. In addition, an image cropping method was applied for the successful detection of small objects, considering the small size of M. thunbergianae in trap images. Detection and counting performance were evaluated and compared for a total of 16 models under eight model conditions and two cropping conditions, and a counting accuracy of 95% or more was achieved by most models. This result shows that the artificial intelligence-based pest counting method proposed in this study is suitable for constant and accurate monitoring of insect pests.
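The study's cropping step is described only as splitting trap images into smaller crops so small insects remain detectable. As a rough illustration of the idea (the tile size and overlap below are assumed parameters, not values from the paper), a sliding-window tiler might look like:

```python
def tile_windows(width, height, tile=640, overlap=64):
    """Yield (x0, y0, x1, y1) crop windows covering a width x height image
    with the given overlap, so small insects near tile borders still
    appear whole in at least one crop."""
    step = tile - overlap
    xs = list(range(0, max(width - tile, 0) + 1, step))
    ys = list(range(0, max(height - tile, 0) + 1, step))
    # Add a final window flush with each edge if the stride fell short.
    if xs[-1] + tile < width:
        xs.append(width - tile)
    if ys[-1] + tile < height:
        ys.append(height - tile)
    for y in ys:
        for x in xs:
            yield (x, y, x + tile, y + tile)
```

Detections from the individual crops would then be mapped back to full-image coordinates and de-duplicated (e.g. by non-maximum suppression) before counting.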
Affiliation(s)
- Suk-Ju Hong: Department of Biosystems Engineering, College of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
- Il Nam: Department of Agriculture, Forestry and Bioresources, College of Agriculture and Life Sciences, Seoul National University, Seoul 08826, Korea
- Sang-Yeon Kim: Department of Biosystems Engineering, College of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
- Eungchan Kim: Department of Biosystems Engineering, and Global Smart Farm Convergence Major, College of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
- Chang-Hyup Lee: Department of Biosystems Engineering, and Global Smart Farm Convergence Major, College of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
- Sebeom Ahn: Department of Biosystems Engineering, College of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
- Il-Kwon Park: Department of Agriculture, Forestry and Bioresources, and Research Institute of Agriculture and Life Science, College of Agriculture and Life Sciences, Seoul National University, Seoul 08826, Korea
- Ghiseok Kim: Department of Biosystems Engineering, Global Smart Farm Convergence Major, and Research Institute of Agriculture and Life Science, College of Agriculture and Life Sciences, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Korea
15
Bemisia tabaci on Vegetables in the Southern United States: Incidence, Impact, and Management. Insects 2021; 12:198. [PMID: 33652635] [PMCID: PMC7996905] [DOI: 10.3390/insects12030198]
Abstract
Simple Summary: The sweetpotato whitefly, Bemisia tabaci, was first discovered in the United States in 1894 but was not then considered an economically important pest of agricultural crops across the southern and western states. After the introduction of B. tabaci Middle East-Asia Minor 1 (MEAM1) into the United States around 1985, the insect rapidly spread throughout the Southern United States to Texas, Arizona, and California, and extreme field outbreaks occurred on vegetable and other crops in those areas. The sweetpotato whitefly is now regarded as one of the most destructive insect pests in vegetable production systems in the Southern United States. The direct and indirect plant damage caused by B. tabaci has led to substantial economic losses in vegetable crops; outbreaks on vegetables in Georgia alone caused losses of 132.3 and 161.2 million US dollars (USD) in 2016 and 2017, respectively. Integrated pest management (IPM) tactics are therefore warranted, including cultural control by manipulation of production practices, resistant vegetable varieties, biological control using various natural enemies, and the judicious use of insecticides.
Abstract: Bemisia tabaci Gennadius (Hemiptera: Aleyrodidae) is among the most economically important insect pests of various vegetable crops in the Southern United States. This insect is considered a complex of at least 40 morphologically indistinguishable cryptic species. Bemisia tabaci Middle East-Asia Minor 1 (MEAM1) was introduced into the United States around 1985 and has since spread rapidly across the Southern United States to Texas, Arizona, and California, where extreme field outbreaks have occurred on vegetable and other crops. The pest causes extensive plant damage through direct feeding on vegetables, secreting honeydew, inducing plant physiological disorders, and vectoring plant viruses. The direct and indirect plant damage in vegetable crops has resulted in enormous economic losses in the Southern United States, especially in Florida, Georgia, and Texas. Effective management of B. tabaci on vegetables relies mainly on chemical insecticides, particularly neonicotinoids. However, B. tabaci has developed considerable resistance to most insecticides, so alternative integrated pest management (IPM) strategies are required, such as cultural control by manipulation of production practices, resistant vegetable varieties, and biological control using a suite of natural enemies.
16
Detecting and Classifying Pests in Crops Using Proximal Images and Machine Learning: A Review. AI 2020. [DOI: 10.3390/ai1020021]
Abstract
Pest management is among the most important activities on a farm. Monitoring all the different species visually may not be effective, especially on large properties. Accordingly, considerable research effort has been spent on developing effective ways to remotely monitor potential infestations. A growing number of solutions combine proximal digital images with machine learning techniques, but because the species and conditions associated with each study vary considerably, it is difficult to draw a realistic picture of the actual state of the art on the subject. In this context, the objectives of this article are (1) to briefly describe some of the most relevant investigations on the subject of automatic pest detection using proximal digital images and machine learning; (2) to provide a unified overview of the research carried out so far, with special emphasis on research gaps that still linger; and (3) to propose possible targets for future research.
17
A Study on CNN-Based Detection of Psyllids in Sticky Traps Using Multiple Image Data Sources. AI 2020. [DOI: 10.3390/ai1020013]
Abstract
Deep learning architectures like Convolutional Neural Networks (CNNs) are quickly becoming the standard for detecting and counting objects in digital images. However, most of the experiments found in the literature train and test the neural networks using data from a single image source, making it difficult to infer how the trained models would perform under a more diverse context. The objective of this study was to assess the robustness of models trained using data from a varying number of sources. Nine different devices were used to acquire images of yellow sticky traps containing psyllids and a wide variety of other objects, with each model being trained and tested using different data combinations. The results from the experiments were used to draw several conclusions about how the training process should be conducted and how the robustness of the trained models is influenced by data quantity and variety.
18
Reitz SR, Gao Y, Kirk WDJ, Hoddle MS, Leiss KA, Funderburk JE. Invasion Biology, Ecology, and Management of Western Flower Thrips. Annual Review of Entomology 2020; 65:17-37. [PMID: 31536711] [DOI: 10.1146/annurev-ento-011019-024947]
Abstract
Western flower thrips, Frankliniella occidentalis, first arose as an important invasive pest of many crops during the 1970s-1980s. The tremendous growth in international agricultural trade that developed then fostered the invasiveness of western flower thrips. We examine current knowledge regarding the biology of western flower thrips, with an emphasis on characteristics that contribute to its invasiveness and pest status. Efforts to control this pest and the tospoviruses that it vectors with intensive insecticide applications have been unsuccessful and have created significant problems because of the development of resistance to numerous insecticides and associated outbreaks of secondary pests. We synthesize information on effective integrated management approaches for western flower thrips that have developed through research on its biology, behavior, and ecology. We further highlight emerging topics regarding the species status of western flower thrips, as well as its genetics, biology, and ecology that facilitate its use as a model study organism and will guide development of appropriate management practices.
Affiliation(s)
- Stuart R Reitz: Department of Crop and Soil Science, Oregon State University, Ontario, Oregon 97914, USA
- Yulin Gao: State Key Laboratory for Biology of Plant Disease and Insect Pests, Institute of Plant Protection, Chinese Academy of Agricultural Sciences, Beijing 100193, PR China
- William D J Kirk: Centre for Applied Entomology and Parasitology, School of Life Sciences, Keele University, Newcastle Under Lyme, Staffordshire ST5 5BG, United Kingdom
- Mark S Hoddle: Department of Entomology, University of California, Riverside, California 92521
- Kirsten A Leiss: Horticulture, Wageningen University and Research, 2665 ZG Bleiswijk, The Netherlands
- Joe E Funderburk: North Florida Research and Education Center, University of Florida, Quincy, Florida 32351, USA
19

20
Automatic Segmentation and Counting of Aphid Nymphs on Leaves Using Convolutional Neural Networks. Agronomy (Basel) 2018. [DOI: 10.3390/agronomy8080129]
Abstract
The presence of pests is one of the main problems in crop production, and obtaining reliable statistics on pest infestation is essential for pest management. Detection of pests should be automated because human monitoring of pests is time-consuming and error-prone. Aphids are among the most destructive pests in greenhouses, and they reproduce quickly. Automatic detection of aphid nymphs on leaves (especially on the lower surface) using image analysis is a challenging problem due to color similarity and the complicated background. In this study, we propose a method for segmentation and counting of aphid nymphs on leaves using convolutional neural networks. Digital images of pakchoi leaves at different aphid infestation stages were obtained, and the corresponding pixel-level binary masks were annotated. In the test, segmentation results by the proposed method achieved high overlap with annotations by human experts (Dice coefficient of 0.8207). Automatic counting based on the segmentation showed high precision (0.9563) and recall (0.9650). The correlation between aphid nymph counts by the proposed method and manual counts was high (R2 = 0.99). The proposed method is generic and can be applied to other species of pests.
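The Dice coefficient reported above is a standard overlap measure between a predicted mask and a ground-truth mask. A minimal reference computation, sketched here with masks represented as sets of pixel coordinates (a representation chosen for illustration, not taken from the paper), is:

```python
def dice_coefficient(pred_pixels, truth_pixels):
    """Dice = 2|A ∩ B| / (|A| + |B|) for two binary masks,
    each given as an iterable of foreground pixel coordinates."""
    a, b = set(pred_pixels), set(truth_pixels)
    if not a and not b:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * len(a & b) / (len(a) + len(b))
```

A Dice score of 0.8207, as reported, means the predicted and annotated aphid masks share roughly 82% of their combined foreground area.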
21
Zhong Y, Gao J, Lei Q, Zhou Y. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture. Sensors 2018; 18:1489. [PMID: 29747429] [PMCID: PMC5982143] [DOI: 10.3390/s18051489]
Abstract
Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects is labor-intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: first, a yellow sticky trap is installed in the surveillance area to trap flying insects, and a camera is set up to collect real-time images. Then, a detection and coarse-counting method based on You Only Look Once (YOLO) object detection and a classification and fine-counting method based on Support Vector Machines (SVM) using global features are designed. Finally, the insect counting and recognition system is implemented on a Raspberry Pi. Six species of flying insects, including bee, fly, mosquito, moth, chafer, and fruit fly, were selected to assess the effectiveness of the system. Compared with conventional methods, the test results show promising performance: the average counting accuracy is 92.50% and the average classification accuracy is 90.18% on the Raspberry Pi. The proposed system is easy to use and provides efficient and accurate recognition data; therefore, it can be used in intelligent agriculture applications.
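The two-stage idea described above (class-agnostic detection for a coarse count, then per-detection classification for a fine per-species count) can be sketched independently of any specific YOLO or SVM implementation. In this illustrative sketch, `detections` and `classify` are hypothetical stand-ins for the detector's output crops and the trained classifier:

```python
from collections import Counter

def count_insects(detections, classify):
    """Coarse count = number of detected insects (detector output);
    fine count = per-species tally after classifying each detection."""
    coarse = len(detections)
    fine = Counter(classify(crop) for crop in detections)
    return coarse, fine
```

In the actual system, each element of `detections` would be an image crop taken from a YOLO bounding box, and `classify` would run the SVM on global features extracted from that crop.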
Affiliation(s)
- Yuanhong Zhong: College of Communication Engineering, Chongqing University, Chongqing 400044, China
- Junyuan Gao: College of Communication Engineering, Chongqing University, Chongqing 400044, China
- Qilun Lei: College of Communication Engineering, Chongqing University, Chongqing 400044, China
- Yao Zhou: College of Communication Engineering, Chongqing University, Chongqing 400044, China
22
Przybyłowicz Ł, Pniak M, Tofilski A. Semiautomated Identification of European Corn Borer (Lepidoptera: Crambidae). Journal of Economic Entomology 2016; 109:195-199. [PMID: 26487742] [DOI: 10.1093/jee/tov300]
Abstract
The European corn borer Ostrinia nubilalis (Hübner, 1796) is a serious and widely studied pest of corn. The most common method of its control is by means of insecticides; however, biological control is becoming more and more popular. The hymenopteran parasitoid Trichogramma sp. is the most promising and effective of the biological agents and is now widely used in North America and Europe. Its application should occur when the European corn borer is at the beginning of its egg-laying period. However, discriminating between the European corn borer and other species occurring in agricultural landscapes at the same time can be difficult, especially for farmers who are familiar with neither morphological nor molecular methods of identification. The scope of this study is to test the ability of automatic computer equipment to identify the European corn borer and to separate it from the most common lepidopteran pests found in corn plantations. The experiment showed that 97.0% of the 247 specimens belonging to four common pest lepidopterans were correctly classified using a personal computer, a desktop scanner, and dedicated software. The results show that this technique, based on wing measurements, can be an effective tool for monitoring the European corn borer. In the future, this method could be used by farmers to identify this pest and apply control measures at the optimal time.
23