1
Varga-Szilay Z, Szövényi G, Pozsgai G. Flower Visitation through the Lens: Exploring the Foraging Behaviour of Bombus terrestris with a Computer Vision-Based Application. Insects 2024; 15:729. [PMID: 39336697] [PMCID: PMC11432343] [DOI: 10.3390/insects15090729]
Abstract
To understand the processes behind pollinator declines and to conserve pollination services, we need to understand the fundamental drivers influencing pollinator behaviour. Here, we aimed to elucidate how wild bumblebees interact with three plant species and investigated their foraging behaviour under varying flower densities. We video-recorded Bombus terrestris in 60 × 60 cm quadrats of Lotus creticus, Persicaria capitata, and Trifolium pratense in urban areas of Terceira (Azores, Portugal). For automated bumblebee detection and counting, we created deep learning-based computer vision models with custom datasets. We achieved high model accuracies of 0.88 for Lotus and Persicaria and 0.95 for Trifolium, indicating reliable bumblebee detection. In our study, flower cover was the only factor that influenced the attractiveness of flower patches for flower-visiting bumblebees, with a significant positive effect; plant species had no effect. The time spent per unit of inflorescence surface area was longer on Trifolium than on Lotus and Persicaria. However, our results did not indicate significant differences in the time bumblebees spent on inflorescences among the three plant species. Here, we also show that computer vision-based analysis is a reliable tool for studying pollinator behavioural ecology.
Affiliation(s)
- Zsófia Varga-Szilay: Doctoral School of Biology, Institute of Biology, ELTE Eötvös Loránd University, 1117 Budapest, Hungary
- Gergely Szövényi: Department of Systematic Zoology and Ecology, ELTE Eötvös Loránd University, 1117 Budapest, Hungary
- Gábor Pozsgai: Ce3C-Centre for Ecology, Evolution and Environmental Changes, Azorean Biodiversity Group, CHANGE–Global Change and Sustainability Institute, University of the Azores, 9700-042 Angra do Heroísmo, Portugal
2
Roy DB, Alison J, August TA, Bélisle M, Bjerge K, Bowden JJ, Bunsen MJ, Cunha F, Geissmann Q, Goldmann K, Gomez-Segura A, Jain A, Huijbers C, Larrivée M, Lawson JL, Mann HM, Mazerolle MJ, McFarland KP, Pasi L, Peters S, Pinoy N, Rolnick D, Skinner GL, Strickson OT, Svenning A, Teagle S, Høye TT. Towards a standardized framework for AI-assisted, image-based monitoring of nocturnal insects. Philos Trans R Soc Lond B Biol Sci 2024; 379:20230108. [PMID: 38705190] [PMCID: PMC11070254] [DOI: 10.1098/rstb.2023.0108]
Abstract
Automated sensors have the potential to standardize and expand the monitoring of insects across the globe. Image-based monitoring is among the most scalable and fastest-developing sensor technologies, and we describe a framework for automated, image-based monitoring of nocturnal insects, from sensor development and field deployment to workflows for data processing and publishing. Sensors comprise a light to attract insects, a camera for collecting images, and a computer for scheduling, data storage, and processing. Metadata are important for describing sampling schedules that balance the capture of relevant ecological information against power and data storage limitations. The large volumes of images produced by automated systems necessitate scalable and effective data processing. We describe computer vision approaches for the detection, tracking, and classification of insects, including models built from existing aggregations of labelled insect images. Data from automated camera systems also necessitate approaches that account for inherent biases. We advocate models that explicitly correct for bias in species occurrence or abundance estimates resulting from the imperfect detection of species or individuals present during sampling occasions. We propose ten priorities towards a step change in automated monitoring of nocturnal insects, a vital task in the face of rapid biodiversity loss from global threats. This article is part of the theme issue 'Towards a toolkit for global insect biodiversity monitoring'.
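The imperfect-detection correction advocated here can be illustrated with a minimal single-season occupancy sketch (our illustration with made-up detection histories, not the authors' code): occupancy probability and detection probability are estimated jointly by maximum likelihood, so that all-zero sites are not simply treated as absences.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical detection histories: rows = sites, columns = repeat visits.
# 1 = species detected in the images on that occasion, 0 = not detected.
Y = np.array([
    [1, 0, 1, 0],
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
])

def neg_log_lik(params, Y):
    """Negative log-likelihood of a basic single-season occupancy model."""
    psi, p = 1.0 / (1.0 + np.exp(-np.asarray(params)))  # logit -> probability
    det = Y.sum(axis=1)                                 # detections per site
    n = Y.shape[1]                                      # visits per site
    # A site with detections is occupied; an all-zero site is either
    # occupied but always missed, or truly unoccupied.
    lik = np.where(
        det > 0,
        psi * p**det * (1 - p)**(n - det),
        psi * (1 - p)**n + (1 - psi),
    )
    return -np.log(lik).sum()

res = minimize(neg_log_lik, x0=[0.0, 0.0], args=(Y,), method="Nelder-Mead")
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-res.x))
print(f"naive occupancy:     {np.mean(Y.sum(axis=1) > 0):.2f}")
print(f"corrected occupancy: {psi_hat:.2f}, detection probability: {p_hat:.2f}")
```

With imperfect detection (p well below 1), the corrected occupancy estimate exceeds the naive fraction of sites with at least one detection, which is the kind of bias correction the abstract calls for.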
Affiliation(s)
- D. B. Roy: UK Centre for Ecology & Hydrology, Maclean Building, Benson Lane, Wallingford OX10 8BB, UK; Centre for Ecology and Conservation, University of Exeter, Penryn TR10 9EZ, UK
- J. Alison: Department of Ecoscience and Arctic Research Centre, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
- T. A. August: UK Centre for Ecology & Hydrology, Maclean Building, Benson Lane, Wallingford OX10 8BB, UK
- M. Bélisle: Centre d'étude de la forêt (CEF) et Département de biologie, Université de Sherbrooke, 2500 Boulevard de l'Université, Sherbrooke, Québec, Canada J1K 2R1
- K. Bjerge: Department of Electrical and Computer Engineering, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
- J. J. Bowden: Natural Resources Canada, Canadian Forest Service – Atlantic Forestry Centre, 26 University Drive, PO Box 960, Corner Brook, Newfoundland, Canada A2H 6J3
- M. J. Bunsen: Mila – Québec AI Institute, Montréal, Québec, Canada H3A 0E9
- F. Cunha: Mila – Québec AI Institute, Montréal, Québec, Canada H3A 0E9; Federal University of Amazonas, Manaus, 69080-900, Brazil
- Q. Geissmann: Center for Quantitative Genetics and Genomics, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
- K. Goldmann: The Alan Turing Institute, 96 Euston Road, London NW1 2DB, UK
- A. Gomez-Segura: UK Centre for Ecology & Hydrology, Maclean Building, Benson Lane, Wallingford OX10 8BB, UK
- A. Jain: Mila – Québec AI Institute, Montréal, Québec, Canada H3A 0E9
- C. Huijbers: Naturalis Biodiversity Centre, Darwinweg 2, 2333 CR Leiden, The Netherlands
- M. Larrivée: Insectarium de Montréal, 4581 Rue Sherbrooke E, Montréal, Québec, Canada H1X 2B2
- J. L. Lawson: UK Centre for Ecology & Hydrology, Maclean Building, Benson Lane, Wallingford OX10 8BB, UK
- H. M. Mann: Department of Ecoscience and Arctic Research Centre, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
- M. J. Mazerolle: Centre d'étude de la forêt, Département des sciences du bois et de la forêt, Faculté de foresterie, de géographie et de géomatique, Université Laval, Québec, Canada G1V 0A6
- K. P. McFarland: Vermont Centre for Ecostudies, 20 Palmer Court, White River Junction, VT 05001, USA
- L. Pasi: Mila – Québec AI Institute, Montréal, Québec, Canada H3A 0E9; École Polytechnique Fédérale de Lausanne, Station 21, 1015 Lausanne, Switzerland
- S. Peters: Faunabit, Strijkviertel 26 achter, 3454 PM De Meern, The Netherlands
- N. Pinoy: Department of Ecoscience and Arctic Research Centre, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
- D. Rolnick: Mila – Québec AI Institute, Montréal, Québec, Canada H3A 0E9; School of Computer Science, McGill University, Montreal, Canada H3A 0E9
- G. L. Skinner: UK Centre for Ecology & Hydrology, Maclean Building, Benson Lane, Wallingford OX10 8BB, UK
- O. T. Strickson: The Alan Turing Institute, 96 Euston Road, London NW1 2DB, UK
- A. Svenning: Department of Ecoscience and Arctic Research Centre, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
- S. Teagle: UK Centre for Ecology & Hydrology, Maclean Building, Benson Lane, Wallingford OX10 8BB, UK
- T. T. Høye: Department of Ecoscience and Arctic Research Centre, Aarhus University, C. F. Møllers Allé 3, Aarhus, Denmark
3
van Klink R, Sheard JK, Høye TT, Roslin T, Do Nascimento LA, Bauer S. Towards a toolkit for global insect biodiversity monitoring. Philos Trans R Soc Lond B Biol Sci 2024; 379:20230101. [PMID: 38705179] [PMCID: PMC11070268] [DOI: 10.1098/rstb.2023.0101]
Abstract
Insects are the most diverse group of animals on Earth, yet our knowledge of their diversity, ecology, and population trends remains abysmally poor. Four major technological approaches are coming to fruition for use in insect monitoring and ecological research: molecular methods, computer vision, autonomous acoustic monitoring, and radar-based remote sensing, each of which has seen major advances in recent years. Together, they have the potential to revolutionize insect ecology and to make all-taxa, fine-grained insect monitoring feasible across the globe. So far, advances within and among technologies have largely taken place in isolation, and parallel efforts among projects have led to redundancy and methodological sprawl; yet, given the commonalities in their goals and approaches, increased collaboration among projects and integration across technologies could provide unprecedented improvements in taxonomic and spatio-temporal resolution and coverage. This theme issue showcases recent developments and state-of-the-art applications of these technologies, and outlines the way forward regarding data processing, cost-effectiveness, meaningful trend analysis, technological integration, and open data requirements. Together, these papers set the stage for the future of automated insect monitoring. This article is part of the theme issue 'Towards a toolkit for global insect biodiversity monitoring'.
Affiliation(s)
- Roel van Klink: German Centre for Integrative Biodiversity Research Halle-Jena-Leipzig, Puschstrasse 4, Leipzig 04103, Germany; Department of Computer Science, Martin-Luther-University Halle-Wittenberg, Von-Seckendorff-Platz 1, 06120 Halle, Germany
- Julie Koch Sheard: German Centre for Integrative Biodiversity Research Halle-Jena-Leipzig, Puschstrasse 4, Leipzig 04103, Germany; Department of Ecosystem Services, Helmholtz-Centre for Environmental Research - UFZ, Permoserstr. 15, Leipzig 04318, Germany; Friedrich Schiller University Jena, Institute of Biodiversity, Dornburger Straße 159, Jena 07743, Germany; Department of Biology, Animal Ecology, University of Marburg, Karl-von-Frisch-Straße 8, Marburg 35043, Germany
- Toke T. Høye: Department of Ecoscience, Aarhus University, C. F. Møllers Allé 8, Aarhus C 8000, Denmark; Arctic Research Centre, Aarhus University, Ole Worms Allé 1, Aarhus C 8000, Denmark
- Tomas Roslin: Department of Ecology, Swedish University of Agricultural Sciences (SLU), Ulls väg 18B, Uppsala 75651, Sweden; Organismal and Evolutionary Biology Research Programme, Faculty of Biological and Environmental Sciences, FI-00014 University of Helsinki, Helsinki, Finland
- Leandro A. Do Nascimento: Science Department, biometrio.earth, Dr.-Schoenemann-Str. 38, Saarbrücken 66123, Germany
- Silke Bauer: Swiss Federal Research Institute WSL, Zürcherstrasse 111, Birmensdorf CH-8903, Switzerland; Swiss Ornithological Institute, Seerose 1, Sempach 6204, Switzerland; Institute for Biodiversity and Ecosystem Dynamics, Science Park 904, Amsterdam 1098 XH, The Netherlands; Department of Environmental Systems Science, ETH Zürich, Universitätstrasse 16, Zürich 8092, Switzerland
4
Brydegaard M, Pedales RD, Feng V, Yamoa ASD, Kouakou B, Månefjord H, Wührl L, Pylatiuk C, Amorim DDS, Meier R. Towards global insect biomonitoring with frugal methods. Philos Trans R Soc Lond B Biol Sci 2024; 379:20230103. [PMID: 38705174] [PMCID: PMC11070255] [DOI: 10.1098/rstb.2023.0103]
Abstract
None of the global targets for protecting nature are currently met, although humanity is critically dependent on biodiversity. A significant issue is the lack of data for the most biodiverse regions of the planet, where frugal biomonitoring methods would be particularly important because the available funding for monitoring is insufficient, especially in low-income countries. Here we discuss how three approaches to insect biomonitoring (computer vision, lidar, DNA sequences) could be made more frugal, and we urge that all biomonitoring techniques be evaluated for global suitability before becoming the default in high-income countries. This requires that techniques popular in high-income countries undergo a phase of 'innovation through simplification' before they are implemented more broadly. We predict that techniques that acquire raw data at low cost and are suitable for analysis with AI (e.g. images, lidar signals) will be particularly suitable for global biomonitoring, while techniques that rely heavily on patented technologies (e.g. DNA sequences) may be less promising. We conclude the opinion piece by pointing out that the widespread use of AI for data analysis will require a global strategy for providing the necessary computational resources and training. This article is part of the theme issue 'Towards a toolkit for global insect biodiversity monitoring'.
Affiliation(s)
- Mikkel Brydegaard: Dept. Physics, Lund University, Sölvegatan 14c, 22362 Lund, Sweden; Dept. Biology, Lund University, Sölvegatan 35, 22362 Lund, Sweden; Norsk Elektro Optikk, Østensjøveien 34, 0667 Oslo, Norway; FaunaPhotonics, Støberigade 14, 2450 København, Denmark
- Ronniel D. Pedales: Institute of Biology, University of the Philippines Diliman, Quezon City 1101, Philippines; Center for Integrative Biodiversity Discovery, Museum für Naturkunde, Leibniz Institute for Evolution and Biodiversity Science, Invalidenstraße 43, 10115 Berlin, Germany; Institute of Biology, Humboldt University, 10115 Berlin, Germany
- Vivian Feng: Center for Integrative Biodiversity Discovery, Museum für Naturkunde, Leibniz Institute for Evolution and Biodiversity Science, Invalidenstraße 43, 10115 Berlin, Germany; Institute of Biology, Humboldt University, 10115 Berlin, Germany
- Assoumou Saint-Doria Yamoa: Instrumentation, Imaging and Spectroscopy Laboratory, Felix Houphouet-Boigny Institute, BP1093 Yamoussoukro, Ivory Coast
- Benoit Kouakou: Instrumentation, Imaging and Spectroscopy Laboratory, Felix Houphouet-Boigny Institute, BP1093 Yamoussoukro, Ivory Coast
- Hampus Månefjord: Dept. Physics, Lund University, Sölvegatan 14c, 22362 Lund, Sweden
- Lorenz Wührl: Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology, 76344 Eggenstein-Leopoldshafen, Germany
- Christian Pylatiuk: Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology, 76344 Eggenstein-Leopoldshafen, Germany
- Dalton de Souza Amorim: Departamento de Biologia, FFCLRP, Universidade de São Paulo, Ribeirão Preto 14040-901, Brazil
- Rudolf Meier: Center for Integrative Biodiversity Discovery, Museum für Naturkunde, Leibniz Institute for Evolution and Biodiversity Science, Invalidenstraße 43, 10115 Berlin, Germany; Institute of Biology, Humboldt University, 10115 Berlin, Germany
5
Sittinger M, Uhler J, Pink M, Herz A. Insect detect: An open-source DIY camera trap for automated insect monitoring. PLoS One 2024; 19:e0295474. [PMID: 38568922] [PMCID: PMC10990185] [DOI: 10.1371/journal.pone.0295474]
Abstract
Insect monitoring is essential for designing effective conservation strategies, which are indispensable to mitigate worldwide declines and biodiversity loss. Traditional monitoring methods are widely established for this purpose and can provide data with a high taxonomic resolution. However, processing the captured insect samples is often time-consuming and expensive, which limits the number of potential replicates. Automated monitoring methods can facilitate data collection at a higher spatiotemporal resolution with comparatively lower effort and cost. Here, we present the Insect Detect DIY (do-it-yourself) camera trap for non-invasive automated monitoring of flower-visiting insects, based on low-cost off-the-shelf hardware components combined with open-source software. Custom-trained deep learning models detect and track insects landing on an artificial flower platform in real time on-device, and the cropped detections are subsequently classified on a local computer. Field deployment of the solar-powered camera trap confirmed its resistance to high temperatures and humidity, which enables autonomous deployment throughout a whole season. On-device detection and tracking can estimate insect activity/abundance after metadata post-processing. Our insect classification model achieved a high top-1 accuracy on the test dataset and generalized well to a real-world dataset of captured insect images. The camera trap design and open-source software are highly customizable and can be adapted to different use cases. With custom-trained detection and classification models, as well as accessible software programming, many applications beyond our proposed deployment method can be realized.
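The two-stage flow described in this abstract (real-time on-device detection with cropping, followed by offline classification of the saved crops) can be sketched roughly as below. The `detector` and `classifier` callables and their return formats are hypothetical placeholders for custom-trained models, not the Insect Detect API.

```python
from pathlib import Path
import cv2  # OpenCV, used here only for image I/O

def detect_and_crop(frame, detector, out_dir: Path, frame_id: int):
    """Stage 1 (on-device): save one cropped image per confident detection.

    `detector(frame)` is assumed to yield (x, y, w, h, score) tuples.
    """
    for i, (x, y, w, h, score) in enumerate(detector(frame)):
        if score < 0.5:          # arbitrary confidence threshold
            continue
        crop = frame[y:y + h, x:x + w]
        cv2.imwrite(str(out_dir / f"{frame_id:06d}_{i}.jpg"), crop)

def classify_crops(crop_dir: Path, classifier):
    """Stage 2 (local computer): label each saved crop.

    `classifier(image)` is assumed to return a (label, probability) pair.
    """
    for path in sorted(crop_dir.glob("*.jpg")):
        image = cv2.imread(str(path))
        label, prob = classifier(image)
        print(path.name, label, round(prob, 3))
```

Splitting the work this way keeps the on-device load small (detection only), while the heavier classification model runs later where compute is cheap.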
Affiliation(s)
- Maximilian Sittinger: Julius Kühn Institute (JKI)—Federal Research Centre for Cultivated Plants, Institute for Biological Control, Dossenheim, Germany
- Johannes Uhler: Julius Kühn Institute (JKI)—Federal Research Centre for Cultivated Plants, Institute for Biological Control, Dossenheim, Germany
- Maximilian Pink: Julius Kühn Institute (JKI)—Federal Research Centre for Cultivated Plants, Institute for Biological Control, Dossenheim, Germany
- Annette Herz: Julius Kühn Institute (JKI)—Federal Research Centre for Cultivated Plants, Institute for Biological Control, Dossenheim, Germany
6
Lövei GL, Ferrante M. The Use and Prospects of Nonlethal Methods in Entomology. Annu Rev Entomol 2024; 69:183-198. [PMID: 37669564] [DOI: 10.1146/annurev-ento-120220-024402]
Abstract
Arthropods are declining globally, and entomologists ought to be at the forefront of protecting them. However, entomological study methods are typically lethal, and we argue that this makes the ethical status of the profession precarious. Lethal methods are used in most studies, even those that aim to support arthropod conservation. Additionally, almost all collecting methods result in bycatch, and a first step toward less destructive research practices is to minimize bycatch and/or ensure its proper storage and use. In this review, we describe the available suite of nonlethal methods with the aim of promoting their use. We classify nonlethal methods into (a) reuse of already collected material, (b) methods that are damaging but not lethal, (c) methods that modify behavior, and (d) true nonlethal methods. Artificial intelligence and miniaturization will help extend the nonlethal methodological toolkit, but the need for further method development and testing remains.
Affiliation(s)
- Gábor L Lövei: Department of Agroecology, Flakkebjerg Research Centre, Aarhus University, Slagelse, Denmark; Hungarian Research Network Anthropocene Ecology Research Group, Debrecen University, Debrecen, Hungary
- Marco Ferrante: Functional Agrobiodiversity, Department of Crop Sciences, University of Göttingen, Germany
7
Zeuss D, Bald L, Gottwald J, Becker M, Bellafkir H, Bendix J, Bengel P, Beumer LT, Brandl R, Brändle M, Dahlke S, Farwig N, Freisleben B, Friess N, Heidrich L, Heuer S, Höchst J, Holzmann H, Lampe P, Leberecht M, Lindner K, Masello JF, Mielke Möglich J, Mühling M, Müller T, Noskov A, Opgenoorth L, Peter C, Quillfeldt P, Rösner S, Royauté R, Mestre-Runge C, Schabo D, Schneider D, Seeger B, Shayle E, Steinmetz R, Tafo P, Vogelbacher M, Wöllauer S, Younis S, Zobel J, Nauss T. Nature 4.0: A networked sensor system for integrated biodiversity monitoring. Glob Chang Biol 2024; 30:e17056. [PMID: 38273542] [DOI: 10.1111/gcb.17056]
Abstract
Ecosystem functions and services are severely threatened by the unprecedented global loss of biodiversity. To counteract these trends, it is essential to develop systems that monitor changes in biodiversity for planning, evaluating, and implementing conservation and mitigation actions. However, the implementation of monitoring systems suffers from a trade-off between grain (i.e., the level of detail), extent (i.e., the number of study sites), and temporal repetition. Here, we present an applied and realized networked sensor system for integrated biodiversity monitoring in the Nature 4.0 project as a solution to these challenges; it considers plants and animals not only as targets of investigation but also, as carriers of sensors, as parts of the modular sensor network. Our networked sensor system consists of three closely interlinked main components with a modular structure: sensors, data transmission, and data storage, which are integrated into pipelines for automated biodiversity monitoring. We present real-world examples of applications, share our experiences in operating them, and provide our collected open data. Our flexible, low-cost, and open-source solutions can be applied for monitoring individual and multiple terrestrial plants and animals as well as their interactions. Ultimately, our system can also be applied to area-wide ecosystem mapping tasks, thereby providing an exemplary cost-efficient and powerful solution for biodiversity monitoring. Building upon our experiences in the Nature 4.0 project, we identified ten key challenges that need to be addressed to better understand and counteract the ongoing loss of biodiversity using networked sensor systems. To tackle these challenges, interdisciplinary collaboration, additional research, and practical solutions are necessary to enhance the capability and applicability of networked sensor systems for researchers and practitioners, ultimately helping to ensure the sustainable management of ecosystems and the provision of ecosystem services.
Affiliation(s)
- Dirk Zeuss: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Lisa Bald: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Jannis Gottwald: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Marcel Becker: Department of Biology, Conservation Ecology, Philipps-Universität Marburg, Marburg, Germany
- Hicham Bellafkir: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Jörg Bendix: Department of Geography, Climatology and Environmental Modelling, Philipps-Universität Marburg, Marburg, Germany
- Phillip Bengel: Department of Geography, Didactics and Education, Philipps-Universität Marburg, Marburg, Germany
- Larissa T Beumer: Senckenberg Biodiversity and Climate Research Centre (SBiK-F), Frankfurt am Main, Germany
- Roland Brandl: Department of Biology, Animal Ecology, Philipps-Universität Marburg, Marburg, Germany
- Martin Brändle: Department of Biology, Animal Ecology, Philipps-Universität Marburg, Marburg, Germany
- Stephan Dahlke: Department of Mathematics and Computer Science, Numerics, Philipps-Universität Marburg, Marburg, Germany
- Nina Farwig: Department of Biology, Conservation Ecology, Philipps-Universität Marburg, Marburg, Germany
- Bernd Freisleben: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Nicolas Friess: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Lea Heidrich: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Sven Heuer: Department of Mathematics and Computer Science, Numerics, Philipps-Universität Marburg, Marburg, Germany
- Jonas Höchst: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Hajo Holzmann: Department of Mathematics and Computer Science, Stochastics, Philipps-Universität Marburg, Marburg, Germany
- Patrick Lampe: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Martin Leberecht: Department of Biology, Plant Ecology and Geobotany, Philipps-Universität Marburg, Marburg, Germany
- Kim Lindner: Department of Biology, Conservation Ecology, Philipps-Universität Marburg, Marburg, Germany
- Juan F Masello: Department of Animal Ecology & Systematics, Justus Liebig University Gießen, Gießen, Germany
- Jonas Mielke Möglich: Department of Biology, Animal Ecology, Philipps-Universität Marburg, Marburg, Germany
- Markus Mühling: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Thomas Müller: Senckenberg Biodiversity and Climate Research Centre (SBiK-F), Frankfurt am Main, Germany; Department of Biological Sciences, Goethe University Frankfurt am Main, Frankfurt am Main, Germany
- Alexey Noskov: Department of Geography, Climatology and Environmental Modelling, Philipps-Universität Marburg, Marburg, Germany
- Lars Opgenoorth: Department of Biology, Plant Ecology and Geobotany, Philipps-Universität Marburg, Marburg, Germany
- Carina Peter: Department of Geography, Didactics and Education, Philipps-Universität Marburg, Marburg, Germany
- Petra Quillfeldt: Department of Animal Ecology & Systematics, Justus Liebig University Gießen, Gießen, Germany
- Sascha Rösner: Department of Biology, Conservation Ecology, Philipps-Universität Marburg, Marburg, Germany
- Raphaël Royauté: Senckenberg Biodiversity and Climate Research Centre (SBiK-F), Frankfurt am Main, Germany; Université Paris-Saclay, INRAE, AgroParisTech, UMR EcoSys, Palaiseau, France
- Christian Mestre-Runge: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany; Department of Biology, Plant Ecology and Geobotany, Philipps-Universität Marburg, Marburg, Germany
- Dana Schabo: Department of Biology, Conservation Ecology, Philipps-Universität Marburg, Marburg, Germany
- Daniel Schneider: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Bernhard Seeger: Department of Mathematics and Computer Science, Database Systems, Philipps-Universität Marburg, Marburg, Germany
- Elliot Shayle: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Ralf Steinmetz: Department of Electrical Engineering and Information Technology, Multimedia Communications Lab (KOM), Technical University of Darmstadt, Darmstadt, Germany
- Pavel Tafo: Department of Mathematics and Computer Science, Stochastics, Philipps-Universität Marburg, Marburg, Germany
- Markus Vogelbacher: Department of Mathematics and Computer Science, Distributed Systems and Intelligent Computing, Philipps-Universität Marburg, Marburg, Germany
- Stephan Wöllauer: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
- Sohaib Younis: Department of Mathematics and Computer Science, Database Systems, Philipps-Universität Marburg, Marburg, Germany
- Julian Zobel: Department of Electrical Engineering and Information Technology, Multimedia Communications Lab (KOM), Technical University of Darmstadt, Darmstadt, Germany
- Thomas Nauss: Department of Geography, Environmental Informatics, Philipps-Universität Marburg, Marburg, Germany
8
Stiemer LN, Thoma A, Braun C. MBT3D: Deep learning based multi-object tracker for bumblebee 3D flight path estimation. PLoS One 2023; 18:e0291415. [PMID: 37738269] [PMCID: PMC10516433] [DOI: 10.1371/journal.pone.0291415]
Abstract
This work presents the Multi-Bees-Tracker (MBT3D) algorithm, a Python framework implementing a deep association tracker for tracking-by-detection, to address the challenging task of tracking the flight paths of bumblebees in a social group. While tracking algorithms for bumblebees exist, they often come with severe restrictions, such as the need for sufficient lighting, high contrast between the animal and the background, absence of occlusion, or significant user input. Tracking bumblebee flight paths in a social group is difficult: the animals suddenly adjust their movements and change their appearance during different wing-beat states, while individuals look highly similar to one another. The MBT3D tracker developed in this research adapts an existing ant-tracking algorithm to bumblebee tracking. It incorporates an offline-trained appearance descriptor along with a Kalman filter for appearance and motion matching. Different detector architectures for the upstream detections (You Only Look Once (YOLOv5), Faster Region Proposal Convolutional Neural Network (Faster R-CNN), and RetinaNet) are investigated in a comparative study to optimize performance. The detection models were trained on a dataset containing 11,359 labeled bumblebee images. YOLOv5 reaches an average precision of AP = 53.8%, Faster R-CNN achieves AP = 45.3%, and RetinaNet AP = 38.4% on the bumblebee validation dataset, which consists of 1,323 labeled bumblebee images. The tracker's appearance model is trained on 144 samples. The tracker (with Faster R-CNN detections) reaches a Multiple Object Tracking Accuracy of MOTA = 93.5% and a Multiple Object Tracking Precision of MOTP = 75.6% on a validation dataset containing 2,000 images, competing with state-of-the-art computer vision methods. The framework allows reliable tracking of different bumblebees in the same video stream with rarely occurring identity switches (IDS). MBT3D has a much lower IDS count than other commonly used algorithms and one of the lowest false positive rates, competing with state-of-the-art animal tracking algorithms. The developed framework reconstructs the 3-dimensional (3D) flight paths of the bumblebees by triangulation, and it can also handle and compare two alternative stereo camera pairs if desired.
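The core matching step of such a tracker, combining a motion cue from Kalman-predicted boxes with an appearance cue from an offline-trained embedding and solving the assignment with the Hungarian algorithm, can be sketched as follows. This is a generic tracking-by-detection illustration under our own assumptions (the cost weighting and threshold are arbitrary), not the MBT3D implementation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def associate(track_boxes, track_feats, det_boxes, det_feats,
              w=0.5, max_cost=0.7):
    """Match Kalman-predicted track boxes to new detections.

    The cost blends a motion term (1 - IoU) with an appearance term
    (cosine distance between embedding vectors).
    """
    cost = np.zeros((len(track_boxes), len(det_boxes)))
    for i, (tb, tf) in enumerate(zip(track_boxes, track_feats)):
        for j, (db, df) in enumerate(zip(det_boxes, det_feats)):
            motion = 1.0 - iou(tb, db)
            appear = 1.0 - np.dot(tf, df) / (
                np.linalg.norm(tf) * np.linalg.norm(df) + 1e-9)
            cost[i, j] = w * motion + (1 - w) * appear
    rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
    # Reject weak matches so unmatched detections can spawn new tracks.
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_cost]
```

The appearance term is what keeps identity switches rare when two visually similar bumblebees cross paths, since motion alone becomes ambiguous at that moment.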
Affiliation(s)
- Luc Nicolas Stiemer: Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
- Andreas Thoma: Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany; Department of Aerospace Engineering, RMIT University, Melbourne, Victoria, Australia
- Carsten Braun: Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
9
Panigrahi S, Maski P, Thondiyath A. Real-time biodiversity analysis using deep-learning algorithms on mobile robotic platforms. PeerJ Comput Sci 2023; 9:e1502. [PMID: 37705641] [PMCID: PMC10495972] [DOI: 10.7717/peerj-cs.1502]
Abstract
Ecological biodiversity is declining at an unprecedented rate. To combat such irreversible changes in natural ecosystems, biodiversity conservation initiatives are being conducted globally. However, the lack of a feasible methodology to quantify biodiversity in real time and to investigate population dynamics at spatiotemporal scales prevents the use of ecological data in environmental planning. Traditionally, ecological studies rely on a census of an animal population obtained by the 'capture, mark and recapture' technique, in which human field workers manually count, tag, and observe tagged individuals, making it time-consuming, expensive, and cumbersome to patrol the entire area. Recent research has also demonstrated the potential of inexpensive and accessible sensors for ecological data monitoring. However, stationary sensors collect localised data that are highly dependent on the placement of the setup. In this research, we propose a methodology for biodiversity monitoring utilising state-of-the-art deep learning (DL) methods operating in real time on sample payloads of mobile robots. The trained DL algorithms demonstrate a mean average precision (mAP) of 90.51% with an average inference time of 67.62 milliseconds after 6,000 training epochs. We claim that the use of such mobile platform setups to infer real-time ecological data can help us achieve quick and effective biodiversity surveys. An experimental test payload was fabricated, and online as well as offline field surveys were conducted, validating the proposed methodology for species identification, which can be further extended to geo-localisation of flora and fauna in any ecosystem.
Affiliation(s)
- Siddhant Panigrahi: Department of Engineering Design, Indian Institute of Technology Madras, Chennai, Tamil Nadu, India
- Prajwal Maski: Department of Engineering Design, Indian Institute of Technology Madras, Chennai, Tamil Nadu, India
- Asokan Thondiyath: Department of Engineering Design, Indian Institute of Technology Madras, Chennai, Tamil Nadu, India
10
Bjerge K, Frigaard CE, Karstoft H. Object Detection of Small Insects in Time-Lapse Camera Recordings. Sensors (Basel) 2023; 23:7242. [PMID: 37631778] [PMCID: PMC10459366] [DOI: 10.3390/s23167242]
Abstract
As pollinators, insects play a crucial role in ecosystem management and world food production. However, insect populations are declining, necessitating efficient insect monitoring methods. Existing methods analyze video or time-lapse images of insects in nature, but analysis is challenging because insects are small objects in complex and dynamic natural vegetation scenes. In this work, we provide a dataset of primarily honeybees visiting three different plant species during two months of the summer. The dataset consists of 107,387 annotated time-lapse images from multiple cameras, including 9,423 annotated insects. We present a two-step method for detecting insects in time-lapse RGB images. First, the images are preprocessed with a motion-informed enhancement technique that uses motion and color cues to make insects stand out. Second, the enhanced images are fed into a convolutional neural network (CNN) object detector. The method improves on the deep learning object detectors You Only Look Once (YOLO) and Faster region-based CNN (Faster R-CNN). Using motion-informed enhancement, the YOLO detector improves the average micro F1-score from 0.49 to 0.71, and the Faster R-CNN detector improves it from 0.32 to 0.56. Our dataset and proposed method provide a step forward in automating the time-lapse camera monitoring of flying insects.
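A rough sketch of the preprocessing idea follows: blend a blurred frame-difference map into the current frame so that moving insects stand out against static vegetation before the detector runs. This is our generic illustration with arbitrary parameters, not the paper's exact enhancement.

```python
import cv2
import numpy as np

def motion_informed_enhance(prev_bgr: np.ndarray,
                            curr_bgr: np.ndarray,
                            alpha: float = 0.5) -> np.ndarray:
    """Highlight moving insects before object detection.

    Computes the absolute difference between consecutive time-lapse
    frames and blends it back into the current frame.
    """
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)      # motion between frames
    diff = cv2.GaussianBlur(diff, (5, 5), 0)      # suppress pixel noise
    motion = cv2.cvtColor(diff, cv2.COLOR_GRAY2BGR)
    # Regions that moved are boosted; static vegetation stays near-original.
    return cv2.addWeighted(curr_bgr, 1.0, motion, alpha, 0)
```

The enhanced frame, not the raw one, is then passed to the YOLO or Faster R-CNN detector, which is what lifts the F1-scores reported above.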
Affiliation(s)
- Kim Bjerge: Department of Electrical and Computer Engineering, Aarhus University, 8200 Aarhus N, Denmark
11
Sun Y, Zhan W, Dong T, Guo Y, Liu H, Gui L, Zhang Z. Real-Time Recognition and Detection of Bactrocera minax (Diptera: Trypetidae) Grooming Behavior Using Body Region Localization and Improved C3D Network. Sensors (Basel) 2023; 23:6442. [PMID: 37514739] [PMCID: PMC10386511] [DOI: 10.3390/s23146442]
Abstract
Pest management has long been a critical aspect of crop protection, and insect behavior is an important indicator for assessing insect characteristics. Insect behavior research increasingly relies on quantifying behavior, and traditional manual observation and analysis can no longer meet the requirements in data volume and observation time. In this paper, we propose a method based on body region localization combined with an improved 3D convolutional neural network to recognize six grooming behaviors of Bactrocera minax: head grooming, foreleg grooming, fore-mid leg grooming, mid-hind leg grooming, hind leg grooming, and wing grooming. The overall recognition accuracy reached 93.46%. We compared the results obtained from the detection model with manual observations; the average difference was about 12%, showing that the model approaches the level of manual observation. Additionally, recognition with this method takes only one-third of the time required for manual observation, making it suitable for real-time detection. Experimental data demonstrate that the method effectively eliminates interference from the walking behavior of Bactrocera minax, enabling efficient and automated detection of grooming behavior and offering a convenient means of studying pest characteristics in crop protection.
Affiliation(s)
- Yong Sun: School of Computer Science, Yangtze University, Jingzhou 434023, China; Jingzhou Yingtuo Technology Co., Ltd., Jingzhou 434023, China
- Wei Zhan: School of Computer Science, Yangtze University, Jingzhou 434023, China
- Tianyu Dong: School of Computer Science, Yangtze University, Jingzhou 434023, China
- Yuheng Guo: School of Computer Science, Yangtze University, Jingzhou 434023, China
- Hu Liu: School of Computer Science, Yangtze University, Jingzhou 434023, China
- Lianyou Gui: College of Agriculture, Yangtze University, Jingzhou 434023, China
- Zhiliang Zhang: School of Computer Science, Yangtze University, Jingzhou 434023, China
12
Xue Y, Cai C, Chi Y. Frame Structure Fault Diagnosis Based on a High-Precision Convolution Neural Network. Sensors (Basel) 2022; 22:9427. [PMID: 36502133] [PMCID: PMC9738882] [DOI: 10.3390/s22239427]
Abstract
Structural health monitoring and fault diagnosis are important scientific issues in mechanical engineering, civil engineering, and other disciplines. The basic premise of structural health work is the ability to accurately diagnose faults in a structure. Accurate structural fault diagnosis therefore not only ensures the safe operation of mechanical equipment and the safe use of civil constructions, but also protects people's lives and property. To improve the accuracy of fault diagnosis for frame structures under noisy conditions, the existing Convolutional Neural Network with Training Interference (TICNN) model is improved, and a new convolutional neural network model with strong noise resistance is proposed. To verify the noise resistance of the improved TICNN, comparative experiments were carried out with TICNN, a One-Dimensional Convolutional Neural Network (1DCNN), and a Deep Convolutional Neural Network with Wide First-Layer Kernels (WDCNN). The experimental results show that the improved TICNN has the strongest noise resistance. Based on the improved TICNN, a fault diagnosis experiment on a four-story steel structure model was carried out. The results show that the improved TICNN obtains high diagnostic accuracy under strong noise conditions, verifying its advantages.
13
Spatial Monitoring and Insect Behavioural Analysis Using Computer Vision for Precision Pollination. Int J Comput Vis 2022. [DOI: 10.1007/s11263-022-01715-4]
14
Fetnassi N, Ude K, Kull A, Tammaru T. Weather Sensitivity of Sugar Bait Trapping of Nocturnal Moths: A Case Study from Northern Europe. Insects 2022; 13:1087. [PMID: 36554997] [PMCID: PMC9783685] [DOI: 10.3390/insects13121087]
Abstract
Assemblages of insects need to be quantitatively sampled in the context of various research questions. Light trapping is the most widely used method for sampling nocturnal Lepidoptera; attracting moths to sugar baits offers a viable alternative. However, the latter method is rarely used in professional research despite its popularity among amateur lepidopterists. As insect activity is strongly dependent on ambient conditions, the sensitivity of any trapping method to weather parameters needs to be known for the quantitative interpretation of trapping results. In the present paper, we report data on the weather dependence of moth catches obtained by automatic bait traps. The study was performed in Estonia, representing the European hemiboreal forest zone. Portable weather stations set up next to each of the traps were used for collecting weather data. Both the abundance and the diversity of moths in the catches depended strongly and positively on temperature and negatively on air humidity. Diversity was also negatively correlated with air pressure and positively with the change in pressure during the night. The results show that in situ recording of weather parameters in connection with insect trapping provides useful insights for the study of insect behaviour and the interpretation of the results of monitoring projects.
Affiliation(s)
- Nidal Fetnassi: Department of Zoology, Institute of Ecology and Earth Sciences, Faculty of Science and Technology, University of Tartu, 50409 Tartu, Estonia; Water, Biodiversity and Climate Change Laboratory, Faculty of Sciences Semlalia, Cadi Ayyad University, P.O. Box 2390, Marrakech 40000, Morocco
- Kadri Ude: Department of Zoology, Institute of Ecology and Earth Sciences, Faculty of Science and Technology, University of Tartu, 50409 Tartu, Estonia
- Ain Kull: Department of Geography, Institute of Ecology and Earth Sciences, Faculty of Science and Technology, University of Tartu, 50410 Tartu, Estonia
- Toomas Tammaru: Department of Zoology, Institute of Ecology and Earth Sciences, Faculty of Science and Technology, University of Tartu, 50409 Tartu, Estonia
15
Kerry RG, Montalbo FJP, Das R, Patra S, Mahapatra GP, Maurya GK, Nayak V, Jena AB, Ukhurebor KE, Jena RC, Gouda S, Majhi S, Rout JR. An overview of remote monitoring methods in biodiversity conservation. Environ Sci Pollut Res Int 2022; 29:80179-80221. [PMID: 36197618] [PMCID: PMC9534007] [DOI: 10.1007/s11356-022-23242-y]
Abstract
Conservation of biodiversity is critical for the coexistence of humans and the sustenance of other living organisms within the ecosystem. Identification and prioritization of specific regions to be conserved are impossible without proper information about the sites. Monitoring bodies such as the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) have reported that the total number of species now threatened with extinction is higher than ever before, and that species are progressing toward extinction at an alarming rate. Moreover, current global responses to these crises remain inadequate and demand drastic changes. Therefore, more sophisticated monitoring and conservation techniques are required that can cover larger areas within a stipulated time frame and gather large pools of data. Hence, this study is an overview of remote monitoring methods in biodiversity conservation via a survey of evidence-based reviews and related studies, highlighting the application of various technologies for biodiversity conservation and monitoring. Finally, the paper describes transformative smart technologies such as artificial intelligence (AI) and machine learning algorithms for enhancing the working efficiency of currently available techniques, which will aid remote monitoring methods in biodiversity conservation.
Affiliation(s)
- Rout George Kerry: Department of Biotechnology, Utkal University, Vani Vihar, Bhubaneswar, Odisha 751004, India
- Rajeswari Das: Department of Soil Science and Agricultural Chemistry, School of Agriculture, GIET University, Gunupur, Rayagada, Odisha 765022, India
- Sushmita Patra: Indian Council of Agricultural Research-Directorate of Foot and Mouth Disease-International Centre for Foot and Mouth Disease, Arugul, Bhubaneswar, Odisha 752050, India
- Ganesh Kumar Maurya: Zoology Section, Mahila MahaVidyalya, Banaras Hindu University, Varanasi 221005, India
- Vinayak Nayak: Indian Council of Agricultural Research-Directorate of Foot and Mouth Disease-International Centre for Foot and Mouth Disease, Arugul, Bhubaneswar, Odisha 752050, India
- Atala Bihari Jena: Department of Neurosurgery, Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
- Ram Chandra Jena: Department of Pharmaceutical Sciences, Utkal University, Vani Vihar, Bhubaneswar, Odisha 751004, India
- Sushanto Gouda: Department of Zoology, Mizoram University, Aizawl 796009, India
- Sanatan Majhi: Department of Biotechnology, Utkal University, Vani Vihar, Bhubaneswar, Odisha 751004, India
- Jyoti Ranjan Rout: School of Biological Sciences, AIPH University, Bhubaneswar, Odisha 752101, India
16
Mutanu L, Gohil J, Gupta K, Wagio P, Kotonya G. A Review of Automated Bioacoustics and General Acoustics Classification Research. Sensors (Basel) 2022; 22:8361. [PMID: 36366061] [PMCID: PMC9658612] [DOI: 10.3390/s22218361]
Abstract
Automated bioacoustics classification has received increasing attention from the research community in recent years due to its cross-disciplinary nature and diverse applications. These applications range from smart acoustic sensor networks that investigate the effects of acoustic vocalizations on species to context-aware edge devices that anticipate changes in their environment and adapt their sensing and processing accordingly. The research described here is an in-depth survey of the current state of bioacoustics classification and monitoring. The survey examines bioacoustics classification alongside general acoustics to provide a representative picture of the research landscape, reviewing 124 studies spanning eight years of research. It identifies the key application areas in bioacoustics research and the techniques used in audio transformation and feature extraction, and it examines the classification algorithms used in bioacoustics systems. Lastly, the survey examines current challenges, possible opportunities, and future directions in bioacoustics.
Affiliation(s)
- Leah Mutanu: Department of Computing, United States International University Africa, Nairobi P.O. Box 14634-0800, Kenya
- Jeet Gohil: Department of Computing, United States International University Africa, Nairobi P.O. Box 14634-0800, Kenya
- Khushi Gupta: Department of Computer Science, Sam Houston State University, Huntsville, TX 77341, USA
- Perpetua Wagio: Department of Computing, United States International University Africa, Nairobi P.O. Box 14634-0800, Kenya
- Gerald Kotonya: School of Computing and Communications, Lancaster University, Lancaster LA1 4WA, UK
17
Høye TT, Dyrmann M, Kjær C, Nielsen J, Bruus M, Mielec CL, Vesterdal MS, Bjerge K, Madsen SA, Jeppesen MR, Melvad C. Accurate image-based identification of macroinvertebrate specimens using deep learning: How much training data is needed? PeerJ 2022; 10:e13837. [PMID: 36032940] [PMCID: PMC9415355] [DOI: 10.7717/peerj.13837]
Abstract
Image-based methods for species identification offer cost-efficient solutions for biomonitoring. This is particularly relevant for invertebrate studies, where bulk samples often represent insurmountable workloads for sorting, identifying, and counting individual specimens. On the other hand, image-based classification using deep learning tools has strict requirements for the amount of training data, which is often a limiting factor. Here, we examine how classification accuracy increases with the amount of training data using the BIODISCOVER imaging system, constructed for image-based classification and biomass estimation of invertebrate specimens. We use a balanced dataset of 60 specimens of each of 16 taxa of freshwater macroinvertebrates to systematically quantify how the classification performance of a convolutional neural network (CNN) increases for individual taxa and the overall community as the number of specimens used for training increases. We show a striking 99.2% classification accuracy when the CNN (EfficientNet-B6) is trained on 50 specimens of each taxon, and also that the lower classification accuracy of models trained on less data is particularly evident for morphologically similar species within the same taxonomic order. Even with as few as 15 specimens used for training, classification accuracy reached 97%. Our results add to a recent body of literature showing the huge potential of image-based methods and deep learning for specimen-based research, and furthermore offer a perspective on future automated approaches for deriving ecological data from bulk arthropod samples.
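The experiment's core loop, subsampling a fixed number of training specimens per taxon and measuring accuracy on a held-out test set, can be mimicked with a toy stand-in. Here the scikit-learn digits data and a linear classifier replace the specimen images and EfficientNet-B6; the numbers produced are illustrative only.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in data: 10-class digits instead of 16 macroinvertebrate taxa.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

rng = np.random.default_rng(0)
for n_per_class in (5, 15, 30, 50):
    # Draw the same number of training specimens from every class.
    idx = np.hstack([
        rng.choice(np.flatnonzero(y_train == c), n_per_class, replace=False)
        for c in np.unique(y_train)
    ])
    model = LogisticRegression(max_iter=2000).fit(X_train[idx], y_train[idx])
    print(f"{n_per_class:>2} specimens/class -> "
          f"accuracy {model.score(X_test, y_test):.3f}")
```

Plotting such a curve per taxon is what reveals where extra labeling effort pays off, e.g. for morphologically similar taxa whose accuracy saturates late.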
Affiliation(s)
- Toke T. Høye: Department of Ecoscience, Aarhus University, Aarhus, Denmark; Arctic Research Centre, Aarhus University, Aarhus, Denmark
- Mads Dyrmann: Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark
- Christian Kjær: Department of Ecoscience, Aarhus University, Aarhus, Denmark
- Johnny Nielsen: Department of Ecoscience, Aarhus University, Aarhus, Denmark
- Marianne Bruus: Department of Ecoscience, Aarhus University, Aarhus, Denmark
- Kim Bjerge: Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark
- Sigurd A. Madsen: Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark
- Mads R. Jeppesen: Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark
- Claus Melvad: Arctic Research Centre, Aarhus University, Aarhus, Denmark; Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark
18
Reynolds J, Williams E, Martin D, Readling C, Ahmmed P, Huseth A, Bozkurt A. A Multimodal Sensing Platform for Interdisciplinary Research in Agrarian Environments. Sensors (Basel) 2022; 22:5582. [PMID: 35898084] [PMCID: PMC9331660] [DOI: 10.3390/s22155582]
Abstract
Agricultural and environmental monitoring programs often require labor-intensive inputs and substantial costs to manually gather data from remote field locations. Recent advances in the Internet of Things enable the construction of wireless sensor systems to automate these remote monitoring efforts. This paper presents the design of a modular system to serve as a research platform for outdoor sensor development and deployment. The advantages of this system include low power consumption (enabling solar charging), the use of commercially available electronic parts for lower-cost and scaled-up deployments, and the flexibility to include internal electronics and external sensors, allowing novel applications. In addition to tracking environmental parameters, the modularity of this system makes it possible to measure other, non-traditional variables. This capability is demonstrated with two different agri- and aquacultural field applications: tracking moth phenology and monitoring bivalve gaping. Collecting these signals in conjunction with environmental parameters could enable holistic, context-aware data analysis. Preliminary experiments generated promising results, demonstrating the reliability of the system. Idle power consumption of 27.2 mW and 16.6 mW for the moth- and bivalve-tracking systems, respectively, coupled with 2.5 W solar cells, allows for indefinite deployment in remote locations.
Affiliation(s)
- James Reynolds: Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Evan Williams: Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Devon Martin: Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Caleb Readling: Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Parvez Ahmmed: Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
- Anders Huseth: Department of Entomology and Plant Pathology and North Carolina Plant Sciences Initiative, North Carolina State University, Raleigh, NC 27695-8208, USA
- Alper Bozkurt: Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC 27695-7911, USA
19
|
Li B, Wang L, Feng H. Intelligent Correction Method of Shooting Action Based on Computer Vision. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE 2022; 2022:8753473. [PMID: 35860645 PMCID: PMC9293490 DOI: 10.1155/2022/8753473] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/12/2022] [Accepted: 06/22/2022] [Indexed: 12/05/2022]
Abstract
Students' long-term use of nonstandard shooting technique leads to poor outcomes in basketball teaching; to address this, an intelligent, computer-vision-based method for correcting shooting actions is proposed. An image-acquisition model for basketball shooting is built on computer-vision principles. Edge contours in the basketball images are detected and adaptively segmented into features, and abnormal shooting movements are recognized. An intelligent correction model for the shooting action is then constructed and used to correct it. Experiments show that the visual analysis and intelligent correction of shooting actions perform markedly better and that shooting actions can be corrected accurately and in real time.
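The contour stage of such a pipeline generalizes to a few lines of OpenCV; the following is a minimal sketch of edge-contour extraction on a single frame, with an assumed input file and illustrative Canny thresholds, not the authors' implementation.

import cv2

# Minimal stand-in for the edge-contour stage described above: detect edges
# in one video frame and keep the largest contour as a crude silhouette.
# The input filename and Canny thresholds are illustrative assumptions.
frame = cv2.imread("shot_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
if contours:
    silhouette = max(contours, key=cv2.contourArea)  # assume subject dominates
    cv2.drawContours(frame, [silhouette], -1, (0, 255, 0), 2)
    cv2.imwrite("contour_overlay.jpg", frame)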
Collapse
Affiliation(s)
- Bo Li
- School of Sport Sciences, Lingnan Normal University, Zhanjiang 524048, China
| | - Lei Wang
- Department of Physical Education, Tangshan Normal University, Tangshan 063000, China
| | - Hao Feng
- Police Physical Education Department, Hebei Vocational College for Correctional Police, Shijiazhuang 050081, China
| |
Collapse
|
20
|
Geissmann Q, Abram PK, Wu D, Haney CH, Carrillo J. Sticky Pi is a high-frequency smart trap that enables the study of insect circadian activity under natural conditions. PLoS Biol 2022; 20:e3001689. [PMID: 35797311 PMCID: PMC9262196 DOI: 10.1371/journal.pbio.3001689] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2021] [Accepted: 05/26/2022] [Indexed: 11/18/2022] Open
Abstract
In the face of severe environmental crises that threaten insect biodiversity, new technologies are imperative to monitor both the identity and ecology of insect species. Traditionally, insect surveys rely on manual collection of traps, which provide abundance data but mask the large intra- and interday variations in insect activity, an important facet of their ecology. Although laboratory studies have shown that circadian processes are central to insects' biological functions, from feeding to reproduction, we lack the high-frequency monitoring tools to study insect circadian biology in the field. To address these issues, we developed the Sticky Pi, a novel, autonomous, open-source, insect trap that acquires images of sticky cards every 20 minutes. Using custom deep learning algorithms, we automatically and accurately scored where, when, and which insects were captured. First, we validated our device in controlled laboratory conditions with a classic chronobiological model organism, Drosophila melanogaster. Then, we deployed an array of Sticky Pis to the field to characterise the daily activity of an agricultural pest, Drosophila suzukii, and its parasitoid wasps. Finally, we demonstrate the wide scope of our smart trap by describing the sympatric arrangement of insect temporal niches in a community, without targeting particular taxa a priori. Together, the automatic identification and high sampling rate of our tool provide biologists with unique data that impacts research far beyond chronobiology, with applications to biodiversity monitoring and pest control as well as fundamental implications for phenology, behavioural ecology, and ecophysiology. We released the Sticky Pi project as an open community resource on https://doc.sticky-pi.com.
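The acquisition side of such a trap reduces to a fixed-interval capture loop. A minimal Python sketch follows, using OpenCV for brevity; the camera index and filename pattern are assumptions, and the real device adds its own camera stack, power management, and uploading.

import datetime
import time
import cv2

# Sketch of a fixed-interval capture loop like the one the trap runs
# (one image of the sticky card every 20 minutes). The camera index and
# filenames are assumptions for illustration only.
INTERVAL_S = 20 * 60
camera = cv2.VideoCapture(0)  # assumed camera index

while True:
    ok, frame = camera.read()
    if ok:
        stamp = datetime.datetime.utcnow().strftime("%Y%m%d_%H%M%S")
        cv2.imwrite(f"sticky_card_{stamp}.jpg", frame)  # timestamped archive
    time.sleep(INTERVAL_S)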
Collapse
Affiliation(s)
- Quentin Geissmann
- Department of Microbiology and Immunology, The University of British Columbia, Vancouver, British Columbia, Canada
- Michael Smith Laboratories, The University of British Columbia, Vancouver, British Columbia, Canada
- Faculty of Land and Food Systems, The University of British Columbia, Vancouver (Unceded xʼməθkʼəýəm Musqueam Territory), British Columbia, Canada
| | - Paul K. Abram
- Agriculture and Agri-Food Canada, Agassiz, British Columbia, Canada
| | - Di Wu
- Faculty of Land and Food Systems, The University of British Columbia, Vancouver (Unceded xʼməθkʼəýəm Musqueam Territory), British Columbia, Canada
| | - Cara H. Haney
- Department of Microbiology and Immunology, The University of British Columbia, Vancouver, British Columbia, Canada
- Michael Smith Laboratories, The University of British Columbia, Vancouver, British Columbia, Canada
| | - Juli Carrillo
- Faculty of Land and Food Systems, The University of British Columbia, Vancouver (Unceded xʼməθkʼəýəm Musqueam Territory), British Columbia, Canada
| |
Collapse
|
21
|
Wilson RJ, Siqueira AF, Brooks SJ, Price BW, Simon LM, Walt SJ, Fenberg PB. Applying computer vision to digitised natural history collections for climate change research: Temperature‐size responses in British butterflies. Methods Ecol Evol 2022. [DOI: 10.1111/2041-210x.13844] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Affiliation(s)
- Rebecca J. Wilson
- School of Ocean and Earth Sciences University of Southampton Southampton UK
- Department of Life Sciences Natural History Museum London UK
| | | | | | | | - Lea M. Simon
- School of Ocean and Earth Sciences University of Southampton Southampton UK
| | - Stéfan J. Walt
- Berkeley Institute for Data Science University of California Berkeley CA USA
| | - Phillip B. Fenberg
- School of Ocean and Earth Sciences University of Southampton Southampton UK
- Department of Life Sciences Natural History Museum London UK
| |
Collapse
|
22
|
Saradopoulos I, Potamitis I, Ntalampiras S, Konstantaras AI, Antonidakis EN. Edge Computing for Vision-Based, Urban-Insects Traps in the Context of Smart Cities. SENSORS (BASEL, SWITZERLAND) 2022; 22:s22052006. [PMID: 35271153 PMCID: PMC8914644 DOI: 10.3390/s22052006] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/19/2022] [Revised: 02/23/2022] [Accepted: 03/01/2022] [Indexed: 05/15/2023]
Abstract
Our aim is to promote the widespread use of electronic insect traps that report captured pests to a human-controlled agency. This work reports on edge computing as applied to camera-based insect traps. We present a low-cost device with high power autonomy and adequate picture quality that reports an internal image of the trap to a server and counts the insects it contains using quantized, embedded deep-learning models. The paper compares several aspects of the performance of three edge devices, namely the ESP32, Raspberry Pi 4 (RPi), and Google Coral, running a deep-learning framework (TensorFlow Lite). All edge devices were able to process images with counting accuracy exceeding 95%, but at different rates and power consumption. Our findings suggest that the ESP32 is the best choice in the context of this application, given our emphasis on low-cost devices.
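The on-device inference step is a standard quantized TensorFlow Lite invocation. A minimal sketch (Python, as might run on the RPi) follows; the model filename and the stand-in input tensor are assumptions, not the authors' code.

import numpy as np
from tflite_runtime.interpreter import Interpreter  # pip install tflite-runtime

# Minimal quantized-model inference as run on an edge device. The model
# path and the interpretation of its output are illustrative assumptions.
interpreter = Interpreter(model_path="insect_counter_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

image = np.zeros(inp["shape"], dtype=inp["dtype"])  # stand-in trap image
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
prediction = interpreter.get_tensor(out["index"])
print("raw model output:", prediction)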
Collapse
Affiliation(s)
- Ioannis Saradopoulos
- Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece; (I.S.); (A.I.K.); (E.N.A.)
| | - Ilyas Potamitis
- Department of Music Technology and Acoustics, Hellenic Mediterranean University, 74100 Rethymno, Greece
| | | | - Antonios I. Konstantaras
- Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece; (I.S.); (A.I.K.); (E.N.A.)
| | - Emmanuel N. Antonidakis
- Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece; (I.S.); (A.I.K.); (E.N.A.)
| |
Collapse
|
23
|
Sondhi Y, Jo NJ, Alpizar B, Markee A, Dansby HE, Currea JP, Fabian ST, Ruiz C, Barredo E, Allen P, DeGennaro M, Kawahara AY, Theobald JC. Portable locomotion activity monitor (pLAM): A cost‐effective setup for robust activity tracking in small animals. Methods Ecol Evol 2022. [DOI: 10.1111/2041-210x.13809] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
Affiliation(s)
- Yash Sondhi
- Department of Biology Florida International University Miami FL USA
| | - Nicolas J. Jo
- Department of Biology Florida International University Miami FL USA
| | - Britney Alpizar
- Department of Biology Florida International University Miami FL USA
| | - Amanda Markee
- McGuire Center for Lepidoptera and Biodiversity, Florida Museum of Natural History University of Florida Gainesville FL USA
| | - Hailey E. Dansby
- McGuire Center for Lepidoptera and Biodiversity, Florida Museum of Natural History University of Florida Gainesville FL USA
| | - John Paul Currea
- Department of Psychology Florida International University Miami FL USA
| | | | - Carlos Ruiz
- Department of Biology Florida International University Miami FL USA
- McGuire Center for Lepidoptera and Biodiversity, Florida Museum of Natural History University of Florida Gainesville FL USA
| | - Elina Barredo
- Department of Biology Florida International University Miami FL USA
- Biomolecular Sciences Institute Florida International University Miami FL USA
| | - Pablo Allen
- Council on International Educational Exchange Monteverde Apto Costa Rica
| | - Matthew DeGennaro
- Department of Biology Florida International University Miami FL USA
- Biomolecular Sciences Institute Florida International University Miami FL USA
| | - Akito Y. Kawahara
- McGuire Center for Lepidoptera and Biodiversity, Florida Museum of Natural History University of Florida Gainesville FL USA
| | - Jamie C. Theobald
- Department of Biology Florida International University Miami FL USA
- Institute of Environment Florida International University Miami FL USA
| |
Collapse
|
24
|
An Open-Source Low-Cost Imaging System Plug-In for Pheromone Traps Aiding Remote Insect Pest Population Monitoring in Fruit Crops. MACHINES 2022. [DOI: 10.3390/machines10010052] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
This note describes the development of a plug-in imaging system for the pheromone delta traps used in pest population monitoring. The plug-in comprises an RGB imaging sensor integrated with a microcontroller unit and associated hardware for optimized power usage and data capture. The plug-in can be attached to the top of a modified delta trap to realize periodic image capture of the trap liner (17.8 cm × 17.8 cm). As configured, the captured images are stored on a microSD card with ~0.01 cm² pixel⁻¹ spatial resolution. The plug-in hardware is configured to conserve power, entering sleep mode when idle. Twenty traps with plug-in units were constructed and evaluated in the 2020 field season for codling moth (Cydia pomonella) population monitoring in a research study. The units reliably captured images at a daily interval over the course of two weeks with a 350 mAh DC power source. The captured images provided the temporal population dynamics of codling moths, which would otherwise be achieved through daily manual trap monitoring. The system's build cost is about $33 per unit, and it has potential for scaling to commercial applications through integration with Internet of Things-enabled technologies.
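The two-week runtime on a 350 mAh cell is consistent with a deeply duty-cycled design. A rough Python estimate follows; the capacity comes from the abstract, but the sleep and active currents are assumed figures chosen only to show the arithmetic.

# Duty-cycle battery estimate for a sleep-mode trap plug-in. The 350 mAh
# capacity is quoted from the abstract; the currents and wake time below
# are assumed values chosen to illustrate the calculation.
CAPACITY_MAH = 350
SLEEP_MA = 0.8         # assumed sleep-mode draw
ACTIVE_MA = 150        # assumed draw during capture + SD write
ACTIVE_S_PER_DAY = 120 # assumed wake time for one daily image

active_mah = ACTIVE_MA * ACTIVE_S_PER_DAY / 3600
sleep_mah = SLEEP_MA * (24 - ACTIVE_S_PER_DAY / 3600)
per_day = active_mah + sleep_mah
print(f"~{per_day:.1f} mAh/day -> ~{CAPACITY_MAH / per_day:.0f} days "
      f"on {CAPACITY_MAH} mAh")

With these assumed currents the budget works out to roughly two weeks, matching the reported deployment; the sleep-mode draw, not the daily capture, dominates the total.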
Collapse
|
25
|
Villon S, Iovan C, Mangeas M, Vigliola L. Confronting Deep-Learning and Biodiversity Challenges for Automatic Video-Monitoring of Marine Ecosystems. SENSORS (BASEL, SWITZERLAND) 2022; 22:497. [PMID: 35062457 PMCID: PMC8781840 DOI: 10.3390/s22020497] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 11/29/2021] [Revised: 12/27/2021] [Accepted: 12/29/2021] [Indexed: 11/23/2022]
Abstract
With the availability of low-cost and efficient digital cameras, ecologists can now survey the world's biodiversity through image sensors, especially in the previously rather inaccessible marine realm. However, the data rapidly accumulate, and ecologists face a data-processing bottleneck. While computer vision has long been used as a tool to speed up image processing, it is only since the breakthrough of deep learning (DL) algorithms that a revolution in the automatic assessment of biodiversity from video recordings can be considered. However, current applications of DL models to biodiversity monitoring do not consider some universal rules of biodiversity, especially rules on the distribution of species abundance, species rarity and ecosystem openness. These rules imply three issues for deep learning applications: the imbalance of long-tailed datasets biases the training of DL models; scarce data greatly lessen the performance of DL models for classes with few examples; and, in an open world, objects absent from the training dataset are incorrectly classified in the application dataset. Promising solutions to these issues are discussed, including data augmentation, data generation, cross-entropy modification, few-shot learning and open-set recognition. At a time when biodiversity faces the immense challenges of climate change and Anthropocene defaunation, stronger collaboration between computer scientists and ecologists is urgently needed to unlock the automatic monitoring of biodiversity.
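Of the remedies listed, the cross-entropy modification is the easiest to make concrete: weight each class inversely to its frequency so that rare species contribute more to the loss. A NumPy sketch with made-up class counts follows; it is one common weighting scheme, not the specific modification any cited study uses.

import numpy as np

# Inverse-frequency class weighting, a common cross-entropy modification
# for long-tailed species-abundance data. The class counts are made up.
counts = np.array([9000, 700, 250, 40, 10])      # specimens per class
weights = counts.sum() / (len(counts) * counts)  # rare classes weigh more

def weighted_cross_entropy(probs, label):
    """probs: softmax output of a classifier; label: true class index."""
    return -weights[label] * np.log(probs[label] + 1e-12)

probs = np.array([0.05, 0.10, 0.15, 0.60, 0.10])
print(weighted_cross_entropy(probs, 3))  # mistakes on rare classes cost more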
Collapse
Affiliation(s)
- Sébastien Villon
- Institut de Recherche pour le Developpement (IRD), UMR ENTROPIE (IRD, University of New-Caledonia, University of La Reunion, CNRS, Ifremer), 101 Promenade Roger Laroque, 98848 Noumea, France; (C.I.); (M.M.); (L.V.)
| | | | | | | |
Collapse
|
27
|
Wührl L, Pylatiuk C, Giersch M, Lapp F, von Rintelen T, Balke M, Schmidt S, Cerretti P, Meier R. DiversityScanner: Robotic handling of small invertebrates with machine learning methods. Mol Ecol Resour 2021; 22:1626-1638. [PMID: 34863029 DOI: 10.1111/1755-0998.13567] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2021] [Revised: 11/20/2021] [Accepted: 11/30/2021] [Indexed: 01/04/2023]
Abstract
Invertebrate biodiversity remains poorly understood, although invertebrates comprise much of the terrestrial animal biomass and most species, and supply many ecosystem services. The main obstacle is specimen-rich samples obtained with quantitative sampling techniques (e.g., Malaise trapping). Traditional sorting requires manual handling, while molecular techniques based on metabarcoding lose the association between individual specimens and sequences and thus struggle to obtain precise abundance information. Here we present a sorting robot that prepares specimens from bulk samples for barcoding. It detects, images and measures individual specimens from a sample and then moves them into the wells of a 96-well microplate. We show that the images can be used to train convolutional neural networks (CNNs) that are capable of assigning the specimens to 14 insect taxa (usually families) that are particularly common in Malaise trap samples. The average assignment precision across all taxa is 91.4% (range 75%-100%). This ability of the robot to identify common taxa then allows for taxon-specific subsampling, because the robot can be instructed to pick only a prespecified number of specimens of abundant taxa. To obtain biomass information, the images are also used to measure specimen length and estimate body volume. We outline how the DiversityScanner can be a key component for tackling and monitoring invertebrate diversity by combining molecular and morphological tools: the images generated by the robot become training images for machine learning once they are labelled with taxonomic information from DNA barcodes. We suggest that a combination of automation, machine learning and DNA barcoding has the potential to tackle invertebrate diversity at an unprecedented scale.
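The length measurement from images admits a compact sketch: segment the specimen, fit a rotated bounding box, and convert the long side to millimetres with a known scale. The OpenCV code below is a generic illustration of that idea; the filename and calibration factor are assumptions, not values from the paper.

import cv2

# Sketch of image-based specimen length estimation as a biomass proxy:
# segment the specimen, fit a rotated bounding box, and convert the long
# side from pixels to mm. The filename and scale factor are assumptions.
MM_PER_PX = 0.02  # assumed calibration from a reference target

gray = cv2.imread("specimen_well.jpg", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(gray, 0, 255,
                        cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
specimen = max(contours, key=cv2.contourArea)

(_, _), (w, h), _ = cv2.minAreaRect(specimen)
length_mm = max(w, h) * MM_PER_PX
print(f"estimated body length: {length_mm:.2f} mm")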
Collapse
Affiliation(s)
- Lorenz Wührl
- Institute for Automation and Applied Informatics (IAI), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
| | - Christian Pylatiuk
- Institute for Automation and Applied Informatics (IAI), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
| | - Matthias Giersch
- Institute for Automation and Applied Informatics (IAI), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
| | - Florian Lapp
- Institute for Automation and Applied Informatics (IAI), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
| | - Thomas von Rintelen
- Museum für Naturkunde, Center for Integrative Biodiversity Discovery, Leibniz-Institut für Evolutions- und Biodiversitätsforschung, Berlin, Germany
| | - Michael Balke
- SNSB - Zoologische Staatssammlung München, Munich, Germany
| | - Stefan Schmidt
- SNSB - Zoologische Staatssammlung München, Munich, Germany
| | - Pierfilippo Cerretti
- Department of Biology and Biotechnology 'Charles Darwin', Sapienza University of Rome, Rome, Italy
| | - Rudolf Meier
- Museum für Naturkunde, Center for Integrative Biodiversity Discovery, Leibniz-Institut für Evolutions- und Biodiversitätsforschung, Berlin, Germany
| |
Collapse
|
28
|
Preti M, Favaro R, Knight AL, Angeli S. Remote monitoring of Cydia pomonella adults among an assemblage of nontargets in sex pheromone-kairomone-baited smart traps. PEST MANAGEMENT SCIENCE 2021; 77:4084-4090. [PMID: 33913618 PMCID: PMC8453955 DOI: 10.1002/ps.6433] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/07/2020] [Revised: 04/14/2021] [Accepted: 04/29/2021] [Indexed: 05/18/2023]
Abstract
BACKGROUND Captures of codling moth, Cydia pomonella (L.), in traps are used to establish action thresholds and time insecticide sprays. The need for frequent trap inspections in often remote orchards has created a niche for remote-sensing smart traps. A smart trap baited with a five-component pheromone-kairomone blend was evaluated for codling moth monitoring among an assemblage of other nontargets in apple and pear orchards. RESULTS Codling moth captures did not differ between the smart trap and a standard trap when both were checked manually. However, the correlation between automatic and manual counts of codling moth in the smart traps was low, R² = 0.66-0.87. False-negative identifications by the smart trap were infrequent (<5%), but false-positive identifications accounted for up to 67% of the count. These errors were primarily due to the misidentification of three moth species fairly similar in size to codling moth: apple clearwing moth Synanthedon myopaeformis (Borkhausen), oriental fruit moth Grapholita molesta (Busck), and carnation tortrix Cacoecimorpha pronubana (Hübner). Other false-positive counts were less frequent and included the misidentification of dipterans, other arthropods and patches of moth scales, and the double counting of some moths. CONCLUSION Codling moth was successfully monitored remotely with a smart trap baited with a nonselective sex pheromone-kairomone lure, but automatic counts were inflated in some orchards due to mischaracterizations of primarily similar-sized nontarget moths. Improved image-identification algorithms are needed for smart traps baited with less-selective lures and with lure sets targeting multiple species.
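The headline numbers in such evaluations, count agreement (R²) and the share of inflated counts, take only a few lines to compute. The NumPy sketch below uses invented daily counts purely to show the calculation, not data from the study.

import numpy as np

# Agreement between automatic and manual moth counts, the core evaluation
# in smart-trap studies. The count vectors here are invented examples.
manual = np.array([3, 5, 2, 8, 0, 6, 4])
automatic = np.array([4, 6, 2, 9, 1, 7, 5])

ss_res = np.sum((automatic - manual) ** 2)
ss_tot = np.sum((manual - manual.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
overcount = (automatic - manual).clip(min=0).sum() / automatic.sum()
print(f"R^2 = {r2:.2f}, inflated counts = {overcount:.0%} of automatic total")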
Collapse
Affiliation(s)
- Michele Preti
- Faculty of Science and Technology Free University of Bozen‐Bolzano Bolzano Italy
| | - Riccardo Favaro
- Faculty of Science and Technology Free University of Bozen‐Bolzano Bolzano Italy
| | | | - Sergio Angeli
- Faculty of Science and Technology Free University of Bozen‐Bolzano Bolzano Italy
| |
Collapse
|
29
|
Jolles JW. Broad‐scale applications of the Raspberry Pi: A review and guide for biologists. Methods Ecol Evol 2021. [DOI: 10.1111/2041-210x.13652] [Citation(s) in RCA: 33] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/20/2022]
Affiliation(s)
- Jolle W. Jolles
- Zukunftskolleg University of Konstanz Konstanz Germany
- Department of Collective Behaviour Max Planck Institute of Animal Behaviour Konstanz Germany
- Centre for Research on Ecology and Forestry Applications (CREAF) Barcelona Spain
| |
Collapse
|
30
|
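Høye TT, Ärje J, Bjerge K, Hansen OLP, Iosifidis A, Leese F, Mann HMR, Meissner K, Melvad C, Raitoharju J. Deep learning and computer vision will transform entomology. Proc Natl Acad Sci U S A 2021; 118:e2002545117. [DOI: 10.1073/pnas.2002545117]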
Abstract
Most animal species on Earth are insects, and recent reports suggest that their abundance is in drastic decline. Although these reports come from a wide range of insect taxa and regions, the evidence to assess the extent of the phenomenon is sparse. Insect populations are challenging to study, and most monitoring methods are labor intensive and inefficient. Advances in computer vision and deep learning provide potential new solutions to this global challenge. Cameras and other sensors can effectively, continuously, and noninvasively perform entomological observations throughout diurnal and seasonal cycles. The physical appearance of specimens can also be captured by automated imaging in the laboratory. When trained on these data, deep learning models can provide estimates of insect abundance, biomass, and diversity. Further, deep learning models can quantify variation in phenotypic traits, behavior, and interactions. Here, we connect recent developments in deep learning and computer vision to the urgent demand for more cost-efficient monitoring of insects and other invertebrates. We present examples of sensor-based monitoring of insects. We show how deep learning tools can be applied to exceptionally large datasets to derive ecological information and discuss the challenges that lie ahead for the implementation of such solutions in entomology. We identify four focal areas, which will facilitate this transformation: 1) validation of image-based taxonomic identification; 2) generation of sufficient training data; 3) development of public, curated reference databases; and 4) solutions to integrate deep learning and molecular tools.
Collapse
|