1. Nguyen M, Roman GW, Soibam B. Drosophila genotypes can be predicted from their exploration locomotive trajectories using supervised machine learning. Behav Processes 2023;212:104944. PMID: 37717930; DOI: 10.1016/j.beproc.2023.104944. Received 2022-12-09; revised 2023-08-25; accepted 2023-09-13.
Abstract
This study uses supervised machine learning to test whether locomotive features recorded during exploratory activity in open-field arenas can predict the genotype of fruit flies. Because locomotive trajectories are nonlinear, the traditional statistical methods used to compare exploratory activity between fruit fly genotypes may not reveal all insights. Ten-minute trajectories of four genotypes of fruit flies in an open-field arena were captured. Turn-angle and step-size features extracted from the trajectories were used to train supervised learning models that predict genotype. Using the first five minutes of each trajectory, an accuracy of 83% was achieved in differentiating wild-type flies from three mutant genotypes. Using the final five minutes or the entire ten minutes decreased performance, indicating that most of the variation between genotypes in their exploratory activity is exhibited in the first few minutes. Feature importance analysis revealed that turn angle is a better predictor of genotype than step size. Overall, this study demonstrates that trajectory features can be used to predict the genotype of fruit flies through supervised machine learning.
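The features named in the abstract, step sizes and turn angles, can be computed directly from (x, y) position sequences. A minimal NumPy sketch, under my own assumptions about the input format; the authors' exact pipeline and classifier are not reproduced here:

```python
import numpy as np

def trajectory_features(x, y):
    """Extract step sizes and turn angles from a 2-D trajectory.

    x, y: 1-D arrays of positions sampled at a fixed frame rate.
    Returns (step_sizes, turn_angles): n-1 steps and n-2 turns for n points.
    """
    dx = np.diff(x)
    dy = np.diff(y)
    step_sizes = np.hypot(dx, dy)
    headings = np.arctan2(dy, dx)
    # Wrap heading changes into (-pi, pi] so a turn angle of 0 means "straight ahead".
    turn_angles = np.angle(np.exp(1j * np.diff(headings)))
    return step_sizes, turn_angles
```

Distributions of these two quantities (histograms, summary statistics) would then serve as the input features for a supervised classifier.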
Affiliation(s)
- Minh Nguyen
- Department of Computer Science and Engineering Technology, University of Houston-Downtown, One Main St, Houston, TX 77002, USA.
- Gregg W Roman
- Department of Biomolecular Sciences, School of Pharmacy, University of Mississippi, 415 W Faser Hall, University, MS 38677-1848, USA.
- Benjamin Soibam
- Department of Computer Science and Engineering Technology, University of Houston-Downtown, One Main St, Houston, TX 77002, USA.
2. Bjerge K, Frigaard CE, Karstoft H. Object Detection of Small Insects in Time-Lapse Camera Recordings. Sensors (Basel) 2023;23:7242. PMID: 37631778; PMCID: PMC10459366; DOI: 10.3390/s23167242. Received 2023-07-15; revised 2023-08-09; accepted 2023-08-16.
Abstract
As pollinators, insects play a crucial role in ecosystem management and world food production. However, insect populations are declining, necessitating efficient monitoring methods. Existing methods analyze video or time-lapse images of insects in nature, but analysis is challenging because insects are small objects in complex, dynamic natural vegetation scenes. In this work, we provide a dataset of primarily honeybees visiting three different plant species during two months of summer. The dataset consists of 107,387 annotated time-lapse images from multiple cameras, including 9423 annotated insects. We present a two-step method for detecting insects in time-lapse RGB images. First, the images are preprocessed with a motion-informed enhancement technique that uses motion and color to make insects more salient. Second, the enhanced images are fed into a convolutional neural network (CNN) object detector. The method improves on the deep learning object detectors You Only Look Once (YOLO) and Faster Region-based CNN (Faster R-CNN): with motion-informed enhancement, the YOLO detector's average micro F1-score improves from 0.49 to 0.71, and the Faster R-CNN detector's from 0.32 to 0.56. Our dataset and proposed method are a step toward automating time-lapse camera monitoring of flying insects.
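The motion-informed enhancement is the paper's own technique; as a generic illustration of the underlying idea (motion against a static background makes small moving objects stand out), a background-subtraction sketch might look like this. The function name and the blending scheme are assumptions for illustration, not the authors' method:

```python
import numpy as np

def motion_enhance(frames, alpha=0.5):
    """Boost pixels that move relative to a static median background.

    frames: (T, H, W, C) float array of time-lapse RGB images in [0, 1].
    Static vegetation largely cancels out in the motion map, while an
    insect present in only some frames produces a strong response.
    """
    background = np.median(frames, axis=0)            # per-pixel static scene
    motion = np.abs(frames - background)              # per-frame motion map
    enhanced = (1.0 - alpha) * frames + alpha * motion
    return np.clip(enhanced, 0.0, 1.0)
```

The enhanced frames, rather than the raw ones, are what a detector such as YOLO or Faster R-CNN would then be trained on.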
Affiliation(s)
- Kim Bjerge
- Department of Electrical and Computer Engineering, Aarhus University, 8200 Aarhus N, Denmark.
3
|
Schneider S, Taylor GW, Kremer SC, Fryxell JM. Getting the bugs out of AI: Advancing ecological research on arthropods through computer vision. Ecol Lett 2023. [PMID: 37216316 DOI: 10.1111/ele.14239] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2022] [Revised: 03/30/2023] [Accepted: 04/03/2023] [Indexed: 05/24/2023]
Abstract
Deep learning for computer vision has shown promising results in entomology; however, untapped potential remains. Deep learning performance is enabled primarily by large quantities of annotated data, which, outside of rare circumstances, are limited in ecological studies. Currently, to use deep learning systems, ecologists must undertake extensive data collection efforts or limit their problem to niche tasks, and these solutions do not scale to region-agnostic models. However, approaches that employ data augmentation, simulators, generative models, and self-supervised learning can supplement limited labelled data. Here, we highlight the successes of deep learning for computer vision within entomology, discuss data collection efforts, provide methodologies for optimizing learning from limited annotations, and conclude with practical guidelines for achieving a foundation model for entomology capable of accessible, automated ecological monitoring on a global scale.
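Of the strategies the review lists for supplementing limited labels, data augmentation is the simplest to illustrate. A minimal sketch of label-preserving geometric transforms; the function name and the specific transform set are illustrative choices, not taken from the review:

```python
import numpy as np

def augment(image, rng):
    """Return a randomly flipped and rotated copy of an (H, W, C) image.

    Label-preserving geometric transforms like these multiply the
    effective size of a small annotated dataset without new collection
    effort, which is one of the strategies the review discusses.
    """
    if rng.random() < 0.5:
        image = image[:, ::-1]          # horizontal flip
    k = int(rng.integers(0, 4))         # 0-3 quarter turns
    return np.rot90(image, k)
```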
Affiliation(s)
- Stefan C Kremer
- School of Computer Science, University of Guelph, Guelph, Ontario, Canada.
- John M Fryxell
- Department of Integrative Biology, University of Guelph, Guelph, Ontario, Canada.
4. Høye TT, Dyrmann M, Kjær C, Nielsen J, Bruus M, Mielec CL, Vesterdal MS, Bjerge K, Madsen SA, Jeppesen MR, Melvad C. Accurate image-based identification of macroinvertebrate specimens using deep learning: how much training data is needed? PeerJ 2022;10:e13837. PMID: 36032940; PMCID: PMC9415355; DOI: 10.7717/peerj.13837. Open access. Received 2022-03-28; accepted 2022-07-13.
Abstract
Image-based methods for species identification offer cost-efficient solutions for biomonitoring. This is particularly relevant for invertebrate studies, where bulk samples often represent insurmountable workloads for sorting, identifying, and counting individual specimens. On the other hand, image-based classification using deep learning tools has strict requirements for the amount of training data, which is often a limiting factor. Here, we examine how classification accuracy increases with the amount of training data using the BIODISCOVER imaging system, constructed for image-based classification and biomass estimation of invertebrate specimens. We use a balanced dataset of 60 specimens of each of 16 taxa of freshwater macroinvertebrates to systematically quantify how the classification performance of a convolutional neural network (CNN) increases for individual taxa and the overall community as the number of specimens used for training increases. We show a striking 99.2% classification accuracy when the CNN (EfficientNet-B6) is trained on 50 specimens of each taxon, and that the lower accuracy of models trained on less data is particularly evident for morphologically similar species within the same taxonomic order. Even with as few as 15 specimens used for training, classification accuracy reached 97%. Our results add to a recent body of literature showing the huge potential of image-based methods and deep learning for specimen-based research, and offer a perspective on future automated approaches for deriving ecological data from bulk arthropod samples.
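The study's central experiment, accuracy as a function of specimens per taxon, can be mimicked with any classifier. A toy sketch using a nearest-centroid classifier as a stand-in for the paper's EfficientNet-B6 (function and variable names are my own, and the evaluation protocol is simplified):

```python
import numpy as np

def learning_curve(features, labels, train_sizes, rng):
    """Held-out accuracy of a nearest-centroid classifier per training size.

    For each n in train_sizes, fit on n specimens per class and test on
    the remainder -- a toy analogue of the paper's CNN learning curve.
    features: (N, D) array; labels: (N,) integer class ids.
    """
    classes = np.unique(labels)
    curve = []
    for n in train_sizes:
        train, test = [], []
        for c in classes:
            idx = rng.permutation(np.flatnonzero(labels == c))
            train.append(idx[:n])
            test.append(idx[n:])
        centroids = np.stack([features[t].mean(axis=0) for t in train])
        test = np.concatenate(test)
        dists = np.linalg.norm(features[test, None, :] - centroids, axis=2)
        preds = classes[np.argmin(dists, axis=1)]
        curve.append(float((preds == labels[test]).mean()))
    return curve
```

Plotting the returned accuracies against `train_sizes` reproduces the shape of the paper's question: how quickly performance saturates as specimens per taxon are added.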
Affiliation(s)
- Toke T. Høye
- Department of Ecoscience, Aarhus University, Aarhus, Denmark; Arctic Research Centre, Aarhus University, Aarhus, Denmark.
- Mads Dyrmann
- Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark.
- Christian Kjær
- Department of Ecoscience, Aarhus University, Aarhus, Denmark.
- Johnny Nielsen
- Department of Ecoscience, Aarhus University, Aarhus, Denmark.
- Marianne Bruus
- Department of Ecoscience, Aarhus University, Aarhus, Denmark.
- Kim Bjerge
- Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark.
- Sigurd A. Madsen
- Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark.
- Mads R. Jeppesen
- Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark.
- Claus Melvad
- Arctic Research Centre, Aarhus University, Aarhus, Denmark; Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark.
5. van Klink R, August T, Bas Y, Bodesheim P, Bonn A, Fossøy F, Høye TT, Jongejans E, Menz MHM, Miraldo A, Roslin T, Roy HE, Ruczyński I, Schigel D, Schäffler L, Sheard JK, Svenningsen C, Tschan GF, Wäldchen J, Zizka VMA, Åström J, Bowler DE. Emerging technologies revolutionise insect ecology and monitoring. Trends Ecol Evol 2022;37:872-885. PMID: 35811172; DOI: 10.1016/j.tree.2022.06.001. Received 2022-02-28; revised 2022-05-26; accepted 2022-06-07.
Abstract
Insects are the most diverse group of animals on Earth, but their small size and high diversity have always made them challenging to study. Recent technological advances have the potential to revolutionise insect ecology and monitoring. We describe the state of the art of four technologies (computer vision, acoustic monitoring, radar, and molecular methods), and assess their advantages, current limitations, and future potential. We discuss how these technologies can adhere to modern standards of data curation and transparency, their implications for citizen science, and their potential for integration among different monitoring programmes and technologies. We argue that they provide unprecedented possibilities for insect ecology and monitoring, but it will be important to foster international standards via collaboration.
Affiliation(s)
- Roel van Klink
- German Centre for Integrative Biodiversity Research (iDiv) Halle Jena Leipzig, Puschstrasse 4, 04103, Leipzig, Germany; Martin Luther University-Halle Wittenberg, Department of Computer Science, 06099, Halle (Saale), Germany.
- Tom August
- UK Centre for Ecology & Hydrology, Benson Lane, Wallingford, OX10 8BB, UK.
- Yves Bas
- Centre d'Écologie et des Sciences de la Conservation, Muséum National d'Histoire Naturelle, Paris, France; CEFE, Université Montpellier, CNRS, EPHE, IRD, Montpellier, France.
- Paul Bodesheim
- Friedrich Schiller University Jena, Computer Vision Group, Ernst-Abbe-Platz 2, 07743, Jena, Germany.
- Aletta Bonn
- German Centre for Integrative Biodiversity Research (iDiv) Halle Jena Leipzig, Puschstrasse 4, 04103, Leipzig, Germany; Helmholtz Centre for Environmental Research - UFZ, Permoserstrasse 15, 04318, Leipzig, Germany; Friedrich Schiller University Jena, Institute of Biodiversity, Dornburger Strasse 159, 07743, Jena, Germany.
- Frode Fossøy
- Norwegian Institute for Nature Research, P.O. Box 5685 Torgarden, 7485, Trondheim, Norway.
- Toke T Høye
- Aarhus University, Department of Ecoscience and Arctic Research Centre, C.F. Møllers Allé 8, 8000, Aarhus, Denmark.
- Eelke Jongejans
- Radboud University, Animal Ecology and Physiology, Heyendaalseweg 135, 6525 AJ, Nijmegen, The Netherlands; Netherlands Institute of Ecology, Animal Ecology, Droevendaalsesteeg 10, 6708 PB, Wageningen, The Netherlands.
- Myles H M Menz
- Max Planck Institute for Animal Behaviour, Department of Migration, Am Obstberg 1, 78315, Radolfzell, Germany; College of Science and Engineering, James Cook University, Townsville, Qld, Australia.
- Andreia Miraldo
- Swedish Museum of Natural Sciences, Department of Bioinformatics and Genetics, Frescativägen 40, 114 18, Stockholm, Sweden.
- Tomas Roslin
- Swedish University of Agricultural Sciences (SLU), Department of Ecology, Ulls väg 18B, 75651, Uppsala, Sweden.
- Helen E Roy
- UK Centre for Ecology & Hydrology, Benson Lane, Wallingford, OX10 8BB, UK.
- Ireneusz Ruczyński
- Mammal Research Institute, Polish Academy of Sciences, Stoczek 1, 17-230, Białowieża, Poland.
- Dmitry Schigel
- Global Biodiversity Information Facility (GBIF), Universitetsparken 15, 2100, Copenhagen, Denmark.
- Livia Schäffler
- Leibniz Institute for the Analysis of Biodiversity Change, Museum Koenig Bonn, Adenauerallee 127, 53113, Bonn, Germany.
- Julie K Sheard
- German Centre for Integrative Biodiversity Research (iDiv) Halle Jena Leipzig, Puschstrasse 4, 04103, Leipzig, Germany; Helmholtz Centre for Environmental Research - UFZ, Permoserstrasse 15, 04318, Leipzig, Germany; Friedrich Schiller University Jena, Institute of Biodiversity, Dornburger Strasse 159, 07743, Jena, Germany; University of Copenhagen, Centre for Macroecology, Evolution and Climate, Globe Institute, Universitetsparken 15, bld. 3, 2100, Copenhagen, Denmark.
- Cecilie Svenningsen
- University of Copenhagen, Natural History Museum of Denmark, Øster Voldgade 5-7, 1350, Copenhagen, Denmark.
- Georg F Tschan
- Leibniz Institute for the Analysis of Biodiversity Change, Museum Koenig Bonn, Adenauerallee 127, 53113, Bonn, Germany.
- Jana Wäldchen
- German Centre for Integrative Biodiversity Research (iDiv) Halle Jena Leipzig, Puschstrasse 4, 04103, Leipzig, Germany; Max Planck Institute for Biogeochemistry, Department of Biogeochemical Integration, Hans-Knoell-Str. 10, 07745, Jena, Germany.
- Vera M A Zizka
- Leibniz Institute for the Analysis of Biodiversity Change, Museum Koenig Bonn, Adenauerallee 127, 53113, Bonn, Germany.
- Jens Åström
- Norwegian Institute for Nature Research, P.O. Box 5685 Torgarden, 7485, Trondheim, Norway.
- Diana E Bowler
- German Centre for Integrative Biodiversity Research (iDiv) Halle Jena Leipzig, Puschstrasse 4, 04103, Leipzig, Germany; UK Centre for Ecology & Hydrology, Benson Lane, Wallingford, OX10 8BB, UK; Helmholtz Centre for Environmental Research - UFZ, Permoserstrasse 15, 04318, Leipzig, Germany; Friedrich Schiller University Jena, Institute of Biodiversity, Dornburger Strasse 159, 07743, Jena, Germany.