1. Paulet J, Molina A, Beltzung B, Suzumura T, Yamamoto S, Sueur C. Deep learning for automatic facial detection and recognition in Japanese macaques: illuminating social networks. Primates 2024; 65:265-279. [PMID: 38758427] [DOI: 10.1007/s10329-024-01137-5]
Abstract
Individual identification plays a pivotal role in ecology and ethology, notably as a tool for understanding complex social structures. However, traditional identification methods often rely on invasive physical tags and can prove both disruptive for animals and time-intensive for researchers. In recent years, the integration of deep learning into research has offered new methodological perspectives through the automation of complex tasks. Researchers increasingly harness object detection and recognition technologies to identify individuals in video footage. This study represents a preliminary exploration into the development of a non-invasive tool for face detection and individual identification of Japanese macaques (Macaca fuscata) through deep learning. The ultimate goal of this research is to use the identifications made on the dataset to automatically generate a social network representation of the studied population. The current main results are promising: (i) a Japanese macaque face detector (Faster R-CNN model) reaching an accuracy of 82.2%, and (ii) an individual recogniser for the Kōjima Island macaque population (YOLOv8n model) reaching an accuracy of 83%. We also created a Kōjima population social network by traditional methods, based on co-occurrences in videos, providing a benchmark against which the automatically generated network will be assessed for reliability. These preliminary results are a testament to the potential of this approach to provide the scientific community with a tool for tracking individuals and studying social networks in Japanese macaques.
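The "traditional" co-occurrence network mentioned here can be sketched in a few lines. The example below is illustrative only (hypothetical clip data and function name, not the authors' code): each pair of individuals identified in the same video gains one unit of edge weight.

```python
from itertools import combinations
from collections import Counter

def cooccurrence_network(video_ids):
    """Build weighted edges from per-video sets of identified individuals.

    video_ids: list of sets, one per video clip, each containing the
    individuals recognised in that clip (hypothetical example data).
    """
    edges = Counter()
    for individuals in video_ids:
        for a, b in combinations(sorted(individuals), 2):
            edges[(a, b)] += 1  # one co-occurrence per shared clip
    return dict(edges)

videos = [{"A", "B"}, {"A", "B", "C"}, {"B", "C"}]
print(cooccurrence_network(videos))
# → {('A', 'B'): 2, ('A', 'C'): 1, ('B', 'C'): 2}
```

The resulting edge weights can be fed directly into any graph library to visualise or analyse the network.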
Affiliation(s)
- Julien Paulet
  - Université Jean Monnet, Saint-Etienne, France
  - Wildlife Research Center, Kyoto University, Kyoto, Japan
- Axel Molina
  - Ecole Normale Supérieure, Université PSL, Paris, France
- Shinya Yamamoto
  - Wildlife Research Center, Kyoto University, Kyoto, Japan
  - Kyoto University Institute for Advanced Study, Kyoto, Japan
- Cédric Sueur
  - Université de Strasbourg, IPHC UMR7178, CNRS, Strasbourg, France
  - ANTHROPO-LAB, ETHICS EA 7446, Université Catholique de Lille, Lille, France
  - Institut Universitaire de France, Paris, France
2. Czupryna AM, Estepho M, Lugelo A, Bigambo M, Sambo M, Changalucha J, Lushasi KS, Rooyakkers P, Hampson K, Lankester F. Testing novel facial recognition technology to identify dogs during vaccination campaigns. Sci Rep 2023; 13:22025. [PMID: 38086911] [PMCID: PMC10716125] [DOI: 10.1038/s41598-023-49522-2]
Abstract
A lack of methods to identify individual animals can be a barrier to zoonoses control. We developed and field-tested facial recognition technology for a mobile phone application to identify dogs, which we used to assess vaccination coverage against rabies in rural Tanzania. Dogs were vaccinated, registered using the application, and microchipped. During subsequent household visits to validate vaccination, dogs were registered using the application and their vaccination status determined by operators using the application to classify dogs as vaccinated (matched) or unvaccinated (unmatched), with microchips validating classifications. From 534 classified dogs (251 vaccinated, 283 unvaccinated), the application specificity was 98.9% and sensitivity 76.2%, with positive and negative predictive values of 98.4% and 82.8% respectively. The facial recognition algorithm correctly matched 249 (99.2%) vaccinated and microchipped dogs (true positives) and failed to match two (0.8%) vaccinated dogs (false negatives). Operators correctly identified 186 (74.1%) vaccinated dogs (true positives) and 280 (98.9%) unvaccinated dogs (true negatives), but incorrectly classified 58 (23.1%) vaccinated dogs as unmatched (false negatives). Reduced application sensitivity resulted from poor-quality photos and light-associated color distortion. With development and operator training, this technology has the potential to be a useful tool to identify dogs and support research and intervention programs.
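The reported application metrics follow directly from the confusion counts in the abstract; the sketch below recomputes them. The false-positive count of 3 is inferred here (283 unvaccinated minus 280 true negatives) and is an assumption, not a figure stated in the abstract.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-test metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # matched among vaccinated
        "specificity": tn / (tn + fp),  # unmatched among unvaccinated
        "ppv": tp / (tp + fp),          # vaccinated among matched
        "npv": tn / (tn + fn),          # unvaccinated among unmatched
    }

# Operator-level counts from the abstract: 186 vaccinated dogs matched (TP),
# 58 missed (FN), 280 unvaccinated unmatched (TN), 3 inferred FP.
m = diagnostic_metrics(tp=186, fp=3, tn=280, fn=58)
print({k: round(v * 100, 1) for k, v in m.items()})
# → {'sensitivity': 76.2, 'specificity': 98.9, 'ppv': 98.4, 'npv': 82.8}
```

The recomputed values match the percentages quoted in the abstract.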
Affiliation(s)
- Anna Maria Czupryna
  - Boyd Orr Centre for Population and Ecosystem Health, School of Biodiversity, One Health and Veterinary Medicine, University of Glasgow, Glasgow, G12 8QQ, UK
  - Environmental Health and Ecological Sciences Thematic Group, Ifakara Health Institute, P.O. Box 78373, Dar es Salaam, Tanzania
- Mike Estepho
  - PiP My Pet Technologies, Vancouver, British Columbia, Canada
- Ahmed Lugelo
  - Environmental Health and Ecological Sciences Thematic Group, Ifakara Health Institute, P.O. Box 78373, Dar es Salaam, Tanzania
  - Global Animal Health Tanzania, P.O. Box 1642, Arusha, Tanzania
  - Department of Veterinary Medicine and Public Health, Sokoine University of Agriculture, P.O. Box 3105, Morogoro, Tanzania
- Maganga Sambo
  - Boyd Orr Centre for Population and Ecosystem Health, School of Biodiversity, One Health and Veterinary Medicine, University of Glasgow, Glasgow, G12 8QQ, UK
  - Environmental Health and Ecological Sciences Thematic Group, Ifakara Health Institute, P.O. Box 78373, Dar es Salaam, Tanzania
- Joel Changalucha
  - Environmental Health and Ecological Sciences Thematic Group, Ifakara Health Institute, P.O. Box 78373, Dar es Salaam, Tanzania
  - Global Animal Health Tanzania, P.O. Box 1642, Arusha, Tanzania
- Kennedy Selestin Lushasi
  - Environmental Health and Ecological Sciences Thematic Group, Ifakara Health Institute, P.O. Box 78373, Dar es Salaam, Tanzania
  - Department of Global Health and Biomedical Sciences, Nelson Mandela African Institute of Science and Technology, Arusha, Tanzania
- Katie Hampson
  - Boyd Orr Centre for Population and Ecosystem Health, School of Biodiversity, One Health and Veterinary Medicine, University of Glasgow, Glasgow, G12 8QQ, UK
- Felix Lankester
  - Global Animal Health Tanzania, P.O. Box 1642, Arusha, Tanzania
  - Paul G. Allen School for Global Health, Washington State University, Pullman, WA, 99164, USA
3. Tecot SR, Birr M, Dixon J, Lahitsara JP, Razafindraibe D, Razanajatovo S, Arroyo AS, Tombotiana AV, Velontsara JB, Baden AL. Functional relationships between estradiol and paternal care in male red-bellied lemurs, Eulemur rubriventer. Horm Behav 2023; 150:105324. [PMID: 36774699] [DOI: 10.1016/j.yhbeh.2023.105324]
Abstract
Fathers contribute substantially to infant care, yet the mechanisms facilitating paternal bonding and interactions with infants are not as well understood as they are in mothers. Several hormonal changes occur as males transition into parenthood, first in response to a partner's pregnancy, and next in response to interacting with the newborn. These changes may prepare fathers for parenting and help facilitate and maintain paternal care. Experimental studies with monkeys and rodents suggest that paternal care requires elevated estradiol levels, which increase when a male's partner is pregnant and are higher in fathers than non-fathers, but estradiol's role in the expression of paternal behaviors throughout infant development is unknown. To assess estradiol's role in paternal care, we analyzed the relationship between paternal estradiol metabolites and 1) offspring age, and 2) paternal care behavior (holding, carrying, huddling, playing, grooming), in wild red-bellied lemurs (Eulemur rubriventer). We collected 146 fecal samples and 1597 h of behavioral data on 10 adult males who had newborn infants during the study. Estradiol metabolites increased four-fold in expectant males, and in new fathers they fluctuated and gradually decreased with time. Infant age, not paternal behavior, best predicted hormone levels in new fathers. These results suggest that hormonal changes occur in expectant males with facultative paternal care, but they do not support the hypothesis that estradiol is directly associated with the day-to-day expression of paternal care. Future research should explore estradiol's role in facilitating behaviors, including infant-directed attention and responsiveness, or in preparing fathers for infant care generally.
Affiliation(s)
- Stacey R Tecot
  - School of Anthropology, University of Arizona, Tucson, AZ 85721, USA
  - Laboratory for the Evolutionary Endocrinology of Primates, University of Arizona, Tucson, AZ 85721, USA
- Madalena Birr
  - School of Anthropology, University of Arizona, Tucson, AZ 85721, USA
  - Laboratory for the Evolutionary Endocrinology of Primates, University of Arizona, Tucson, AZ 85721, USA
  - Department of Ecology and Evolutionary Biology, University of Arizona, Tucson, AZ 85721, USA
- Juliana Dixon
  - School of Anthropology, University of Arizona, Tucson, AZ 85721, USA
  - Laboratory for the Evolutionary Endocrinology of Primates, University of Arizona, Tucson, AZ 85721, USA
- Soafaniry Razanajatovo
  - Department of Zoology and Animal Biodiversity, University of Antananarivo, Antananarivo, Madagascar
- Alicia S Arroyo
  - Institute of Evolutionary Biology (IBE-UPF CSIC), Barcelona, Spain
- Andrea L Baden
  - PhD programs in Anthropology and Biology, The Graduate Center of the City University of New York, New York, NY 10016, USA
  - New York Consortium in Evolutionary Primatology (NYCEP), New York, NY, USA
  - Department of Anthropology, Hunter College of the City University of New York, New York, NY 10065, USA
4. An experiment on animal re-identification from video. Ecol Inform 2023. [DOI: 10.1016/j.ecoinf.2023.101994]
5. Shi C, Xu J, Roberts NJ, Liu D, Jiang G. Individual automatic detection and identification of big cats with the combination of different body parts. Integr Zool 2023; 18:157-168. [PMID: 35276755] [DOI: 10.1111/1749-4877.12641]
Abstract
The development of facial recognition technology has made it an increasingly powerful tool for individual recognition of wild animals. In this paper, we develop an automatic detection and recognition method that combines body features of big cats, based on deep convolutional neural networks (CNNs). Our dataset comprises 12,244 images of 47 individual Amur tigers (Panthera tigris altaica) at the Siberian Tiger Park, collected with mobile phones and a digital camera, and 1,940 images and videos of 12 individual wild Amur leopards (Panthera pardus orientalis) from infrared cameras. First, the single shot multibox detector (SSD) algorithm performs automatic detection of feature regions in each image. For the different feature regions, such as face stripes or spots, CNNs and multi-layer perceptron models were applied to identify tiger and leopard individuals independently. Our results show that identification accuracy for Amur tigers reaches 93.27% for the front of the face, 93.33% for the right body stripe, and 93.46% for the left body stripe. The combination of right face, left body stripe, and right body stripe achieves the highest accuracy, 95.55%. Combining different body parts can therefore improve individual identification accuracy, but accuracy does not simply increase with the number of parts: the three-part combination performed best. For Amur leopards, identification accuracy reaches 86.90% for the front of the face, 89.13% for left body spots, and 88.33% for right body spots; here, combinations scored lower than the best single part, so combining the face with body spots did not help, and the most effective cue remains the left or right body spots alone. The method can be applied to long-term monitoring of big cats, including big-data analysis of animal behavior, and can help with individual identification of other wildlife species.
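One simple way to combine per-part classifiers of the kind described here is score-level fusion, i.e. averaging each part's probability vector over the candidate individuals before taking the argmax. The sketch below is a hypothetical illustration of that idea, not the authors' model; the part names and probabilities are invented.

```python
def fuse_part_scores(part_probs):
    """Average per-part probability vectors over candidate individuals.

    part_probs: dict mapping a body part (e.g. 'face', 'left_stripe')
    to a list of probabilities, one per known individual.
    Returns (index of best individual, fused probability vector).
    """
    n = len(next(iter(part_probs.values())))
    fused = [0.0] * n
    for probs in part_probs.values():
        for i, p in enumerate(probs):
            fused[i] += p / len(part_probs)  # equal weight per part
    return fused.index(max(fused)), fused

scores = {
    "face":         [0.6, 0.3, 0.1],
    "left_stripe":  [0.5, 0.4, 0.1],
    "right_stripe": [0.7, 0.2, 0.1],
}
best, fused = fuse_part_scores(scores)
print(best, [round(f, 2) for f in fused])
# → 0 [0.6, 0.3, 0.1]
```

As the abstract notes, adding more parts does not always help; in a fusion scheme like this, a weak part can dilute a strong one.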
Affiliation(s)
- Chunmei Shi
  - College of Mathematics and Computer Science, Zhejiang Agriculture and Forestry University, Hangzhou, China
  - Department of Mathematics, School of Science, Northeast Forestry University, Harbin, China
- Jing Xu
  - Department of Mathematics, School of Science, Northeast Forestry University, Harbin, China
- Nathan James Roberts
  - Feline Research Center, National Forestry and Grassland Administration, College of Wildlife and Protected Areas, Northeast Forestry University, Harbin, China
- Dan Liu
  - Siberian Tiger Park, Harbin, China
- Guangshun Jiang
  - Feline Research Center, National Forestry and Grassland Administration, College of Wildlife and Protected Areas, Northeast Forestry University, Harbin, China
6. Brookes O, Gray S, Bennett P, Burgess KV, Clark FE, Roberts E, Burghardt T. Evaluating Cognitive Enrichment for Zoo-Housed Gorillas Using Facial Recognition. Front Vet Sci 2022; 9:886720. [PMID: 35664848] [PMCID: PMC9161820] [DOI: 10.3389/fvets.2022.886720]
Abstract
The use of computer technology within zoos is becoming increasingly popular to help achieve high animal welfare standards. However, despite its various positive applications to wildlife in recent years, there has been little uptake of machine learning in zoo animal care. In this paper, we describe how a facial recognition system, developed using machine learning, was embedded within a cognitive enrichment device (a vertical, modular finger maze) for a troop of seven Western lowland gorillas (Gorilla gorilla gorilla) at Bristol Zoo Gardens, UK. We explored whether machine learning could automatically identify individual gorillas through facial recognition, and automate the collection of device-use data including the order, frequency and duration of use by the troop. Concurrent traditional video recording and behavioral coding by eye was undertaken for comparison. The facial recognition system was very effective at identifying individual gorillas (97% mean average precision) and could automate specific downstream tasks (for example, duration of engagement). However, its development was a heavy investment, requiring specialized hardware and interdisciplinary expertise. Therefore, we suggest a system like this is only appropriate for long-term projects. Additionally, researcher input was still required to visually identify which maze modules were being used by gorillas and how. This highlights the need for additional technology, such as infrared sensors, to fully automate cognitive enrichment evaluation. Finally, we describe a future system that combines machine learning and sensor technology which could automate the collection of data in real-time for use by researchers and animal care staff.
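A downstream task like "duration of engagement" can be derived from timestamped face identifications. The sketch below is not the authors' pipeline: the detections, individual labels, and gap threshold are hypothetical, and it simply merges nearby detections of the same individual into continuous bouts.

```python
def engagement_durations(detections, gap=5.0):
    """Sum per-individual engagement time from timestamped face IDs.

    detections: list of (timestamp_seconds, individual) tuples from a
    recogniser; consecutive detections of the same individual closer
    than `gap` seconds are treated as one continuous bout.
    """
    last_seen, totals = {}, {}
    for t, who in sorted(detections):
        if who in last_seen and t - last_seen[who] <= gap:
            totals[who] = totals.get(who, 0.0) + (t - last_seen[who])
        last_seen[who] = t
    return totals

dets = [(0, "gorilla_1"), (2, "gorilla_1"), (4, "gorilla_1"),
        (30, "gorilla_1"), (31, "gorilla_1")]
print(engagement_durations(dets))
# → {'gorilla_1': 5.0}   (the 26 s gap between 4 and 30 is not counted)
```

The gap threshold trades off tolerance to missed frames against merging genuinely separate visits.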
Affiliation(s)
- Otto Brookes
  - Department of Computer Science, Faculty of Engineering, University of Bristol, Bristol, United Kingdom
  - Correspondence: Otto Brookes
- Stuart Gray
  - Centre for Entrepreneurship, Faculty of Engineering, University of Bristol, Bristol, United Kingdom
- Peter Bennett
  - Department of Computer Science, Faculty of Engineering, University of Bristol, Bristol, United Kingdom
- Katy V. Burgess
  - School of Psychological Science, Faculty of Life Sciences, University of Bristol, Bristol, United Kingdom
- Fay E. Clark
  - School of Psychological Science, Faculty of Life Sciences, University of Bristol, Bristol, United Kingdom
  - School of Life Sciences, Faculty of Science and Engineering, Anglia Ruskin University, Cambridge, United Kingdom
- Elisabeth Roberts
  - Bristol Vet School, Faculty of Life Sciences, University of Bristol, Bristol, United Kingdom
- Tilo Burghardt
  - Department of Computer Science, Faculty of Engineering, University of Bristol, Bristol, United Kingdom
7. Birenbaum Z, Do H, Horstmyer L, Orff H, Ingram K, Ay A. SEALNET: Facial recognition software for ecological studies of harbor seals. Ecol Evol 2022; 12:e8851. [PMID: 35505998] [PMCID: PMC9047973] [DOI: 10.1002/ece3.8851]
Abstract
Methods for long-term monitoring of coastal species such as harbor seals (Phoca vitulina) are often costly, time-consuming, and highly invasive, underscoring the need for improved techniques for data collection and analysis. Here, we propose the use of automated facial recognition technology for identification of individual seals and demonstrate its utility in ecological and population studies. We created a software package, SealNet, that automates photo identification of seals, using a graphical user interface (GUI) to detect, align, and chip seal faces from photographs and a deep convolutional neural network (CNN) suitable for small datasets (e.g., 100 seals with five photos per seal) to classify individual seals. We piloted the SealNet technology with a population of harbor seals located within Casco Bay on the coast of Maine, USA. Across two years of sampling, 2019 and 2020, at seven haul-out sites in Middle Bay, we obtained a dataset optimized for the development and testing of SealNet. We processed 1752 images representing 408 individual seals and achieved 88% Rank-1 and 96% Rank-5 accuracy in closed-set seal identification. In identifying individual seals, SealNet outperformed a similar face recognition method, PrimNet, developed for primates but retrained on seals. The ease and volume of image-data processing afforded by SealNet make it a valuable tool for ecological and behavioral studies of marine mammals in the developing field of conservation technology.
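Rank-1 and Rank-5 accuracy, as reported above, simply ask whether the true identity appears among the top-k candidates the classifier returns for each query photo. A minimal sketch with hypothetical ranked outputs:

```python
def rank_k_accuracy(ranked_predictions, true_labels, k):
    """Fraction of queries whose true identity appears in the top-k
    candidates returned by the classifier."""
    hits = sum(
        truth in preds[:k]
        for preds, truth in zip(ranked_predictions, true_labels)
    )
    return hits / len(true_labels)

# Hypothetical ranked candidate lists for three query photos:
preds = [["s1", "s2", "s3"], ["s4", "s1", "s2"], ["s2", "s5", "s4"]]
truth = ["s1", "s1", "s4"]
print(rank_k_accuracy(preds, truth, 1), rank_k_accuracy(preds, truth, 3))
# → 0.3333333333333333 1.0
```

Rank-5 is always at least as high as Rank-1, which is why the two are reported together.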
Affiliation(s)
- Zach Birenbaum
  - Department of Computer Science, Colgate University, Hamilton, New York, USA
- Hieu Do
  - Department of Computer Science, Colgate University, Hamilton, New York, USA
  - Department of Mathematics, Colgate University, Hamilton, New York, USA
- Hailey Orff
  - Department of Biology, Colgate University, Hamilton, New York, USA
- Krista Ingram
  - Department of Biology, Colgate University, Hamilton, New York, USA
- Ahmet Ay
  - Department of Mathematics, Colgate University, Hamilton, New York, USA
  - Department of Biology, Colgate University, Hamilton, New York, USA
8. An Adaptive Embedding Network with Spatial Constraints for the Use of Few-Shot Learning in Endangered-Animal Detection. ISPRS Int J Geo-Inf 2022. [DOI: 10.3390/ijgi11040256]
Abstract
Image recording is now ubiquitous in the fields of endangered-animal conservation and GIS. However, endangered animals are rarely seen, and thus only a few image samples of them are available. In particular, the study of endangered-animal detection has a vital spatial component. We propose an adaptive few-shot learning approach to endangered-animal detection through data augmentation, applying constraints on the mixture of foreground and background images based on species distributions. First, the pre-trained salient-object network U2-Net segments the foregrounds and backgrounds of images of endangered animals. Then, the pre-trained image completion network CR-Fill is used to repair the incomplete environment. Furthermore, our approach mixes foregrounds and backgrounds from different images to produce multiple new image examples, using a relation network to keep the mixtures realistic. It does not require further supervision, and it is easy to embed into existing networks, which learn to compensate for the uncertainties and nonstationarities of few-shot learning. Our experimental results hold up across different evaluation metrics and point to the future potential of video surveillance for endangered-animal detection in studies of behavior and conservation.
9. Ueno M, Kabata R, Hayashi H, Terada K, Yamada K. Automatic individual recognition of Japanese macaques (Macaca fuscata) from sequential images. Ethology 2022. [DOI: 10.1111/eth.13277]
Affiliation(s)
- Masataka Ueno
  - Faculty of Applied Sociology, Kindai University, Osaka, Japan
- Ryosuke Kabata
  - Graduate School of Natural Science and Technology, Gifu University, Gifu, Japan
- Hidetaka Hayashi
  - Graduate School of Natural Science and Technology, Gifu University, Gifu, Japan
- Kazunori Yamada
  - Graduate School of Human Sciences, Osaka University, Osaka, Japan
10. Evaluating likelihood-based photogrammetry for individual recognition of four species of northern ungulates. Mamm Biol 2022. [DOI: 10.1007/s42991-021-00223-1]
12.
Abstract
Observing and quantifying primate behavior in the wild is challenging. Human presence affects primate behavior and habituation of new, especially terrestrial, individuals is a time-intensive process that carries with it ethical and health concerns, especially during the recent pandemic when primates are at even greater risk than usual. As a result, wildlife researchers, including primatologists, have increasingly turned to new technologies to answer questions and provide important data related to primate conservation. Tools and methods should be chosen carefully to maximize and improve the data that will be used to answer the research questions. We review here the role of four indirect methods—camera traps, acoustic monitoring, drones, and portable field labs—and improvements in machine learning that offer rapid, reliable means of combing through large datasets that these methods generate. We describe key applications and limitations of each tool in primate conservation, and where we anticipate primate conservation technology moving forward in the coming years.
13. Vidal M, Wolf N, Rosenberg B, Harris BP, Mathis A. Perspectives on Individual Animal Identification from Biology and Computer Vision. Integr Comp Biol 2021; 61:900-916. [PMID: 34050741] [PMCID: PMC8490693] [DOI: 10.1093/icb/icab107]
Abstract
Identifying individual animals is crucial for many biological investigations. In response to some of the limitations of current identification methods, new automated computer vision approaches have emerged with strong performance. Here, we review current advances of computer vision identification techniques to provide both computer scientists and biologists with an overview of the available tools and discuss their applications. We conclude by offering recommendations for starting an animal identification project, illustrate current limitations, and propose how they might be addressed in the future.
Affiliation(s)
- Maxime Vidal
  - School of Life Sciences, Brain Mind Institute, Swiss Federal Institute of Technology (EPFL), Chemin des Mines 9, 1202 Geneva, Switzerland
  - Center for Neuroprosthetics, Center for Intelligent Systems, Swiss Federal Institute of Technology (EPFL), Chemin des Mines 9, 1202 Geneva, Switzerland
- Nathan Wolf
  - Fisheries, Aquatic Science, and Technology Laboratory, Alaska Pacific University, 4101 University Drive, Anchorage, Alaska 99508, USA
- Beth Rosenberg
  - Fisheries, Aquatic Science, and Technology Laboratory, Alaska Pacific University, 4101 University Drive, Anchorage, Alaska 99508, USA
- Bradley P Harris
  - Fisheries, Aquatic Science, and Technology Laboratory, Alaska Pacific University, 4101 University Drive, Anchorage, Alaska 99508, USA
- Alexander Mathis
  - School of Life Sciences, Brain Mind Institute, Swiss Federal Institute of Technology (EPFL), Chemin des Mines 9, 1202 Geneva, Switzerland
  - Center for Neuroprosthetics, Center for Intelligent Systems, Swiss Federal Institute of Technology (EPFL), Chemin des Mines 9, 1202 Geneva, Switzerland
14. Happy Cow or Thinking Pig? WUR Wolf—Facial Coding Platform for Measuring Emotions in Farm Animals. AI 2021. [DOI: 10.3390/ai2030021]
Abstract
Emotions play an indicative and informative role in the investigation of farm animal behaviors. Systems that can measure and respond to emotions provide a natural user interface in enabling the digitalization of animal welfare platforms. The faces of farm animals can be one of the richest channels for expressing emotions. WUR Wolf (Wageningen University & Research: Wolf Mascot), a real-time facial recognition platform that can automatically code the emotions of farm animals, is presented in this study. The developed Python-based algorithms detect and track the facial features of cows and pigs, analyze the appearance, ear postures, and eye white regions, and correlate these with the mental/emotional states of the farm animals. The system is trained on a dataset of facial features from images of farm animals collected on over six farms and has been optimized to operate with an average accuracy of 85%. From these, the emotional states of animals in real time are determined. The software detects 13 facial actions and nine inferred emotional states, including whether the animal is aggressive, calm, or neutral. A real-time emotion recognition system based on YoloV3, a Faster YoloV4-based facial detection platform, and an ensemble of convolutional neural networks (RCNN) is presented. Detecting facial features of farm animals simultaneously in real time enables many new interfaces for automated decision-making tools for livestock farmers. Emotion sensing offers vast potential for improving animal welfare and animal–human interactions.
16. Clapham M, Miller E, Nguyen M, Darimont CT. Automated facial recognition for wildlife that lack unique markings: A deep learning approach for brown bears. Ecol Evol 2020; 10:12883-12892. [PMID: 33304501] [PMCID: PMC7713984] [DOI: 10.1002/ece3.6840]
Abstract
Emerging technologies support a new era of applied wildlife research, generating data on scales from individuals to populations. Computer vision methods can process large datasets generated through image-based techniques by automating the detection and identification of species and individuals. With the exception of primates, however, there are no objective visual methods of individual identification for species that lack unique and consistent body markings. We apply deep learning approaches to facial recognition using object detection, landmark detection, a similarity comparison network, and a support vector machine (SVM)-based classifier to identify individuals in a representative species, the brown bear Ursus arctos. Our open-source application, BearID, detects a bear's face in an image, rotates and extracts the face, creates an "embedding" for the face, and uses the embedding to classify the individual. We trained and tested the application using labeled images of 132 known individuals collected from British Columbia, Canada, and Alaska, USA. Based on 4,674 images, with an 80/20% split for training and testing, respectively, we achieved a facial detection (ability to find a face) average precision of 0.98 and an individual classification (ability to identify the individual) accuracy of 83.9%. BearID and its annotated source code provide a replicable methodology for applying deep learning methods of facial recognition to many other species that lack distinguishing markings. Further analyses of performance should focus on the influence of certain parameters on recognition accuracy, such as age and body size. Combining BearID with camera trapping could facilitate fine-scale behavioral research such as individual spatiotemporal activity patterns, and a cost-effective method of population monitoring through mark-recapture studies, with implications for species and landscape conservation and management. Applications to practical conservation include identifying problem individuals in human-wildlife conflicts and evaluating intrapopulation variation in the efficacy of conservation strategies, such as wildlife crossings.
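BearID classifies face embeddings with an SVM; as a simpler illustration of the same embed-then-classify idea, the sketch below assigns a query embedding to the gallery individual with the highest cosine similarity. The 2-D vectors and bear names are hypothetical (real face embeddings are much higher-dimensional), and this nearest-match rule is a stand-in, not the authors' classifier.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def identify(query, gallery):
    """Match a face embedding against a gallery of known individuals.

    gallery: dict mapping individual id to a reference embedding.
    """
    return max(gallery, key=lambda name: cosine(query, gallery[name]))

gallery = {"bear_a": [0.9, 0.1], "bear_b": [0.1, 0.9]}
print(identify([0.8, 0.3], gallery))
# → bear_a
```

An SVM trained on labeled embeddings plays the same role but learns decision boundaries rather than comparing against stored references.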
Affiliation(s)
- Melanie Clapham
  - BearID Project, Sooke, BC, Canada
  - Department of Geography, University of Victoria, Victoria, BC, Canada
- Chris T. Darimont
  - Department of Geography, University of Victoria, Victoria, BC, Canada
  - Raincoast Conservation Foundation, Bella Bella, BC, Canada
17. Camera traps provide a robust alternative to direct observations for constructing social networks of wild chimpanzees. Anim Behav 2019. [DOI: 10.1016/j.anbehav.2019.08.008]
18. Schofield D, Nagrani A, Zisserman A, Hayashi M, Matsuzawa T, Biro D, Carvalho S. Chimpanzee face recognition from videos in the wild using deep learning. Sci Adv 2019; 5:eaaw0736. [PMID: 31517043] [PMCID: PMC6726454] [DOI: 10.1126/sciadv.aaw0736]
Abstract
Video recording is now ubiquitous in the study of animal behavior, but its analysis on a large scale is prohibited by the time and resources needed to manually process large volumes of data. We present a deep convolutional neural network (CNN) approach that provides a fully automated pipeline for face detection, tracking, and recognition of wild chimpanzees from long-term video records. In a 14-year dataset yielding 10 million face images from 23 individuals over 50 hours of footage, we obtained an overall accuracy of 92.5% for identity recognition and 96.2% for sex recognition. Using the identified faces, we generated co-occurrence matrices to trace changes in the social network structure of an aging population. The tools we developed enable easy processing and annotation of video datasets, including those from other species. Such automated analysis unveils the future potential of large-scale longitudinal video archives to address fundamental questions in behavior and conservation.
Affiliation(s)
- Daniel Schofield
  - Primate Models for Behavioural Evolution Lab, Institute of Cognitive and Evolutionary Anthropology, University of Oxford, Oxford, UK
- Arsha Nagrani
  - Visual Geometry Group, Department of Engineering Science, University of Oxford, Oxford, UK
- Andrew Zisserman
  - Visual Geometry Group, Department of Engineering Science, University of Oxford, Oxford, UK
- Misato Hayashi
  - Primate Research Institute, Kyoto University, Inuyama, Japan
- Dora Biro
  - Department of Zoology, University of Oxford, Oxford, UK
- Susana Carvalho
  - Primate Models for Behavioural Evolution Lab, Institute of Cognitive and Evolutionary Anthropology, University of Oxford, Oxford, UK
  - Gorongosa National Park, Sofala, Mozambique
  - Interdisciplinary Center for Archaeology and Evolution of Human Behaviour (ICArEHB), Universidade do Algarve, Faro, Portugal
  - Centre for Functional Ecology–Science for People & the Planet, Universidade de Coimbra, Coimbra, Portugal
19. Tecot SR, Baden AL. Profiling caregivers: Hormonal variation underlying allomaternal care in wild red-bellied lemurs, Eulemur rubriventer. Physiol Behav 2018; 193:135-148. [DOI: 10.1016/j.physbeh.2017.12.007]
20. Kuncheva LI, Constance JH. Restricted Set Classification with prior probabilities: A case study on chessboard recognition. Pattern Recognit Lett 2018. [DOI: 10.1016/j.patrec.2018.04.018]