1
Mimura K, Matsumoto J, Mochihashi D, Nakamura T, Nishijo H, Higuchi M, Hirabayashi T, Minamimoto T. Unsupervised decomposition of natural monkey behavior into a sequence of motion motifs. Commun Biol 2024; 7:1080. PMID: 39227400; PMCID: PMC11371840; DOI: 10.1038/s42003-024-06786-2.
Abstract
Nonhuman primates (NHPs) exhibit complex and diverse behavior that typifies advanced cognitive function and social communication, but quantitative and systematic measurement of this natural nonverbal processing has been a technical challenge. Specifically, a method is required to automatically segment time series of behavior into elemental motion motifs, much like finding meaningful words in character strings. Here, we propose a solution called SyntacticMotionParser (SMP), a general-purpose unsupervised behavior parsing algorithm using a nonparametric Bayesian model. Using three-dimensional posture-tracking data from NHPs, SMP automatically outputs an optimized sequence of latent motion motifs classified into the most likely number of states. When applied to behavioral datasets from common marmosets and rhesus monkeys, SMP outperformed conventional posture-clustering models and detected a set of behavioral ethograms from publicly available data. SMP also quantified and visualized the behavioral effects of chemogenetic neural manipulations. SMP thus has the potential to dramatically improve our understanding of natural NHP behavior in a variety of contexts.
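The full SMP model is a nonparametric Bayesian sequence parser; as a rough, hedged illustration of the underlying idea (segmenting a pose time series into a sequence of discrete latent motion states), a minimal sketch using a fixed-size Gaussian HMM from hmmlearn is shown below. The pose_features array and the choice of eight states are placeholder assumptions for illustration, not part of SMP.

```python
# Minimal sketch: segment a pose time series into latent motion states.
# This is NOT the SMP nonparametric Bayesian model; it is a simplified
# stand-in using a fixed-size Gaussian HMM to illustrate the idea.
import numpy as np
from hmmlearn import hmm

# Assumed input: frames x features (e.g., 3D joint coordinates per frame).
pose_features = np.random.randn(5000, 24)  # placeholder data

model = hmm.GaussianHMM(n_components=8, covariance_type="diag", n_iter=100)
model.fit(pose_features)

# Most likely state sequence: each frame gets a motif label.
motifs = model.predict(pose_features)

# Collapse consecutive identical labels into (motif, start, end) segments.
segments = []
start = 0
for i in range(1, len(motifs) + 1):
    if i == len(motifs) or motifs[i] != motifs[start]:
        segments.append((int(motifs[start]), start, i))
        start = i
print(segments[:5])
```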
Affiliation(s)
- Koki Mimura
  - Advanced Neuroimaging Center, National Institutes for Quantum Science and Technology, Chiba, 263-8555, Japan
  - Research Center for Medical and Health Data Science, The Institute of Statistical Mathematics, Tokyo, 190-0014, Japan
- Jumpei Matsumoto
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama, 930-8555, Japan
  - Research Center for Idling Brain Science, University of Toyama, Toyama, 930-8555, Japan
- Daichi Mochihashi
  - Department of Statistical Inference and Mathematics, The Institute of Statistical Mathematics, Tokyo, 190-9562, Japan
- Tomoaki Nakamura
  - Department of Mechanical Engineering and Intelligent Systems, The University of Electro-Communications, Tokyo, 182-8585, Japan
- Hisao Nishijo
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama, 930-8555, Japan
  - Research Center for Idling Brain Science, University of Toyama, Toyama, 930-8555, Japan
- Makoto Higuchi
  - Advanced Neuroimaging Center, National Institutes for Quantum Science and Technology, Chiba, 263-8555, Japan
- Toshiyuki Hirabayashi
  - Advanced Neuroimaging Center, National Institutes for Quantum Science and Technology, Chiba, 263-8555, Japan
- Takafumi Minamimoto
  - Advanced Neuroimaging Center, National Institutes for Quantum Science and Technology, Chiba, 263-8555, Japan
2
Gris VN, Crespo TR, Kaneko A, Okamoto M, Suzuki J, Teramae JN, Miyabe-Nishiwaki T. Deep Learning for Face Detection and Pain Assessment in Japanese macaques (Macaca fuscata). J Am Assoc Lab Anim Sci 2024; 63:403-411. PMID: 38428929; PMCID: PMC11270042; DOI: 10.30802/aalas-jaalas-23-000056.
Abstract
Facial expressions have increasingly been used to assess emotional states in mammals. The recognition of pain in research animals is essential for their well-being and leads to more reliable research outcomes. Automating this process could contribute to early pain diagnosis and treatment. Artificial neural networks have become a popular option for image classification tasks in recent years due to the development of deep learning. In this study, we investigated the ability of a deep learning model to detect pain in Japanese macaques based on their facial expressions. Thirty to 60 min of video footage from Japanese macaques undergoing laparotomy was used in the study. Macaques were recorded undisturbed in their cages before surgery (No Pain) and one day after the surgery, before scheduled analgesia (Pain). Videos were processed for facial detection and image extraction with the algorithms RetinaFace (adding a bounding box around the face for image extraction) or Mask R-CNN (contouring the face for extraction). A ResNet50 network was trained on 75% of the images; the remaining 25% were used for testing. Test accuracy varied from 48 to 54% after box extraction. The low accuracy of classification after box extraction was likely due to the incorporation of features that were not relevant for pain (for example, background, illumination, skin color, or objects in the enclosure). However, with contour extraction, image preprocessing, and fine-tuning, the network achieved 64% generalization accuracy. These results suggest that Mask R-CNN can be used for facial feature extraction and that the performance of the classifying model is relatively accurate for nonannotated single-frame images.
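As a hedged illustration of the classification step described above (fine-tuning an ImageNet-pretrained ResNet50 on extracted face crops with a 75/25 train/test split), a minimal PyTorch sketch follows. The directory layout, epoch count, and learning rate are assumptions for illustration, not the authors' published configuration.

```python
# Minimal sketch: fine-tune ResNet50 on face crops labeled Pain / No Pain.
# Assumed folder layout: faces/train/{pain,no_pain}/*.jpg and faces/test/...
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("faces/train", transform=tfm)
test_set = datasets.ImageFolder("faces/test", transform=tfm)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=32)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: pain / no pain

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    model.train()
    for x, y in train_loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

model.eval()
correct = total = 0
with torch.no_grad():
    for x, y in test_loader:
        correct += (model(x).argmax(1) == y).sum().item()
        total += y.numel()
print(f"test accuracy: {correct / total:.2f}")
```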
Affiliation(s)
- Jun-Nosuke Teramae
  - Department of Advanced Mathematical Sciences, Graduate School of Informatics, Kyoto University, Kyoto, Japan
3
Burchardt LS, van de Sande Y, Kehy M, Gamba M, Ravignani A, Pouw W. A toolkit for the dynamic study of air sacs in siamang and other elastic circular structures. PLoS Comput Biol 2024; 20:e1012222. PMID: 38913743; PMCID: PMC11226135; DOI: 10.1371/journal.pcbi.1012222.
Abstract
Biological structures are defined by rigid elements, such as bones, and elastic elements, like muscles and membranes. Computer vision advances have enabled automatic tracking of moving animal skeletal poses. Such developments provide insights into complex time-varying dynamics of biological motion. Conversely, the elastic soft tissues of organisms, such as the nose of elephant seals or the buccal sac of frogs, are poorly studied, and no computer vision methods have been proposed for them. This leaves major gaps in different areas of biology. In primatology, most critically, the function of air sacs is widely debated; many open questions on the role of air sacs in the evolution of animal communication, including human speech, remain unanswered. To support the dynamic study of soft-tissue structures, we present a toolkit for the automated tracking of semi-circular elastic structures in biological video data. The toolkit contains unsupervised computer vision tools (using the Hough transform) and supervised deep learning (by adapting DeepLabCut) methodology to track inflation of laryngeal air sacs or other biological spherical objects (e.g., gular cavities). Confirming the value of elastic kinematic analysis, we show that air sac inflation correlates with acoustic markers that likely inform about body size. Finally, we present a pre-processed audiovisual-kinematic dataset of 7+ hours of closeup audiovisual recordings of siamang (Symphalangus syndactylus) singing. This toolkit (https://github.com/WimPouw/AirSacTracker) aims to revitalize the study of non-skeletal morphological structures across multiple species.
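The unsupervised part of the toolkit relies on the Hough transform to find the roughly circular, inflating air sac in each video frame. A minimal, hedged OpenCV sketch of that idea is shown below; the file name, blur setting, and radius bounds are placeholders, and the actual toolkit (linked above) may differ.

```python
# Minimal sketch: track a roughly circular structure (e.g., an inflating
# air sac) across video frames with a Hough circle transform.
import cv2
import numpy as np

cap = cv2.VideoCapture("siamang_clip.mp4")  # placeholder file name
radii = []  # estimated air-sac radius per frame

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 7)  # smooth to suppress fur/texture edges
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
        param1=100, param2=40, minRadius=20, maxRadius=200,
    )
    if circles is None:
        radii.append(np.nan)        # no circle found in this frame
    else:
        x, y, r = circles[0][0]     # strongest detection
        radii.append(float(r))

cap.release()
print("mean radius (px):", np.nanmean(radii))
```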
Affiliation(s)
- Lara S. Burchardt
  - Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, Netherlands
  - Leibniz-Zentrum Allgemeine Sprachwissenschaft, Berlin, Germany
- Yana van de Sande
  - Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, Netherlands
- Mounia Kehy
  - Equipe de Neuro-Ethologie Sensorielle, Université Jean Monnet, France
- Marco Gamba
  - Department of Life Sciences and Systems Biology, University of Turin, Turin, Italy
- Andrea Ravignani
  - Comparative Bioacoustics Group, Max Planck Institute for Psycholinguistics, Nijmegen, Netherlands
  - Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music, Aarhus, Denmark
  - Department of Human Neurosciences, Sapienza University of Rome, Rome, Italy
- Wim Pouw
  - Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, Netherlands
4
Håkansson J, Quinn BL, Shultz AL, Swartz SM, Corcoran AJ. Application of a novel deep learning-based 3D videography workflow to bat flight. Ann N Y Acad Sci 2024; 1536:92-106. PMID: 38652595; DOI: 10.1111/nyas.15143.
Abstract
Studying the detailed biomechanics of flying animals requires accurate three-dimensional coordinates for key anatomical landmarks. Traditionally, this relies on manually digitizing animal videos, a labor-intensive task that scales poorly with increasing framerates and numbers of cameras. Here, we present a workflow that combines deep learning-powered automatic digitization with filtering and correction of mislabeled points using quality metrics from deep learning and 3D reconstruction. We tested our workflow using a particularly challenging scenario: bat flight. First, we documented four bats flying steadily in a 2 m³ wind tunnel test section. Wing kinematic parameters resulting from manually digitizing bats with markers applied to anatomical landmarks were not significantly different from those resulting from applying our workflow to the same bats without markers for five out of six parameters. Second, we compared coordinates from manual digitization against those yielded via our workflow for bats flying freely in a 344 m³ enclosure. Average distance between coordinates from our workflow and those from manual digitization was less than a millimeter larger than the average human-to-human coordinate distance. The improved efficiency of our workflow has the potential to increase the scalability of studies on animal flight biomechanics.
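The workflow described above filters automatically digitized 2D points using network confidence and 3D-reconstruction quality. A hedged sketch of that filtering idea for a two-camera setup is given below; the projection matrices, confidence threshold, and reprojection-error threshold are illustrative assumptions, not the paper's actual parameters.

```python
# Minimal sketch: triangulate a landmark from two calibrated cameras and
# reject detections with low network confidence or high reprojection error.
import numpy as np
import cv2

def reprojection_error(P, X_hom, xy):
    """Pixel distance between a projected 3D point and its 2D detection."""
    proj = P @ X_hom
    proj = proj[:2] / proj[2]
    return float(np.linalg.norm(proj - xy))

# Assumed inputs (placeholders): 3x4 camera projection matrices and, per
# frame, a 2D detection plus confidence from the pose network in each view.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.3], [0.0], [0.0]])])
det1 = {"xy": np.array([412.0, 233.0]), "conf": 0.93}
det2 = {"xy": np.array([398.0, 241.0]), "conf": 0.88}

CONF_MIN, REPROJ_MAX = 0.8, 5.0  # assumed thresholds (confidence, pixels)

if det1["conf"] < CONF_MIN or det2["conf"] < CONF_MIN:
    point3d = None  # drop low-confidence detections
else:
    X = cv2.triangulatePoints(P1, P2, det1["xy"].reshape(2, 1),
                              det2["xy"].reshape(2, 1))  # 4x1 homogeneous
    err = max(reprojection_error(P1, X[:, 0], det1["xy"]),
              reprojection_error(P2, X[:, 0], det2["xy"]))
    point3d = None if err > REPROJ_MAX else (X[:3, 0] / X[3, 0])

print(point3d)
```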
Affiliation(s)
- Jonas Håkansson
  - Department of Biology, University of Colorado Colorado Springs, Colorado Springs, Colorado, USA
- Brooke L Quinn
  - Department of Ecology, Evolution, and Organismal Biology, Brown University, Providence, Rhode Island, USA
- Abigail L Shultz
  - Department of Biology, University of Colorado Colorado Springs, Colorado Springs, Colorado, USA
- Sharon M Swartz
  - Department of Ecology, Evolution, and Organismal Biology, Brown University, Providence, Rhode Island, USA
  - School of Engineering, Brown University, Providence, Rhode Island, USA
- Aaron J Corcoran
  - Department of Biology, University of Colorado Colorado Springs, Colorado Springs, Colorado, USA
5
Ardoin T, Sueur C. Automatic identification of stone-handling behaviour in Japanese macaques using LabGym artificial intelligence. Primates 2024; 65:159-172. PMID: 38520479; DOI: 10.1007/s10329-024-01123-x.
Abstract
The latest advances in artificial intelligence technology have opened doors to the video analysis of complex behaviours. In light of this, ethologists are actively exploring the potential of these innovations to streamline the time-intensive behavioural analysis process using video data. Several tools have been developed for this purpose in primatology in the past decade. Nonetheless, each tool grapples with technical constraints. To address these limitations, we have established a comprehensive protocol designed to harness the capabilities of cutting-edge artificial intelligence-assisted software, LabGym. The primary objective of this study was to evaluate the suitability of LabGym for the analysis of primate behaviour, focusing on Japanese macaques as our model subjects. First, we developed a model that accurately detects Japanese macaques, allowing us to analyse their actions using LabGym. Our behavioural analysis model succeeded in recognising stone-handling-like behaviours on video. However, the absence of quantitative data within the specified time frame limits the ability of our study to draw definitive conclusions regarding the quality of the behavioural analysis. Nevertheless, to the best of our knowledge, this study represents the first instance of applying the LabGym tool specifically to the analysis of primate behaviours, with our model focusing on the automated recognition and categorisation of specific behaviours in Japanese macaques. It lays the groundwork for future research in this promising field, in which our model can be extended using the latest version of LabGym and associated tools, such as multi-class detection and interactive behaviour analysis.
Affiliation(s)
- Théo Ardoin
  - Master Biodiversité Ecologie Et Evolution, Université Paris-Saclay, Orsay, France
  - Magistère de Biologie, Université Paris-Saclay, Orsay, France
- Cédric Sueur
  - Université de Strasbourg, IPHC UMR7178, CNRS, Strasbourg, France
  - ANTHROPO-LAB, ETHICS EA 7446, Université Catholique de Lille, Lille, France
  - Institut Universitaire de France, Paris, France
6
Testard C, Tremblay S, Parodi F, DiTullio RW, Acevedo-Ithier A, Gardiner KL, Kording K, Platt ML. Neural signatures of natural behaviour in socializing macaques. Nature 2024; 628:381-390. PMID: 38480888; DOI: 10.1038/s41586-024-07178-6.
Abstract
Our understanding of the neurobiology of primate behaviour largely derives from artificial tasks in highly controlled laboratory settings, overlooking most natural behaviours that primate brains evolved to produce [1-3]. How primates navigate the multidimensional social relationships that structure daily life [4] and shape survival and reproductive success [5] remains largely unclear at the single-neuron level. Here we combine ethological analysis, computer vision and wireless recording technologies to identify neural signatures of natural behaviour in unrestrained, socially interacting pairs of rhesus macaques. Single-neuron and population activity in the prefrontal and temporal cortex robustly encoded 24 species-typical behaviours, as well as social context. Male-female partners demonstrated near-perfect reciprocity in grooming, a key behavioural mechanism supporting friendships and alliances [6], and neural activity maintained a running account of these social investments. Confronted with an aggressive intruder, behavioural and neural population responses reflected empathy and were buffered by the presence of a partner. Our findings reveal a highly distributed neurophysiological ledger of social dynamics, a potential computational foundation supporting communal life in primate societies, including our own.
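As a hedged illustration of the population-decoding idea summarized above (predicting which species-typical behaviour is occurring from simultaneously recorded neural activity), a minimal scikit-learn sketch follows. The binned spike-count matrix, the label vector, and the classifier choice are illustrative assumptions, not the authors' analysis pipeline.

```python
# Minimal sketch: decode behaviour labels from binned population activity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed inputs: spike counts (time bins x neurons) and, per bin, one of
# 24 behaviour labels obtained from the ethogram annotation.
rng = np.random.default_rng(0)
spike_counts = rng.poisson(3.0, size=(2000, 150)).astype(float)  # placeholder
behaviour = rng.integers(0, 24, size=2000)                       # placeholder

decoder = make_pipeline(
    StandardScaler(),
    LogisticRegression(max_iter=2000),
)

# 5-fold cross-validated decoding accuracy; chance is roughly 1/24.
scores = cross_val_score(decoder, spike_counts, behaviour, cv=5)
print(f"decoding accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```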
Affiliation(s)
- Camille Testard
  - Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
  - Department of Molecular and Cellular Biology, Harvard University, Cambridge, MA, USA
- Sébastien Tremblay
  - Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
  - Department of Psychiatry & Neuroscience, Université Laval, Québec, Québec, Canada
- Felipe Parodi
  - Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
- Ron W DiTullio
  - Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
- Kristin L Gardiner
  - Department of Pathobiology, University of Pennsylvania, Philadelphia, PA, USA
- Konrad Kording
  - Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
  - Department of Bioengineering, University of Pennsylvania, Philadelphia, PA, USA
- Michael L Platt
  - Department of Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
  - Department of Marketing, University of Pennsylvania, Philadelphia, PA, USA
  - Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
7
Ha LJ, Kim M, Yeo HG, Baek I, Kim K, Lee M, Lee Y, Choi HJ. Development of an assessment method for freely moving nonhuman primates' eating behavior using manual and deep learning analysis. Heliyon 2024; 10:e25561. PMID: 38356587; PMCID: PMC10865331; DOI: 10.1016/j.heliyon.2024.e25561.
Abstract
Purpose: Although eating is imperative for survival, few comprehensive methods have been developed to assess freely moving nonhuman primates' eating behavior. In the current study, we distinguished eating behavior into appetitive and consummatory phases and developed nine indices to study them using manual and deep learning-based (DeepLabCut) techniques.
Method: The indices were applied to three rhesus macaques under different palatability and hunger conditions to validate their utility. To execute the experiment, we designed an eating behavior cage and manufactured artificial food. The total number of trials was three, with one trial conducted using natural food and two trials using artificial food.
Result: The indices of highest utility for the hunger effect were approach frequency and consummatory duration. The appetitive composite score and consummatory duration showed the highest utility for the palatability effect. To elucidate the effects of hunger and palatability, we developed 2D visualization plots based on the manual indices. These 2D visualization methods could intuitively depict palatability perception and the internal hunger state. Furthermore, the deep learning-based analysis proved accurate and comparable with the manual analysis. When comparing the time required for analysis, the deep learning-based analysis was 24 times faster than the manual analysis. Moreover, temporal and spatial dynamics were visualized via manual and deep learning-based analysis. Based on the temporal dynamics analysis, the patterns were classified into four categories: early decline, steady decline, mid-peak with early incline, and late decline. Heatmaps of spatial dynamics and trajectory-related visualizations revealed a consumption posture and a higher spatial occupancy of the food zone under hunger and with palatable food.
Discussion: Collectively, this study describes a newly developed and validated multi-phase method for assessing freely moving nonhuman primate eating behavior using manual and deep learning-based analyses. These effective tools will prove valuable in research on food reward (palatability effect) and homeostasis (hunger effect).
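As a hedged sketch of how indices such as approach frequency and food-zone occupancy can be derived from tracked head coordinates (e.g., DeepLabCut output), see below; the column names, zone geometry, and frame rate are hypothetical placeholders rather than the authors' definitions.

```python
# Minimal sketch: compute approach frequency and food-zone occupancy from
# tracked head coordinates (one row per video frame).
import numpy as np
import pandas as pd

FPS = 30
FOOD_ZONE_CENTER = np.array([250.0, 400.0])  # assumed zone center (pixels)
FOOD_ZONE_RADIUS = 80.0                      # assumed zone radius (pixels)

# Placeholder tracking table with hypothetical column names.
track = pd.DataFrame({
    "head_x": np.random.uniform(0, 640, 9000),
    "head_y": np.random.uniform(0, 480, 9000),
})

dist = np.hypot(track["head_x"] - FOOD_ZONE_CENTER[0],
                track["head_y"] - FOOD_ZONE_CENTER[1])
in_zone = (dist < FOOD_ZONE_RADIUS).to_numpy()

# Occupancy: seconds spent inside the food zone.
occupancy_s = in_zone.sum() / FPS

# Approach frequency: number of outside-to-inside transitions.
approaches = int(np.sum(~in_zone[:-1] & in_zone[1:]))

print(f"food-zone occupancy: {occupancy_s:.1f} s, approaches: {approaches}")
```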
Affiliation(s)
- Leslie Jaesun Ha
  - Department of Biomedical Sciences, Wide River Institute of Immunology, Neuroscience Research Institute, Seoul National University College of Medicine, Republic of Korea
- Meelim Kim
  - Department of Biomedical Sciences, Wide River Institute of Immunology, Neuroscience Research Institute, Seoul National University College of Medicine, Republic of Korea
  - Department of Preventive Medicine, Yonsei University College of Medicine, Seoul, Republic of Korea
  - Center for Wireless and Population Health Systems (CWPHS), University of California, San Diego, La Jolla, CA, 92093, USA
  - Herbert Wertheim School of Public Health and Human Longevity Science, University of California San Diego, San Diego, CA, United States
- Hyeon-Gu Yeo
  - National Primate Research Center, Korea Research Institute of Bioscience and Biotechnology (KRIBB), Republic of Korea
  - KRIBB School of Bioscience, Korea National University of Science and Technology, Republic of Korea
- Inhyeok Baek
  - Department of Biomedical Sciences, Wide River Institute of Immunology, Neuroscience Research Institute, Seoul National University College of Medicine, Republic of Korea
- Keonwoo Kim
  - National Primate Research Center, Korea Research Institute of Bioscience and Biotechnology (KRIBB), Republic of Korea
  - School of Life Sciences, BK21 Plus KNU Creative BioResearch Group, Kyungpook National University, Republic of Korea
- Miwoo Lee
  - Department of Biomedical Sciences, Wide River Institute of Immunology, Neuroscience Research Institute, Seoul National University College of Medicine, Republic of Korea
- Youngjeon Lee
  - National Primate Research Center, Korea Research Institute of Bioscience and Biotechnology (KRIBB), Republic of Korea
  - KRIBB School of Bioscience, Korea National University of Science and Technology, Republic of Korea
- Hyung Jin Choi
  - Department of Biomedical Sciences, Wide River Institute of Immunology, Neuroscience Research Institute, Seoul National University College of Medicine, Republic of Korea
8
Desai N, Bala P, Richardson R, Raper J, Zimmermann J, Hayden B. OpenApePose, a database of annotated ape photographs for pose estimation. eLife 2023; 12:RP86873. PMID: 38078902; PMCID: PMC10712952; DOI: 10.7554/elife.86873.
Abstract
Because of their close relationship with humans, non-human apes (chimpanzees, bonobos, gorillas, orangutans, and gibbons, including siamangs) are of great scientific interest. The goal of understanding their complex behavior would be greatly advanced by the ability to perform video-based pose tracking. Tracking, however, requires high-quality annotated datasets of ape photographs. Here we present OpenApePose, a new public dataset of 71,868 photographs, annotated with 16 body landmarks of six ape species in naturalistic contexts. We show that a standard deep net (HRNet-W48) trained on ape photos can reliably track out-of-sample ape photos better than networks trained on monkeys (specifically, the OpenMonkeyPose dataset) and on humans (COCO) can. This trained network can track apes almost as well as the other networks can track their respective taxa, and models trained without one of the six ape species can track the held-out species better than the monkey and human models can. Ultimately, the results of our analyses highlight the importance of large, specialized databases for animal tracking systems and confirm the utility of our new ape database.
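A hedged sketch of the kind of out-of-sample evaluation described above, scoring predicted landmarks against annotations with a PCK-style metric (percentage of correct keypoints within a fraction of a reference length), is given below; the normalization by bounding-box size and the 0.1 threshold are common conventions assumed here, not necessarily the paper's exact protocol.

```python
# Minimal sketch: PCK-style accuracy of predicted vs. annotated landmarks.
import numpy as np

def pck(pred, gt, bbox_sizes, thresh=0.1):
    """pred, gt: (n_images, n_keypoints, 2); bbox_sizes: (n_images,).

    A keypoint counts as correct if its error is below thresh * bbox size.
    """
    err = np.linalg.norm(pred - gt, axis=-1)           # (n_images, n_kpts)
    correct = err < thresh * bbox_sizes[:, None]
    return correct.mean()

# Placeholder data: 100 images, 16 landmarks each.
rng = np.random.default_rng(1)
gt = rng.uniform(0, 512, size=(100, 16, 2))
pred = gt + rng.normal(0, 6, size=gt.shape)            # simulated predictions
bbox = rng.uniform(150, 400, size=100)                 # per-image box size

print(f"PCK@0.1: {pck(pred, gt, bbox):.3f}")
```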
Affiliation(s)
- Nisarg Desai
  - Department of Neuroscience and Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, United States
- Praneet Bala
  - Department of Computer Science, University of Minnesota, Minneapolis, United States
- Rebecca Richardson
  - Emory National Primate Research Center, Emory University, Atlanta, United States
- Jessica Raper
  - Emory National Primate Research Center, Emory University, Atlanta, United States
- Jan Zimmermann
  - Department of Neuroscience and Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, United States
- Benjamin Hayden
  - Department of Neuroscience and Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, United States
9
Testard C, Tremblay S, Parodi F, DiTullio RW, Acevedo-Ithier A, Gardiner K, Kording KP, Platt M. Neural signatures of natural behavior in socializing macaques. bioRxiv (preprint) 2023:2023.07.05.547833. PMID: 37461580; PMCID: PMC10349985; DOI: 10.1101/2023.07.05.547833.
Abstract
Our understanding of the neurobiology of primate behavior largely derives from artificial tasks in highly-controlled laboratory settings, overlooking most natural behaviors primate brains evolved to produce [1]. In particular, how primates navigate the multidimensional social relationships that structure daily life and shape survival and reproductive success remains largely unexplored at the single neuron level. Here, we combine ethological analysis with new wireless recording technologies to uncover neural signatures of natural behavior in unrestrained, socially interacting pairs of rhesus macaques within a larger colony. Population decoding of single neuron activity in prefrontal and temporal cortex unveiled robust encoding of 24 species-typical behaviors, which was strongly modulated by the presence and identity of surrounding monkeys. Male-female partners demonstrated near-perfect reciprocity in grooming, a key behavioral mechanism supporting friendships and alliances, and neural activity maintained a running account of these social investments. When confronted with an aggressive intruder, behavioral and neural population responses reflected empathy and were buffered by the presence of a partner. Surprisingly, neural signatures in prefrontal and temporal cortex were largely indistinguishable and irreducible to visual and motor contingencies. By employing an ethological approach to the study of primate neurobiology, we reveal a highly-distributed neurophysiological record of social dynamics, a potential computational foundation supporting communal life in primate societies, including our own.
10
Butler DJ, Keim AP, Ray S, Azim E. Large-scale capture of hidden fluorescent labels for training generalizable markerless motion capture models. Nat Commun 2023; 14:5866. PMID: 37752123; PMCID: PMC10522643; DOI: 10.1038/s41467-023-41565-3.
Abstract
Deep learning-based markerless tracking has revolutionized studies of animal behavior. Yet the generalizability of trained models tends to be limited, as new training data typically needs to be generated manually for each setup or visual environment. With each model trained from scratch, researchers track distinct landmarks and analyze the resulting kinematic data in idiosyncratic ways. Moreover, due to inherent limitations in manual annotation, only a sparse set of landmarks are typically labeled. To address these issues, we developed an approach, which we term GlowTrack, for generating orders of magnitude more training data, enabling models that generalize across experimental contexts. We describe: a) a high-throughput approach for producing hidden labels using fluorescent markers; b) a multi-camera, multi-light setup for simulating diverse visual conditions; and c) a technique for labeling many landmarks in parallel, enabling dense tracking. These advances lay a foundation for standardized behavioral pipelines and more complete scrutiny of movement.
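A hedged sketch of the core labeling idea, detecting a fluorescent marker in a UV-illuminated frame and using its centroid as a "hidden" landmark label for the paired visible-light frame, follows; the synthetic frame and the HSV colour range are placeholder assumptions, not GlowTrack's actual pipeline.

```python
# Minimal sketch: locate a fluorescent marker in a UV-illuminated frame and
# use its centroid as a training label for the paired visible-light frame.
import cv2
import numpy as np

# Synthetic stand-in for a UV-illuminated frame: a dark image with one
# bright green fluorescent spot (real frames would come from the camera).
uv_frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.circle(uv_frame, (320, 200), 12, (0, 255, 0), -1)

# Assumed HSV range for the dye's emission colour.
hsv = cv2.cvtColor(uv_frame, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

m = cv2.moments(mask)
if m["m00"] > 0:
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    # (cx, cy) becomes the landmark annotation for the paired visible-light
    # frame, e.g. one row in a training set for a pose-estimation network.
    print(f"hidden label at ({cx:.1f}, {cy:.1f})")
else:
    print("no fluorescent label detected")
```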
Affiliation(s)
- Daniel J Butler
  - Molecular Neurobiology Laboratory, Salk Institute for Biological Studies, 10010 N. Torrey Pines Road, La Jolla, CA, 92037, USA
- Alexander P Keim
  - Molecular Neurobiology Laboratory, Salk Institute for Biological Studies, 10010 N. Torrey Pines Road, La Jolla, CA, 92037, USA
- Shantanu Ray
  - Molecular Neurobiology Laboratory, Salk Institute for Biological Studies, 10010 N. Torrey Pines Road, La Jolla, CA, 92037, USA
- Eiman Azim
  - Molecular Neurobiology Laboratory, Salk Institute for Biological Studies, 10010 N. Torrey Pines Road, La Jolla, CA, 92037, USA
11
Li C, Xiao Z, Li Y, Chen Z, Ji X, Liu Y, Feng S, Zhang Z, Zhang K, Feng J, Robbins TW, Xiong S, Chen Y, Xiao X. Deep learning-based activity recognition and fine motor identification using 2D skeletons of cynomolgus monkeys. Zool Res 2023; 44:967-980. PMID: 37721106; PMCID: PMC10559098; DOI: 10.24272/j.issn.2095-8137.2022.449.
Abstract
Video-based action recognition is becoming a vital tool in clinical research and neuroscientific study for disorder detection and prediction. However, action recognition currently used in non-human primate (NHP) research relies heavily on intense manual labor and lacks standardized assessment. In this work, we established two standard benchmark datasets of NHPs in the laboratory: MonkeyinLab (MiL), which includes 13 categories of actions and postures, and MiL2D, which includes sequences of two-dimensional (2D) skeleton features. Furthermore, based on recent methodological advances in deep learning and skeleton visualization, we introduced the MonkeyMonitorKit (MonKit) toolbox for automatic action recognition, posture estimation, and identification of fine motor activity in monkeys. Using the datasets and MonKit, we evaluated the daily behaviors of wild-type cynomolgus monkeys within their home cages and experimental environments and compared these observations with the behaviors exhibited by cynomolgus monkeys possessing mutations in the MECP2 gene as a disease model of Rett syndrome (RTT). MonKit was used to assess motor function, stereotyped behaviors, and depressive phenotypes, with the outcomes compared with human manual detection. MonKit established consistent criteria for identifying behavior in NHPs with high accuracy and efficiency, thus providing a novel and comprehensive tool for assessing phenotypic behavior in monkeys.
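As a hedged illustration of classifying actions from sequences of 2D skeleton keypoints (the kind of input MonKit operates on), a minimal PyTorch LSTM classifier is sketched below; the keypoint count, sequence length, and 13 action classes echo the description above, but the architecture itself is a simplified stand-in, not MonKit.

```python
# Minimal sketch: classify an action from a sequence of 2D skeleton frames.
import torch
import torch.nn as nn

N_KEYPOINTS, SEQ_LEN, N_CLASSES = 17, 64, 13  # assumed dimensions

class SkeletonActionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=N_KEYPOINTS * 2, hidden_size=128,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(128, N_CLASSES)

    def forward(self, x):
        # x: (batch, SEQ_LEN, N_KEYPOINTS, 2) -> flatten keypoints per frame
        x = x.flatten(2)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # classify from the last time step

model = SkeletonActionNet()
dummy = torch.randn(8, SEQ_LEN, N_KEYPOINTS, 2)  # placeholder batch
logits = model(dummy)
print(logits.shape)  # (8, 13): one score per action category
```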
Affiliation(s)
- Chuxi Li
  - School of Information Science and Technology, Micro Nano System Center, Fudan University, Shanghai 200433, China
- Zifan Xiao
  - Department of Anesthesiology, Huashan Hospital
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education
  - Behavioral and Cognitive Neuroscience Center, Institute of Science and Technology for Brain-Inspired Intelligence, MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
- Yerong Li
  - School of Information Science and Technology, Micro Nano System Center, Fudan University, Shanghai 200433, China
- Zhinan Chen
  - School of Information Science and Technology, Micro Nano System Center, Fudan University, Shanghai 200433, China
- Xun Ji
  - Kuang Yaming Honors School, Nanjing University, Nanjing, Jiangsu 210023, China
- Yiqun Liu
  - Shanghai Key Laboratory of Intelligent Information Processing, School of Computer Science, Fudan University, Shanghai 200433, China
- Shufei Feng
  - State Key Laboratory of Primate Biomedical Research
  - Institute of Primate Translational Medicine, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
- Zhen Zhang
  - State Key Laboratory of Primate Biomedical Research
  - Institute of Primate Translational Medicine, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
- Kaiming Zhang
  - New Vision World LLC., Aliso Viejo, California 92656, USA
- Jianfeng Feng
  - Department of Anesthesiology, Huashan Hospital
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education
  - Behavioral and Cognitive Neuroscience Center, Institute of Science and Technology for Brain-Inspired Intelligence, MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
- Trevor W Robbins
  - Department of Anesthesiology, Huashan Hospital
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education
  - Behavioral and Cognitive Neuroscience Center, Institute of Science and Technology for Brain-Inspired Intelligence, MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
  - Behavioural and Clinical Neuroscience Institute, University of Cambridge, Cambridge, CB2 1TN, UK
- Shisheng Xiong
  - School of Information Science and Technology, Micro Nano System Center, Fudan University, Shanghai 200433, China
- Yongchang Chen
  - State Key Laboratory of Primate Biomedical Research
  - Institute of Primate Translational Medicine, Kunming University of Science and Technology, Kunming, Yunnan 650500, China
- Xiao Xiao
  - Department of Anesthesiology, Huashan Hospital
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education
  - Behavioral and Cognitive Neuroscience Center, Institute of Science and Technology for Brain-Inspired Intelligence, MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
12
Salgirli Demirbas Y, Isparta S, Saral B, Keskin Yılmaz N, Adıay D, Matsui H, Töre-Yargın G, Musa SA, Atilgan D, Öztürk H, Kul BC, Şafak CE, Ocklenburg S, Güntürkün O. Acute and chronic stress alter behavioral laterality in dogs. Sci Rep 2023; 13:4092. PMID: 36906713; PMCID: PMC10008577; DOI: 10.1038/s41598-023-31213-7.
Abstract
Dogs are one of the key animal species for investigating the biological mechanisms of behavioral laterality. Cerebral asymmetries are assumed to be influenced by stress, but this subject has not yet been studied in dogs. This study aims to investigate the effect of stress on laterality in dogs by using two different motor laterality tests: the Kong™ Test and a Food-Reaching Test (FRT). Motor laterality of chronically stressed (n = 28) and emotionally/physically healthy dogs (n = 32) was determined in two different environments, i.e., a home environment and a stressful open field test (OFT) environment. Physiological parameters, including salivary cortisol, respiratory rate, and heart rate, were measured for each dog under both conditions. Cortisol results showed that acute stress induction by the OFT was successful. A shift towards ambilaterality was detected in dogs after acute stress. Results also showed a significantly lower absolute laterality index in the chronically stressed dogs. Moreover, the direction of the first paw used in the FRT was a good predictor of an animal's general paw preference. Overall, these results provide evidence that both acute and chronic stress exposure can change behavioral asymmetries in dogs.
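The laterality measures referred to above are typically computed as a laterality index from right- and left-paw use counts. A short sketch of the conventional index, LI = (R - L) / (R + L), and its absolute value is given below as an illustration; the counts are hypothetical, and the formula is the standard one rather than a quotation from this paper.

```python
# Minimal sketch: laterality index from paw-use counts in a reaching test.
def laterality_index(right_uses: int, left_uses: int) -> float:
    """LI = (R - L) / (R + L); +1 = fully right-pawed, -1 = fully left-pawed."""
    total = right_uses + left_uses
    if total == 0:
        raise ValueError("no paw uses recorded")
    return (right_uses - left_uses) / total

# Example: one dog's Food-Reaching Test counts (hypothetical numbers).
li = laterality_index(right_uses=34, left_uses=16)
abs_li = abs(li)  # strength of lateralization regardless of direction
print(f"LI = {li:.2f}, |LI| = {abs_li:.2f}")
```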
Affiliation(s)
- Sevim Isparta
  - Biopsychology, Department of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Bochum, Germany
  - Department of Genetics, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Begum Saral
  - Department of Physiology, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Nevra Keskin Yılmaz
  - Department of Internal Medicine, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Deniz Adıay
  - Department of Internal Medicine, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Hiroshi Matsui
  - Center for Human Nature, Artificial Intelligence, and Neuroscience, Hokkaido University, Hokkaido, Japan
- Gülşen Töre-Yargın
  - Department of Industrial Design, Middle East Technical University, Ankara, Turkey
- Saad Adam Musa
  - Department of Physiology, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Durmus Atilgan
  - Department of Physiology, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Hakan Öztürk
  - Department of Physiology, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Bengi Cinar Kul
  - Department of Genetics, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- C Etkin Şafak
  - Department of Physiology, Faculty of Veterinary Medicine, Ankara University, Ankara, Turkey
- Sebastian Ocklenburg
  - Biopsychology, Department of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Bochum, Germany
  - Department of Psychology, Medical School Hamburg, Hamburg, Germany
  - ICAN Institute for Cognitive and Affective Neuroscience, Medical School Hamburg, Hamburg, Germany
- Onur Güntürkün
  - Biopsychology, Department of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Bochum, Germany
13
Liang F, Yu S, Pang S, Wang X, Jie J, Gao F, Song Z, Li B, Liao WH, Yin M. Non-human primate models and systems for gait and neurophysiological analysis. Front Neurosci 2023; 17:1141567. PMID: 37188006; PMCID: PMC10175625; DOI: 10.3389/fnins.2023.1141567.
Abstract
Brain-computer interfaces (BCIs) have garnered extensive interest and become a groundbreaking technology to restore movement, tactile sense, and communication in patients. Prior to their use in human subjects, clinical BCIs require rigorous validation and verification (V&V). Non-human primates (NHPs) are often considered the ultimate and widely used animal model for neuroscience studies, including BCI V&V, due to their proximity to humans. This literature review summarizes 94 NHP gait analysis studies published up to 1 June 2022, including seven BCI-oriented studies. Due to technological limitations, most of these studies used wired neural recordings to access electrophysiological data. Wireless neural recording systems, which have enabled neuroscience research in humans and many studies of NHP locomotion, still pose numerous technical challenges, such as signal quality, data throughput, working distance, size, and power constraints, that have yet to be overcome. Besides neurological data, motion capture (MoCap) systems are usually required in BCI and gait studies to capture locomotion kinematics. However, current studies have exclusively relied on image processing-based MoCap systems, which have insufficient accuracy (error: ≥4° and 9 mm). While the role of the motor cortex during locomotion is still unclear and worth further exploration, future BCI and gait studies require simultaneous, high-speed, and accurate neurophysiological and movement measures. Therefore, an infrared MoCap system, which has high accuracy and speed, together with a high-spatiotemporal-resolution neural recording system, may expand the scope and improve the quality of motor and neurophysiological analysis in NHPs.
Affiliation(s)
- Fengyan Liang
  - Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou, China
  - Department of Rehabilitation Medicine, Affiliated Haikou Hospital of Xiangya Medical College, Central South University, Haikou, China
- Shanshan Yu
  - Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou, China
- Siqi Pang
  - Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou, China
- Xiao Wang
  - Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou, China
- Jing Jie
  - Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou, China
- Fei Gao
  - Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Zhenhua Song
  - Department of Rehabilitation Medicine, Affiliated Haikou Hospital of Xiangya Medical College, Central South University, Haikou, China
- Binbin Li
  - Department of Rehabilitation Medicine, Affiliated Haikou Hospital of Xiangya Medical College, Central South University, Haikou, China
- Wei-Hsin Liao
  - Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Shatin, China
- Ming Yin (corresponding author)
  - Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou, China
14
Negrete SB, Arai H, Natsume K, Shibata T. Multi-view image-based behavior classification of wet-dog shake in Kainate rat model. Front Behav Neurosci 2023; 17:1148549. PMID: 37200783; PMCID: PMC10187480; DOI: 10.3389/fnbeh.2023.1148549.
Abstract
The wet-dog shake behavior (WDS) is a short-duration behavior relevant to the study of various animal disease models, including acute seizures, morphine abstinence, and nicotine withdrawal. However, no animal behavior detection system has included WDS. In this work, we present a multi-view animal behavior detection system based on image classification and use it to detect rats' WDS behavior. Our system uses a novel time-multi-view fusion scheme that does not rely on artificial features (feature engineering) and can be flexibly adapted to other animals and behaviors. It can use one or more views for higher accuracy. We tested our framework by classifying WDS behavior in rats and compared the results using different numbers of cameras. Our results show that the use of additional views increases the performance of WDS behavioral classification. With three cameras, we achieved a precision of 0.91 and a recall of 0.86. Our multi-view animal behavior detection system represents the first system capable of detecting WDS and has potential applications in various animal disease models.
15
Deep MAnTra: deep learning-based multi-animal tracking for Japanese macaques. Artificial Life and Robotics 2022. DOI: 10.1007/s10015-022-00837-9.
16
Weber RZ, Mulders G, Kaiser J, Tackenberg C, Rust R. Deep learning-based behavioral profiling of rodent stroke recovery. BMC Biol 2022; 20:232. PMID: 36243716; PMCID: PMC9571460; DOI: 10.1186/s12915-022-01434-9.
Abstract
Background: Stroke research heavily relies on rodent behavior when assessing underlying disease mechanisms and treatment efficacy. Although functional motor recovery is considered the primary targeted outcome, tests in rodents are still poorly reproducible and often unsuitable for unraveling the complex behavior after injury.
Results: Here, we provide a comprehensive 3D gait analysis of mice after focal cerebral ischemia based on the deep learning-based software DeepLabCut (DLC), which only requires basic behavioral equipment. We demonstrate high-precision 3D tracking of 10 body parts (including all relevant joints and reference landmarks) in several mouse strains. Building on this rigorous motion tracking, a comprehensive post-analysis (with >100 parameters) unveils biologically relevant differences in locomotor profiles after a stroke over a time course of 3 weeks. We further refine the widely used ladder rung test using deep learning and compare its performance to human annotators. The generated DLC-assisted tests were then benchmarked against five widely used conventional behavioral set-ups (neurological scoring, rotarod, ladder rung walk, cylinder test, and single-pellet grasping) regarding sensitivity, accuracy, time use, and costs.
Conclusions: We conclude that deep learning-based motion tracking with comprehensive post-analysis provides accurate and sensitive data to describe the complex recovery of rodents following a stroke. The experimental set-up and analysis can also benefit a range of other neurological injuries that affect locomotion.
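A hedged sketch of the kind of post-analysis this describes, loading a DeepLabCut output CSV, discarding low-likelihood points, and computing one simple locomotor parameter, is given below; the file name, body-part name, likelihood cutoff, and the toy parameter are illustrative assumptions, not the paper's >100-parameter pipeline.

```python
# Minimal sketch: read a DeepLabCut output CSV (three header rows: scorer,
# bodyparts, coords), drop low-likelihood points, and compute a toy gait
# parameter (vertical paw excursion) for one body part.
import numpy as np
import pandas as pd

LIKELIHOOD_MIN = 0.9          # assumed cutoff
BODYPART = "left_hind_paw"    # assumed body-part name in the DLC project

df = pd.read_csv("gait_video_DLC.csv", header=[0, 1, 2], index_col=0)
scorer = df.columns.get_level_values(0)[0]

x = df[(scorer, BODYPART, "x")]
y = df[(scorer, BODYPART, "y")]
p = df[(scorer, BODYPART, "likelihood")]

# Mask unreliable frames instead of interpolating, to keep the sketch short.
y_clean = y.where(p >= LIKELIHOOD_MIN)

# Toy parameter: vertical excursion of the paw (image y grows downward).
excursion = y_clean.max() - y_clean.min()
print(f"frames kept: {(p >= LIKELIHOOD_MIN).mean():.0%}, "
      f"vertical paw excursion: {excursion:.1f} px")
```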
Affiliation(s)
- Rebecca Z Weber
  - Institute for Regenerative Medicine (IREM), University of Zurich, Campus Schlieren, Wagistrasse 12, 8952, Schlieren, Switzerland
  - Neuroscience Center Zurich, University of Zurich and ETH Zurich, Zurich, Switzerland
- Geertje Mulders
  - Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Julia Kaiser
  - Burke Neurological Institute, White Plains, NY, USA
- Christian Tackenberg
  - Institute for Regenerative Medicine (IREM), University of Zurich, Campus Schlieren, Wagistrasse 12, 8952, Schlieren, Switzerland
  - Neuroscience Center Zurich, University of Zurich and ETH Zurich, Zurich, Switzerland
- Ruslan Rust
  - Institute for Regenerative Medicine (IREM), University of Zurich, Campus Schlieren, Wagistrasse 12, 8952, Schlieren, Switzerland
  - Neuroscience Center Zurich, University of Zurich and ETH Zurich, Zurich, Switzerland
17
Hayden BY, Park HS, Zimmermann J. Automated pose estimation in primates. Am J Primatol 2022; 84:e23348. PMID: 34855257; PMCID: PMC9160209; DOI: 10.1002/ajp.23348.
Abstract
Understanding the behavior of primates is important for primatology, for psychology, and for biology more broadly. It is also important for biomedicine, where primates are an important model organism, and whose behavior is often an important variable of interest. Our ability to rigorously quantify behavior has, however, long been limited. On one hand, we can rigorously quantify low-information measures like preference, looking time, and reaction time; on the other, we can use more gestalt measures like behavioral categories tracked via ethogram, but at high cost and with high variability. Recent technological advances have led to a major revolution in behavioral measurement that offers affordable and scalable rigor. Specifically, digital video cameras and automated pose tracking software can provide measures of full-body position (i.e., pose) of primates over time (i.e., behavior) with high spatial and temporal resolution. Pose-tracking technology in turn can be used to infer behavioral states, such as eating, sleeping, and mating. We call this technological approach behavioral imaging. In this review, we situate the behavioral imaging revolution in the history of the study of behavior, argue for investment in and development of analytical and research techniques that can profit from the advent of the era of big behavior, and propose that primate centers and zoos will take on a more central role in relevant fields of research than they have in the past.
Affiliation(s)
- Benjamin Y. Hayden
  - Department of Neuroscience, Center for Magnetic Resonance Research, Department of Biomedical Engineering
- Hyun Soo Park
  - Department of Computer Science and Engineering, University of Minnesota, Minneapolis, MN 55455
- Jan Zimmermann
  - Department of Neuroscience, Center for Magnetic Resonance Research, Department of Biomedical Engineering
18
Kirkpatrick NJ, Butera RJ, Chang YH. DeepLabCut increases markerless tracking efficiency in X-ray video analysis of rodent locomotion. J Exp Biol 2022; 225:276294. PMID: 35950365; DOI: 10.1242/jeb.244540.
Abstract
Despite the prevalence of rat models to study human disease and injury, existing methods for quantifying behavior through skeletal movements are problematic: they suffer from skin movement inaccuracies associated with optical video analysis, or require invasive implanted markers or time-consuming manual rotoscoping for X-ray video approaches. We examined the use of a machine learning tool, DeepLabCut, to perform automated, markerless tracking in bi-planar X-ray videos of locomoting rats. Models were trained on 590 pairs of video frames to identify 19 unique skeletal landmarks of the pelvic limb. Accuracy, precision, and time savings were assessed. Machine-identified landmarks deviated from manually labeled counterparts by 2.4±0.2 mm (n=1,710 landmarks). DeepLabCut decreased analysis time by over three orders of magnitude (1,627x) compared to manual labeling. Distribution of these models may enable the processing of a large volume of accurate X-ray kinematic locomotion data in a fraction of the time without requiring surgically implanted markers.
Affiliation(s)
- Nathan J Kirkpatrick
  - Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology & Emory University, USA
- Robert J Butera
  - Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology & Emory University, USA
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, USA
- Young-Hui Chang
  - Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology & Emory University, USA
  - School of Biological Sciences, Georgia Institute of Technology, USA
19
An Attention-Refined Light-Weight High-Resolution Network for Macaque Monkey Pose Estimation. Information 2022. DOI: 10.3390/info13080356.
Abstract
The macaque monkey is a rare substitute that plays an important role for human beings in psychological and spiritual science research. It is essential for these studies to accurately estimate the pose information of macaque monkeys. Many large-scale models have achieved state-of-the-art results in macaque pose estimation, but they are difficult to deploy when computing resources are limited. Combining the structure of high-resolution networks with the design principles of light-weight networks, we propose an attention-refined light-weight high-resolution network for macaque monkey pose estimation (HR-MPE). A multi-branch parallel structure is adopted to maintain a high-resolution representation throughout the process. Moreover, a novel basic block is designed from a powerful transformer structure and polarized self-attention, giving a simple structure with few parameters. Two attention-refined blocks are added at the end of the parallel structure; they are composed of light-weight asymmetric convolutions and a triplet attention with almost no parameters, obtaining richer representation information. An unbiased data-processing method is also utilized to obtain an accurate flipping result. The experiment was conducted on a macaque dataset containing more than 13,000 pictures. Our network reaches an AP score of 77.0, surpassing HRFormer by 1.8 AP while using fewer parameters.
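As a hedged illustration of the kind of light-weight attention refinement described above, a minimal PyTorch channel-attention block is sketched below; it is a simplified squeeze-and-excitation-style module, not the paper's polarized self-attention or triplet attention, and all dimensions are placeholders.

```python
# Minimal sketch: a light-weight channel-attention block of the kind used
# to refine feature maps in pose-estimation backbones. This is a simplified
# squeeze-and-excitation-style module, not HR-MPE's exact attention design.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)      # squeeze: global context
        self.fc = nn.Sequential(                 # excite: per-channel gates
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))         # reweight feature channels

block = ChannelAttention(channels=32)
feature_map = torch.randn(2, 32, 64, 48)         # (batch, C, H, W) placeholder
print(block(feature_map).shape)                   # torch.Size([2, 32, 64, 48])
```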
20
Lutz CK, Coleman K, Hopper LM, Novak MA, Perlman JE, Pomerantz O. Nonhuman primate abnormal behavior: Etiology, assessment, and treatment. Am J Primatol 2022; 84:e23380. PMID: 35383995; DOI: 10.1002/ajp.23380.
Abstract
Across captive settings, nonhuman primates may develop an array of abnormal behaviors including stereotypic and self-injurious behavior. Abnormal behavior can indicate a state of poor welfare, since it is often associated with a suboptimal environment. However, this may not always be the case as some behaviors can develop independently of any psychological distress, be triggered in environments known to promote welfare, and be part of an animal's coping mechanism. Furthermore, not all animals develop abnormal behavior, which has led researchers to assess risk factors that differentiate individuals in the display of these behaviors. Intrinsic risk factors that have been identified include the animal's species and genetics, age, sex, temperament, and clinical condition, while environmental risk factors include variables such as the animal's rearing, housing condition, husbandry procedures, and research experiences. To identify specific triggers and at-risk animals, the expression of abnormal behavior in captive nonhuman primates should be routinely addressed in a consistent manner by appropriately trained staff. Which behaviors to assess, what assessment methods to use, which primates to monitor, and the aims of data collection should all be identified before proceeding to an intervention and/or treatment. This article provides guidance for this process, by presenting an overview of known triggers and risk factors that should be considered, steps to design a comprehensive evaluation plan, and strategies that might be used for prevention or treatment. It also outlines the tools and processes for assessing and evaluating behavior in an appendix. This process will lead to a better understanding of abnormal behavior in captive primate colonies and ultimately to improved welfare.
Affiliation(s)
- Corrine K Lutz
  - Institute for Laboratory Animal Research, The National Academies of Sciences, Engineering, and Medicine, Washington, District of Columbia, USA
- Kristine Coleman
  - Division of Comparative Medicine, Oregon National Primate Research Center, Oregon Health and Science University, Beaverton, Oregon, USA
- Lydia M Hopper
  - Department of Molecular and Comparative Pathobiology, Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
- Melinda A Novak
  - Department of Psychological and Brain Sciences, University of Massachusetts, Amherst, Massachusetts, USA
- Jaine E Perlman
  - Division of Animal Resources, Yerkes National Primate Research Center, Emory University, Atlanta, Georgia, USA
- Ori Pomerantz
  - Population and Behavioral Health Services, California National Primate Research Center, University of California, Davis, California, USA
21
Graham KE, Badihi G, Safryghin A, Grund C, Hobaiter C. A socio-ecological perspective on the gestural communication of great ape species, individuals, and social units. Ethol Ecol Evol 2022; 34:235-259. PMID: 35529671; PMCID: PMC9067943; DOI: 10.1080/03949370.2021.1988722.
Abstract
Over the last 30 years, most research on non-human primate gestural communication has been produced by psychologists, which has shaped the questions asked and the methods used. These researchers have drawn on concepts from philosophy, linguistics, anthropology, and ethology, but despite these broad influences the field has neglected to situate gestures into the socio-ecological context in which the diverse species, individuals, and social-units exist. In this review, we present current knowledge about great ape gestural communication in terms of repertoires, meanings, and development. We fold this into a conversation about variation in other types of ape social behaviour to identify areas for future research on variation in gestural communication. Given the large variation in socio-ecological factors across species and social-units (and the individuals within these groups), we may expect to find different preferences for specific gesture types; different needs for communicating specific meanings; and different rates of encountering specific contexts. New tools, such as machine-learning based automated movement tracking, may allow us to uncover potential variation in the speed and form of gesture actions or parts of gesture actions. New multi-group multi-generational datasets provide the opportunity to apply analyses, such as Bayesian modelling, which allows us to examine these rich behavioural landscapes. Together, by expanding our questions and our methods, researchers may finally be able to study great ape gestures from the perspective of the apes themselves and explore what this gestural communication system reveals about apes’ thinking and experience of their world.
Collapse
Affiliation(s)
- Kirsty E. Graham
- School of Psychology & Neuroscience, University of St Andrews, St Mary’s Quad, South St, St Andrews KY16 9JP, Scotland, UK
| | - Gal Badihi
- School of Psychology & Neuroscience, University of St Andrews, St Mary’s Quad, South St, St Andrews KY16 9JP, Scotland, UK
| | - Alexandra Safryghin
- School of Psychology & Neuroscience, University of St Andrews, St Mary’s Quad, South St, St Andrews KY16 9JP, Scotland, UK
| | - Charlotte Grund
- School of Psychology & Neuroscience, University of St Andrews, St Mary’s Quad, South St, St Andrews KY16 9JP, Scotland, UK
| | - Catherine Hobaiter
- School of Psychology & Neuroscience, University of St Andrews, St Mary’s Quad, South St, St Andrews KY16 9JP, Scotland, UK
| |
Collapse
|
22
|
Doornweerd JE, Kootstra G, Veerkamp RF, Ellen ED, van der Eijk JAJ, van de Straat T, Bouwman AC. Across-Species Pose Estimation in Poultry Based on Images Using Deep Learning. FRONTIERS IN ANIMAL SCIENCE 2021. [DOI: 10.3389/fanim.2021.791290] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
Animal pose-estimation networks enable automated estimation of key body points in images or videos, allowing animal breeders to collect pose information repeatedly on large numbers of animals. However, the success of pose-estimation networks depends in part on the availability of data from which to learn the representation of key body points. Especially with animals, data collection is not always easy, and data annotation is laborious and time-consuming. The available data are therefore often limited, but data from other species might be useful, either alone or in combination with data from the target species. In this study, the across-species performance of animal pose-estimation networks and the performance of a network trained on multi-species data (turkeys and broilers) were investigated. Broilers and turkeys were video recorded during a walkway test representative of the situation in practice. Two single-species models and one multi-species model were trained using DeepLabCut and tested on two single-species test sets. Overall, the within-species models outperformed both the multi-species model and the models applied across species, as shown by lower raw pixel errors, lower normalized pixel errors, and a higher percentage of keypoints remaining (PKR). The multi-species model had slightly higher errors and a lower PKR than the within-species models, but had less than half the number of annotated frames available from each species. Compared to the single-species broiler model, the multi-species model achieved lower errors for the head, left-foot, and right-knee keypoints, although with a lower PKR. Across species, keypoint predictions resulted in high errors and low to moderate PKRs and are unlikely to be of direct use for pose and gait assessments. A multi-species model may therefore reduce annotation needs without a large impact on pose-assessment performance, although it is recommended only when the species are comparable. If a single-species model already exists, it could serve as a pre-trained model for training a new model, possibly requiring only a limited amount of new data. Future studies should investigate the accuracy needed for pose and gait assessments and estimate genetic parameters for the new phenotypes before pose-estimation networks can be applied in practice.
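The three evaluation measures named above can be computed directly from predicted and ground-truth keypoint coordinates. The minimal Python sketch below illustrates these metrics in general terms rather than the authors' exact protocol; the likelihood cutoff and the reference length used for normalisation are assumptions.

    import numpy as np

    def evaluate_keypoints(pred, truth, likelihood, ref_length, p_cutoff=0.6):
        """Raw pixel error, normalized pixel error, and % keypoints remaining (PKR).

        pred, truth : (n_frames, n_keypoints, 2) arrays of x, y pixel coordinates
        likelihood  : (n_frames, n_keypoints) network confidence per prediction
        ref_length  : reference length in pixels (e.g. an estimate of body size;
                      the choice of reference is an assumption here)
        """
        keep = likelihood >= p_cutoff                        # keypoints retained
        err = np.linalg.norm(pred - truth, axis=-1)          # Euclidean error per keypoint
        raw_error = np.nanmean(np.where(keep, err, np.nan))  # mean raw pixel error
        norm_error = raw_error / ref_length                  # normalized pixel error
        pkr = 100.0 * keep.mean()                            # percentage of keypoints remaining
        return raw_error, norm_error, pkr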
Collapse
|
23
|
Zanon M, Lemaire BS, Vallortigara G. Steps towards a computational ethology: an automatized, interactive setup to investigate filial imprinting and biological predispositions. BIOLOGICAL CYBERNETICS 2021; 115:575-584. [PMID: 34272970 PMCID: PMC8642325 DOI: 10.1007/s00422-021-00886-6] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 05/05/2021] [Accepted: 07/06/2021] [Indexed: 06/13/2023]
Abstract
Soon after hatching, the young of precocial species, such as domestic chicks or ducklings, learn to recognize their social partner simply by being exposed to it (the imprinting process). Even artificial objects or stimuli displayed on monitor screens can effectively trigger filial imprinting, though learning is canalized by spontaneous preferences for animacy signals, such as certain kinds of motion or a face-like appearance. Imprinting is used as a behavioural paradigm for studies of memory formation, early learning and predispositions, as well as number and space cognition and brain asymmetries. Here, we present an automatized setup for exposing and/or testing animals in a variety of imprinting experiments. The setup consists of a cage with two high-frequency screens at opposite ends where stimuli are shown; a camera covering the whole cage records the animal's behaviour continuously. A graphical user interface implemented in Matlab allows custom configuration of the experimental protocol and, together with Psychtoolbox, drives the presentation of images on the screens with accurate time scheduling and a highly precise framerate. The setup can be integrated into a complete workflow that analyses behaviour in a fully automatized way, combining Matlab (and Psychtoolbox) to control the monitor screens and stimuli, DeepLabCut to track the animals' behaviour, and Python (and R) to extract data and perform statistical analyses. The automated setup allows neuro-behavioural scientists to run standardized protocols, with faster data collection and analysis and reproducible results.
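The "extract data" step of such a workflow can be compact once DeepLabCut has produced its tracking table. The sketch below is a hypothetical Python example: the file name, body-part label, likelihood cutoff, and screen-zone boundaries are assumptions, though the three-row CSV header (scorer, bodyparts, coords) is DeepLabCut's standard output layout.

    import pandas as pd

    # Load a DeepLabCut tracking CSV (three header rows: scorer, bodyparts, coords).
    tracks = pd.read_csv("chick01_tracking.csv", header=[0, 1, 2], index_col=0)
    scorer = tracks.columns.get_level_values(0)[0]

    x = tracks[(scorer, "head", "x")]             # hypothetical body-part label
    p = tracks[(scorer, "head", "likelihood")]
    x = x.where(p >= 0.9)                         # discard low-confidence frames

    # Hypothetical screen zones: leftmost and rightmost thirds of the arena, in pixels.
    frames_left = (x < 400).sum()
    frames_right = (x > 800).sum()
    preference = frames_left / (frames_left + frames_right)
    print(f"Proportion of time at the left screen: {preference:.2f}")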
Collapse
Affiliation(s)
- Mirko Zanon
- Center for Mind/Brain Sciences, University of Trento, Rovereto, Italy.
| | - Bastien S Lemaire
- Center for Mind/Brain Sciences, University of Trento, Rovereto, Italy
| | | |
Collapse
|
24
|
Scott JT, Bourne JA. Modelling behaviors relevant to brain disorders in the nonhuman primate: Are we there yet? Prog Neurobiol 2021; 208:102183. [PMID: 34728308 DOI: 10.1016/j.pneurobio.2021.102183] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/26/2021] [Revised: 10/27/2021] [Accepted: 10/27/2021] [Indexed: 12/30/2022]
Abstract
Recent years have seen a profound resurgence of work with nonhuman primates (NHPs) to model human brain disorders. From marmosets to macaques, the study of NHP species offers a unique window into the function of primate-specific neural circuits that are impossible to examine in other models. Examining how these circuits manifest in the complex behaviors of primates, such as advanced cognitive and social functions, has provided enormous insight into the mechanisms underlying symptoms of numerous neurological and neuropsychiatric illnesses. With the recent optimization of modern techniques to manipulate and measure neural activity in vivo, such as optogenetics and calcium imaging, NHP research is better equipped than ever to probe the neural mechanisms underlying pathological behavior. However, methods for behavioral experimentation and analysis in NHPs have noticeably failed to keep pace with these advances. Because behavior ultimately lies at the junction between preclinical findings and their translation to clinical outcomes for brain disorders, approaches to improve the integrity, reproducibility, and translatability of behavioral experiments in NHPs require critical evaluation. In this review, we provide a unifying account of existing brain disorder models using NHPs and offer insights into the present and emerging contributions of behavioral studies to the field.
Collapse
Affiliation(s)
- Jack T Scott
- Australian Regenerative Medicine Institute, Monash University, Clayton, VIC, Australia
| | - James A Bourne
- Australian Regenerative Medicine Institute, Monash University, Clayton, VIC, Australia.
| |
Collapse
|
25
|
Silvernagel MP, Ling AS, Nuyujukian P. A markerless platform for ambulatory systems neuroscience. Sci Robot 2021; 6:eabj7045. [PMID: 34516749 DOI: 10.1126/scirobotics.abj7045] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/06/2023]
Abstract
Motor systems neuroscience seeks to understand how the brain controls movement. To minimize confounding variables, large-animal studies typically constrain body movement in areas not under observation, ensuring consistent, repeatable behaviors. Such studies have fueled decades of research, but they may be artificially limiting the richness of the neural data observed, preventing generalization to more natural movements and settings. Neuroscience studies of unconstrained movement would capture a greater range of behavior and a more complete view of neuronal activity, but instrumenting an experimental rig suitable for large animals presents substantial engineering challenges. Here, we present a markerless, full-body motion tracking and synchronized wireless neural electrophysiology platform for large, ambulatory animals. Composed of four depth (RGB-D) cameras that provide a 360° view of a 4.5-square-meter enclosed area, this system is designed to record a diverse range of neuroethologically relevant behaviors. The platform also allows the simultaneous acquisition of hundreds of wireless neural recording channels from multiple brain regions. Because behavioral and neuronal data are generated at rates below 200 megabytes per second, a single desktop can support hours of continuous recording. This setup is designed for systems neuroscience and neuroengineering research, where synchronized kinematic and neural data are the foundation for investigation. By enabling the study of previously unexplored movement tasks, this system can generate insights into the functioning of the mammalian motor system and provide a platform for developing brain-machine interfaces for unconstrained applications.
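The quoted data rate translates into a concrete storage budget. A back-of-the-envelope sketch in Python, where the session length and disk size are arbitrary examples rather than figures from the study:

    rate_mb_per_s = 200        # upper bound on combined behavioral + neural data rate
    session_hours = 3          # example session length (assumption)
    disk_tb = 8                # example desktop storage capacity (assumption)

    session_gb = rate_mb_per_s * 3600 * session_hours / 1000
    max_hours = disk_tb * 1_000_000 / rate_mb_per_s / 3600
    print(f"A {session_hours} h session needs about {session_gb:.0f} GB; "
          f"an {disk_tb} TB drive holds roughly {max_hours:.0f} h at the peak rate.")

At the stated peak rate this works out to roughly 720 GB per hour, which is why hours of continuous recording remain within reach of a single well-provisioned desktop.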
Collapse
Affiliation(s)
| | - Alissa S Ling
- Department of Electrical Engineering, Stanford University, Stanford, CA, USA
| | - Paul Nuyujukian
- Department of Electrical Engineering, Stanford University, Stanford, CA, USA.,Department of Bioengineering, Stanford University, Stanford, CA, USA.,Department of Neurosurgery, Stanford University, Stanford, CA, USA.,Wu Tsai Neurosciences Institute, Stanford University, Stanford, CA, USA.,Stanford Bio-X, Stanford University, Stanford, CA, USA
| | | |
Collapse
|
26
|
Testard C, Tremblay S, Platt M. From the field to the lab and back: neuroethology of primate social behavior. Curr Opin Neurobiol 2021; 68:76-83. [PMID: 33567386 PMCID: PMC8243779 DOI: 10.1016/j.conb.2021.01.005] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/05/2020] [Revised: 01/11/2021] [Accepted: 01/12/2021] [Indexed: 12/21/2022]
Abstract
Social mammals with more numerous and stronger social relationships live longer, healthier lives. Despite the established importance of social relationships, our understanding of the neurobiological mechanisms by which they are pursued, formed, and maintained in primates remains largely confined to highly controlled laboratory settings, which do not allow natural, dynamic social interactions to unfold. In this review, we argue that the neurobiological study of primate social behavior would benefit from adopting a neuroethological approach, that is, a perspective grounded in natural, species-typical behavior, with careful selection of animal models according to the scientific question at hand. We highlight macaques and marmosets as key animal models for human social behavior and summarize recent findings in the social domain for both species. We then review pioneering studies of dynamic social behaviors in small animals, which can inspire studies in larger primates, where the technological landscape is now ripe for an ethological overhaul.
Collapse
Affiliation(s)
- Camille Testard
- Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, USA.
| | - Sébastien Tremblay
- Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, USA
| | - Michael Platt
- Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, USA; Psychology Department, University of Pennsylvania, Philadelphia, PA 19104, USA; Marketing Department, The Wharton School of Business, University of Pennsylvania, Philadelphia, PA 19104, USA
| |
Collapse
|