1
Marcolan JA, Marino-Neto J. EthoWatcher OS: improving the reproducibility and quality of categorical and morphologic/kinematic data from behavioral recordings in laboratory animals. Med Biol Eng Comput 2025; 63:511-523. [PMID: 39397193] [DOI: 10.1007/s11517-024-03212-x]
Abstract
Behavioral recordings annotated by human observers (HOs) from video are a fundamental component of preclinical animal behavioral models of neurobiological diseases, yet these models are often criticized for their vulnerability to reproducibility issues. Here, we present EthoWatcher-Open Source (EW-OS), which provides tools and procedures for blind-to-condition categorical transcription performed simultaneously with tracking, for assessing HOs' intra- and interobserver reliability during training and data collection, and for producing video clips of sample behavioral categories that are useful for observer training. These tools can inform and optimize observer performance, thus favoring the reproducibility of the data obtained. Categorical and machine-vision-derived outputs are presented in an open data format for increased interoperability with other applications, with behavioral categories associated frame by frame with the tracking, morphological, and kinematic attributes of an animal's image. Recorded descriptors include the center of mass (X and Y pixel coordinates), the animal's area in square millimeters, its length and width in millimeters, and its angle in degrees; the frame-to-frame variation in each morphological descriptor yields the kinematic descriptors. While the initial measurements are in pixels, they are converted to millimeters using a scale calibrated by the user via the graphical user interface. This process enables the creation of databases suitable for machine learning processing and behavioral pharmacology studies. EW-OS is built for continued collaborative development and is available on an open-source platform, supporting the adoption of good scientific practices in behavioral analysis, including data-quality evaluation tools that can alleviate problems associated with low reproducibility in the behavioral sciences.
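The pixel-to-millimeter conversion described in this abstract can be illustrated with a short sketch. This is not EW-OS code; the function names and the sample reference object are hypothetical. It only assumes the stated rule: linear measures scale with the calibration factor, areas with its square.

```python
# Illustrative sketch (not EW-OS code): convert pixel measurements to
# millimeters using a user-calibrated scale, as the abstract describes.

def mm_per_pixel(ref_length_mm, ref_length_px):
    """Scale factor from a reference object of known physical length."""
    return ref_length_mm / ref_length_px

def to_millimeters(length_px, width_px, area_px2, scale):
    """Convert linear measures (px) and an area (px^2) to mm and mm^2."""
    return {
        "length_mm": length_px * scale,
        "width_mm": width_px * scale,
        "area_mm2": area_px2 * scale ** 2,  # area scales with the square
    }

# A 50 mm reference object spanning 100 px gives a 0.5 mm/px scale:
morpho = to_millimeters(120.0, 40.0, 3600.0, mm_per_pixel(50.0, 100.0))
```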
Affiliation(s)
- João Antônio Marcolan
- Laboratory of Computational Neuroscience, Institute of Biomedical Engineering, IEB-UFSC, EEL-CTC, Federal University of Santa Catarina, Florianópolis, SC, 88040-900, Brazil
- José Marino-Neto
- Laboratory of Computational Neuroscience, Institute of Biomedical Engineering, IEB-UFSC, EEL-CTC, Federal University of Santa Catarina, Florianópolis, SC, 88040-900, Brazil
2
Chen Z, Jia G, Zhou Q, Zhang Y, Quan Z, Chen X, Fukuda T, Huang Q, Shi Q. ARBUR, a machine learning-based analysis system for relating behaviors and ultrasonic vocalizations of rats. iScience 2024; 27:109998. [PMID: 38947508] [PMCID: PMC11214285] [DOI: 10.1016/j.isci.2024.109998]
Abstract
Deciphering how the behaviors and ultrasonic vocalizations (USVs) of rats interact can yield insights into the neural basis of social interaction. However, the behavior-vocalization interplay of rats remains elusive because of the challenge of relating the two communication media in complex social contexts. Here, we propose a machine learning-based analysis system (ARBUR) that can cluster both non-step (continuous) and step USVs without bias, hierarchically detect eight types of behavior of two freely behaving rats with high accuracy, and locate the vocalizing rat in 3-D space. ARBUR reveals that rats communicate via distinct USVs during different behaviors. Moreover, we show that ARBUR can surface findings long neglected in previous manual analyses, especially regarding non-continuous USVs during easy-to-confuse social behaviors. This work could help to mechanistically understand the behavior-vocalization interplay of rats and highlights the potential of machine learning algorithms for automated analysis of animal behavior and acoustics.
Affiliation(s)
- Zhe Chen
- School of Medical Technology, Beijing Institute of Technology, Beijing, China
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Guanglu Jia
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Qijie Zhou
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Yulai Zhang
- School of Medical Technology, Beijing Institute of Technology, Beijing, China
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Zhenzhen Quan
- Key Laboratory of Molecular Medicine and Biotherapy, School of Life Science, Beijing Institute of Technology, Beijing, China
- Xuechao Chen
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Toshio Fukuda
- Institute of Innovation for Future Society, Nagoya University, Nagoya, Japan
- Qiang Huang
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Qing Shi
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
3
Camilleri MPJ, Bains RS, Williams CKI. Of Mice and Mates: Automated Classification and Modelling of Mouse Behaviour in Groups Using a Single Model Across Cages. Int J Comput Vis 2024; 132:5491-5513. [PMID: 39554493] [PMCID: PMC11568001] [DOI: 10.1007/s11263-024-02118-3]
Abstract
Behavioural experiments often happen in specialised arenas, but this may confound the analysis. To address this issue, we provide tools to study mice in the home-cage environment, enabling biologists to capture the temporal aspect of an individual's behaviour and to model the interaction and interdependence between cage-mates with minimal human intervention. Our main contribution is the novel Global Behaviour Model (GBM), which summarises the joint behaviour of groups of mice across cages, using a permutation matrix to match the mouse identities in each cage to the model. In support of the above, we also (a) developed the Activity Labelling Module (ALM) to automatically classify mouse behaviour from video, and (b) released two datasets: ABODe for training behaviour classifiers and IMADGE for modelling behaviour. The online version contains supplementary material available at 10.1007/s11263-024-02118-3.
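The permutation-matching idea behind the GBM can be illustrated in miniature: given one summary statistic per mouse, choose the permutation of cage identities that best fits the model's per-role statistics. This brute-force sketch is illustrative only (the paper's model and matching cost are richer), and all names are hypothetical.

```python
# Hedged sketch of permutation-based identity matching: pick the
# assignment of cage mice to model roles with minimal total squared
# mismatch between their summary statistics. Not the paper's code.
from itertools import permutations

def best_permutation(cage_stats, model_stats):
    """Return the tuple p minimizing the mismatch cost, where p[i] is
    the model role assigned to cage mouse i."""
    n = len(cage_stats)
    def cost(p):
        return sum((cage_stats[i] - model_stats[p[i]]) ** 2 for i in range(n))
    return min(permutations(range(n)), key=cost)

# Three mice whose activity levels align best with model roles 2, 0, 1:
p = best_permutation([0.9, 0.1, 0.5], [0.1, 0.5, 0.9])
```

Brute force is fine for the handful of mice in a cage; a larger assignment problem would call for the Hungarian algorithm instead.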
4
Adam AS, LaMalfa KS, Razavi Y, Kohut SJ, Kangas BD. A Multimodal Preclinical Assessment of MDMA in Female and Male Rats: Prohedonic, Cognition Disruptive, and Prosocial Effects. Psychedelic Medicine (New Rochelle, N.Y.) 2024; 2:96-108. [PMID: 39149579] [PMCID: PMC11324000] [DOI: 10.1089/psymed.2023.0049]
Abstract
Background: Frontline antidepressants such as selective serotonin reuptake inhibitors (SSRIs) leave many patients with unmet treatment needs. Moreover, even when SSRIs reduce depressive symptoms, anhedonia, the loss of pleasure in previously rewarding activities, often remains unabated. This state of affairs calls for the development of medications that treat anhedonia more directly. The atypical psychedelic 3,4-methylenedioxymethamphetamine (MDMA) may hold promise as a prohedonic medication given its efficacious applications in treatment-resistant post-traumatic stress disorder and comorbid depression. However, in addition to its prosocial effects as an entactogen, MDMA is also associated with neurotoxic cognitive deficits. The present studies were designed to examine the relative potency of MDMA in female and male rats across three distinct behavioral domains, to help define a preclinical profile of MDMA as a candidate prohedonic therapeutic.
Methods: First, signal detection metrics of reward responsivity were examined using the touchscreen probabilistic reward task (PRT), a reverse-translated assay used to objectively quantify anhedonic phenotypes in humans. Second, to probe potential cognitive deficits, touchscreen-based assays of psychomotor vigilance and delayed matching-to-position were used to examine attentional processes and short-term spatial memory, respectively. Finally, MDMA's entactogenic effects were studied via pairwise assessments of social interaction facilitated by machine-learning analyses.
Results: Findings show (1) dose-dependent increases in reward responsivity as quantified by the PRT, (2) dose-dependent deficits in attention and short-term memory, and (3) dose-dependent increases in aspects of prosocial interaction in male but not female subjects. Neither the desirable (prohedonic) nor the undesirable (cognition-disruptive) effects of MDMA persisted beyond 24 h.
Conclusions: The present results characterize MDMA as a promising prohedonic treatment, notwithstanding some liability for short-lived cognitive impairment following acute administration.
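The PRT is conventionally analyzed with signal-detection measures of response bias (log b) and discriminability (log d). The sketch below uses the commonly cited formulas, with a 0.5 correction added to every cell to avoid division by zero; it illustrates the standard metrics, not the authors' analysis code.

```python
# Standard PRT signal-detection measures (illustrative, not the
# authors' analysis code). "Rich"/"lean" refer to the more and less
# frequently rewarded stimuli; c is the usual 0.5 cell correction.
import math

def log_b(rich_correct, rich_incorrect, lean_correct, lean_incorrect, c=0.5):
    """Response bias: preference for the richly rewarded response."""
    return 0.5 * math.log10(((rich_correct + c) * (lean_incorrect + c)) /
                            ((rich_incorrect + c) * (lean_correct + c)))

def log_d(rich_correct, rich_incorrect, lean_correct, lean_incorrect, c=0.5):
    """Discriminability: ability to tell the two stimuli apart."""
    return 0.5 * math.log10(((rich_correct + c) * (lean_correct + c)) /
                            ((rich_incorrect + c) * (lean_incorrect + c)))

# A subject favoring the richly rewarded stimulus shows log b > 0:
bias = log_b(rich_correct=45, rich_incorrect=5,
             lean_correct=30, lean_incorrect=20)
```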
Affiliation(s)
- Abshir S. Adam
- Harvard Medical School, McLean Hospital, Belmont, Massachusetts, USA
- Yasaman Razavi
- Harvard Medical School, McLean Hospital, Belmont, Massachusetts, USA
- Stephen J. Kohut
- Harvard Medical School, McLean Hospital, Belmont, Massachusetts, USA
- Brian D. Kangas
- Harvard Medical School, McLean Hospital, Belmont, Massachusetts, USA
5
Le VA, Sterley TL, Cheng N, Bains JS, Murari K. Markerless Mouse Tracking for Social Experiments. eNeuro 2024; 11:ENEURO.0154-22.2023. [PMID: 38233144] [PMCID: PMC10901195] [DOI: 10.1523/eneuro.0154-22.2023]
Abstract
Automated behavior quantification in socially interacting animals requires accurate tracking. While many methods have been very successful and highly generalizable to different settings, mistaken identities and lost information on key anatomical features remain common problems, although they can be alleviated by increased human effort in training or post-processing. We propose a markerless video-based tool that simultaneously tracks two interacting mice of identical appearance in controlled settings, for quantifying behaviors such as different types of sniffing, touching, and locomotion, and that improves tracking accuracy under these settings without increased human effort. It combines conventional handcrafted tracking with deep-learning-based techniques. The tool is trained on a small number of manually annotated images from a basic experimental setup and outputs body masks and coordinates of the snout and tail-base for each mouse. The method was tested on several commonly used experimental conditions, including bedding in the cage and fiberoptic or headstage implants on the mice. Results obtained without any human corrections after the automated analysis showed a near elimination of identity switches and a ∼15% improvement in tracking accuracy over purely deep-learning-based pose-estimation approaches. Our approach can optionally be ensembled with such techniques for further improvement. Finally, we demonstrated an application of this approach in studies of social behavior by quantifying and comparing interactions between pairs of mice in which some lack olfaction. Together, these results suggest that our approach could be valuable for studying group behaviors in rodents, such as social interactions.
Affiliation(s)
- Van Anh Le
- Electrical and Software Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
- Toni-Lee Sterley
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Ning Cheng
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Faculty of Veterinary Medicine, University of Calgary, Calgary, AB T2N 1N4, Canada
- Alberta Children's Hospital Research Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Jaideep S Bains
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Kartikeya Murari
- Electrical and Software Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Biomedical Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
6
Zhou T, Cheah CCH, Chin EWM, Chen J, Farm HJ, Goh ELK, Chiam KH. ContrastivePose: A contrastive learning approach for self-supervised feature engineering for pose estimation and behavioral classification of interacting animals. Comput Biol Med 2023; 165:107416. [PMID: 37660568] [DOI: 10.1016/j.compbiomed.2023.107416]
Abstract
In recent years, supervised machine learning models trained on videos of animals with pose estimation data and behavior labels have been used for automated behavior classification. Applications include, for example, automated detection of neurological diseases in animal models. However, we identify two potential problems with such a supervised learning approach. First, these models require a large amount of labeled data, but labeling behaviors frame by frame is a laborious manual process that does not scale easily. Second, such methods rely on handcrafted features obtained from pose estimation data that are usually designed empirically. In this paper, we propose to overcome these two problems using contrastive learning for self-supervised feature engineering on pose estimation data. Our approach allows the use of unlabeled videos to learn feature representations and reduces the need for handcrafting higher-level features from pose positions. We show that this approach to feature representation can achieve better classification performance than handcrafted features alone, and that the performance improvement is due to contrastive learning on unlabeled data rather than to the neural network architecture. The method has the potential to reduce the bottleneck of scarce labeled videos for training and to improve the performance of supervised behavioral classification models for the study of interaction behaviors in animals.
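The contrastive objective underlying this kind of self-supervised feature learning can be sketched with a minimal, dependency-free InfoNCE/NT-Xent-style loss: two augmented views of the same pose sequence are pulled together while other samples are pushed away. This is an illustration of the general technique, not the ContrastivePose implementation.

```python
# Minimal NT-Xent-style contrastive loss on plain Python vectors
# (illustrative sketch of the general technique, not the paper's code).
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent(anchor, positive, negatives, temperature=0.5):
    """-log( exp(sim(a,p)/t) / (exp(sim(a,p)/t) + sum_n exp(sim(a,n)/t)) )."""
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))

# The loss is lower when the positive pair is more similar than the negatives:
low = nt_xent([1.0, 0.0], [0.9, 0.1], [[0.0, 1.0]])
high = nt_xent([1.0, 0.0], [0.0, 1.0], [[0.9, 0.1]])
```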
Affiliation(s)
- Calvin Chee Hoe Cheah
- Neuroscience and Mental Health Faculty, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore
- Eunice Wei Mun Chin
- Neuroscience and Mental Health Faculty, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore
- Jie Chen
- School of Biological Sciences, Nanyang Technological University, Singapore
- Hui Jia Farm
- Bioinformatics Institute, A*STAR, Singapore; Department of Computer Science, University of Oxford, Oxford, United Kingdom
- Eyleen Lay Keow Goh
- Neuroscience and Mental Health Faculty, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore
- Keng Hwee Chiam
- Bioinformatics Institute, A*STAR, Singapore; School of Biological Sciences, Nanyang Technological University, Singapore
7
Martin Lorenzo S, Muniz Moreno MDM, Atas H, Pellen M, Nalesso V, Raffelsberger W, Prevost G, Lindner L, Birling MC, Menoret S, Tesson L, Negroni L, Concordet JP, Anegon I, Herault Y. Changes in social behavior with MAPK2 and KCTD13/CUL3 pathways alterations in two new outbred rat models for the 16p11.2 syndromes with autism spectrum disorders. Front Neurosci 2023; 17:1148683. [PMID: 37465586] [PMCID: PMC10350633] [DOI: 10.3389/fnins.2023.1148683]
Abstract
Copy number variations (CNVs) of the human 16p11.2 locus are associated with several developmental/neurocognitive syndromes. In particular, deletion and duplication of this genetic interval are found in patients with autism spectrum disorders, intellectual disability, and other psychiatric traits. The high gene density of the region, together with strong phenotypic variability and incomplete penetrance, makes the study of the 16p11.2 syndromes extremely complex. To systematically study the effect of 16p11.2 CNVs and identify candidate genes and molecular mechanisms involved in the pathophysiology, mouse models were generated previously; they showed deficits in learning and memory and, to some extent, in social behavior. To go further in understanding the social deficits caused by 16p11.2 syndromes, we engineered deletion and duplication of the region homologous to the human 16p11.2 genetic interval in two outbred rat strains, Sprague Dawley (SD) and Long Evans (LE). The 16p11.2 rat models displayed convergent defects in social behavior and in the novel object test in male carriers from both genetic backgrounds. Interestingly, major pathways affecting MAPK1 and CUL3 were found to be altered in the rat 16p11.2 models, with additional changes in males compared to females. Altogether, the consequences of 16p11.2 genetic region dosage on social behavior have now been found in three species: humans, mice, and rats. In addition, the rat models pointed to sexual dimorphism, with lower phenotype severity in female than in male mutants, a phenomenon also observed in humans. We are convinced that the two rat models will be key to further investigating social behavior and understanding the brain mechanisms and specific brain regions that control social behavior.
Affiliation(s)
- Sandra Martin Lorenzo
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Maria Del Mar Muniz Moreno
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Helin Atas
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Marion Pellen
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Valérie Nalesso
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Wolfgang Raffelsberger
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Geraldine Prevost
- Université de Strasbourg, CNRS, INSERM, CELPHEDIA-PHENOMIN, Institut Clinique de la Souris, Illkirch, France
- Loic Lindner
- Université de Strasbourg, CNRS, INSERM, CELPHEDIA-PHENOMIN, Institut Clinique de la Souris, Illkirch, France
- Marie-Christine Birling
- Université de Strasbourg, CNRS, INSERM, CELPHEDIA-PHENOMIN, Institut Clinique de la Souris, Illkirch, France
- Séverine Menoret
- Nantes Université, CHU Nantes, INSERM, CNRS, SFR Santé, Inserm UMS 016 CNRS UMS 3556, Nantes, France
- INSERM, Centre de Recherche en Transplantation et Immunologie UMR1064, Nantes Université, Nantes, France
- Laurent Tesson
- INSERM, Centre de Recherche en Transplantation et Immunologie UMR1064, Nantes Université, Nantes, France
- Luc Negroni
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Ignacio Anegon
- INSERM, Centre de Recherche en Transplantation et Immunologie UMR1064, Nantes Université, Nantes, France
- Yann Herault
- Université de Strasbourg, CNRS UMR7104, INSERM U1258, Institut de Génétique et de Biologie Moléculaire et Cellulaire, Illkirch, France
- Université de Strasbourg, CNRS, INSERM, CELPHEDIA-PHENOMIN, Institut Clinique de la Souris, Illkirch, France
8
Xie H, Gao Z, Jia G, Shimoda S, Shi Q. Learning Rat-Like Behavioral Interaction Using a Small-Scale Robotic Rat. Cyborg and Bionic Systems 2023; 4:0032. [PMID: 37342211] [PMCID: PMC10278959] [DOI: 10.34133/cbsystems.0032]
Abstract
In this paper, we propose a novel method for emulating rat-like behavioral interactions in robots using reinforcement learning. Specifically, we develop a state decision method to optimize the interaction process among 6 known behavior types that have been identified in previous research on rat interactions. The novelty of our method lies in using the temporal difference (TD) algorithm to optimize the state decision process, which enables the robots to make informed decisions about their behavior choices. To assess the similarity between robot and rat behavior, we use Pearson correlation. We then use TD-λ to update the state value function and make state decisions based on probability. The robots execute these decisions using our dynamics-based controller. Our results demonstrate that our method can generate rat-like behaviors on both short- and long-term timescales, with interaction information entropy comparable to that between real rats. Overall, our approach shows promise for controlling robots in robot-rat interactions and highlights the potential of using reinforcement learning to develop more sophisticated robotic systems.
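The TD-λ value update this abstract refers to can be sketched for the tabular case with accumulating eligibility traces. Variable names and parameter values are illustrative, not the authors' implementation.

```python
# Tabular TD(lambda) with accumulating eligibility traces (a generic
# sketch of the update rule, not the paper's controller code).
def td_lambda_episode(episode, V, alpha=0.1, gamma=0.9, lam=0.8):
    """episode: list of (state, reward, next_state) transitions.
    V: dict mapping state -> estimated value, updated in place."""
    e = {s: 0.0 for s in V}                   # eligibility traces
    for s, r, s_next in episode:
        delta = r + gamma * V[s_next] - V[s]  # TD error for this step
        e[s] += 1.0                           # bump trace of visited state
        for state in V:
            V[state] += alpha * delta * e[state]
            e[state] *= gamma * lam           # decay every trace
    return V

# Two-state toy episode: reward arrives only on the second transition,
# yet state "a" is also credited, through its decayed trace.
V = td_lambda_episode([("a", 0.0, "b"), ("b", 1.0, "a")],
                      {"a": 0.0, "b": 0.0})
```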
Affiliation(s)
- Hongzhao Xie
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Key Laboratory of Biomimetic Robots and Systems (Beijing Institute of Technology), Ministry of Education, Beijing 100081, China
- Zihang Gao
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Key Laboratory of Biomimetic Robots and Systems (Beijing Institute of Technology), Ministry of Education, Beijing 100081, China
- Guanglu Jia
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Key Laboratory of Biomimetic Robots and Systems (Beijing Institute of Technology), Ministry of Education, Beijing 100081, China
- Shingo Shimoda
- Nagoya University Graduate School of Medicine, Nagoya, Japan
- Qing Shi
- Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
- Key Laboratory of Biomimetic Robots and Systems (Beijing Institute of Technology), Ministry of Education, Beijing 100081, China
9
Yin J, Zhang C, Xie W, Liang G, Zhang L, Gui G. Anomaly traffic detection based on feature fluctuation for secure industrial internet of things. Peer-to-Peer Networking and Applications 2023; 16:1-16. [PMID: 37362098] [PMCID: PMC10131526] [DOI: 10.1007/s12083-023-01482-0]
Abstract
The detection of anomalous traffic in the Internet of Things (IoT) is mainly based on raw binary data at the traffic-packet level and structured data at the session-flow level. Such datasets use a single feature-extraction method and rely on prior manual knowledge, so critical information is easily lost during data processing, reducing the validity and robustness of the dataset. In this paper, we first construct a new anomaly traffic dataset based on the traffic-packet and session-flow data in the IoT-23 dataset. Second, we propose a feature-extraction method based on feature fluctuation. Our proposed method effectively addresses the disadvantage that data collected in different scenarios have different characteristics, which leads to features carrying less information. Compared with traditional anomaly traffic detection models, experiments show that our feature-fluctuation-based method is more robust, improves both the accuracy of anomaly traffic detection and the generalization ability of traditional models, and is more conducive to detecting anomalous traffic in the IoT.
Affiliation(s)
- Jie Yin
- Computer Information and Cyber Security, Jiangsu Police Institute, Nanjing, 210031, China
- Chuntang Zhang
- Bell Honors School, Nanjing University of Posts and Telecommunications, Nanjing, 210023, China
- Wenwei Xie
- Network Security, Trend Micro Incorporated, Nanjing, 210012, China
- Guangjun Liang
- Computer Information and Cyber Security, Jiangsu Police Institute, Nanjing, 210031, China
- Lanping Zhang
- College of Telecommunications and Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing, 210003, China
- Guan Gui
- College of Telecommunications and Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing, 210003, China
10
Harris C, Finn KR, Kieseler ML, Maechler MR, Tse PU. DeepAction: a MATLAB toolbox for automated classification of animal behavior in video. Sci Rep 2023; 13:2688. [PMID: 36792716] [PMCID: PMC9932075] [DOI: 10.1038/s41598-023-29574-0]
Abstract
The identification of animal behavior in video is a critical but time-consuming task in many areas of research. Here, we introduce DeepAction, a deep learning-based toolbox for automatically annotating animal behavior in video. Our approach uses features extracted from raw video frames by a pretrained convolutional neural network to train a recurrent neural network classifier. We evaluate the classifier on two benchmark rodent datasets and one octopus dataset. We show that it achieves high accuracy, requires little training data, and surpasses both human agreement and most comparable existing methods. We also create a confidence score for classifier output, and show that our method provides an accurate estimate of classifier performance and reduces the time required by human annotators to review and correct automatically-produced annotations. We release our system and accompanying annotation interface as an open-source MATLAB toolbox.
Affiliation(s)
- Carl Harris
- Department of Psychological and Brain Science, Dartmouth College, Hanover, NH 03755, USA
- Kelly R. Finn
- Department of Psychological and Brain Science, Dartmouth College, Hanover, NH 03755, USA
- Neukom Institute, Dartmouth College, Hanover, NH 03755, USA
- Marie-Luise Kieseler
- Department of Psychological and Brain Science, Dartmouth College, Hanover, NH 03755, USA
- Marvin R. Maechler
- Department of Psychological and Brain Science, Dartmouth College, Hanover, NH 03755, USA
- Peter U. Tse
- Department of Psychological and Brain Science, Dartmouth College, Hanover, NH 03755, USA
11
Xie H, Jia G, Al-Khulaqui M, Gao Z, Guo X, Fukuda T, Shi Q. A Motion Generation Strategy of Robotic Rat Using Imitation Learning for Behavioral Interaction. IEEE Robot Autom Lett 2022. [DOI: 10.1109/lra.2022.3182472]
Affiliation(s)
- Hongzhao Xie
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Guanglu Jia
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Mohamed Al-Khulaqui
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Zihang Gao
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Xiaowen Guo
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Toshio Fukuda
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
- Qing Shi
- Key Laboratory of Biomimetic Robots and Systems, Beijing Institute of Technology, Ministry of Education, Beijing, China
12
Murine Motion Behavior Recognition Based on DeepLabCut and Convolutional Long Short-Term Memory Network. Symmetry (Basel) 2022. [DOI: 10.3390/sym14071340]
Abstract
Murine behavior recognition is widely used in biology, neuroscience, pharmacology, and other aspects of research, and provides a basis for judging the psychological and physiological state of mice. To solve the problem whereby traditional behavior recognition methods only model behavioral changes in mice over time or space, we propose a symmetrical algorithm that can capture spatiotemporal information based on behavioral changes. The algorithm first uses the improved DeepLabCut keypoint detection algorithm to locate the nose, left ear, right ear, and tail root of the mouse, and then uses the ConvLSTM network to extract spatiotemporal information from the keypoint feature map sequence to classify five behaviors of mice: walking straight, resting, grooming, standing upright, and turning. We developed a murine keypoint detection and behavior recognition dataset, and experiments showed that the method achieved a percentage of correct keypoints (PCK) of 87±1% at three scales and against four backgrounds, while the classification accuracy for the five kinds of behaviors reached 93±1%. The proposed method is thus accurate for keypoint detection and behavior recognition, and is a useful tool for murine motion behavior recognition.
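The PCK metric reported above has a simple definition: a predicted keypoint counts as correct if it lies within a threshold distance of the ground truth, where the threshold is typically a fraction of some reference size (e.g., head or bounding-box length). The sketch below is illustrative, not the paper's evaluation code.

```python
# Illustrative percentage-of-correct-keypoints (PCK) computation
# (not the paper's evaluation code; the threshold choice is up to
# the evaluator).
import math

def pck(predicted, truth, threshold):
    """predicted, truth: lists of (x, y) keypoints in matching order.
    Returns the fraction of predictions within `threshold` pixels."""
    correct = sum(
        1 for (px, py), (tx, ty) in zip(predicted, truth)
        if math.hypot(px - tx, py - ty) <= threshold
    )
    return correct / len(truth)

# Two of three keypoints fall within 2.5 px of the ground truth:
score = pck([(10, 10), (20, 22), (35, 30)],
            [(10, 11), (20, 20), (30, 30)], threshold=2.5)
```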
13
Jia Y, Li S, Guo X, Lei B, Hu J, Xu XH, Zhang W. Selfee, self-supervised features extraction of animal behaviors. eLife 2022; 11:e76218. [PMID: 35708244] [PMCID: PMC9296132] [DOI: 10.7554/elife.76218]
Abstract
Fast and accurate characterization of animal behaviors is crucial for neuroscience research. Deep learning models are efficiently used in laboratories for behavior analysis. However, an end-to-end unsupervised neural network that extracts comprehensive and discriminative features directly from social behavior video frames for annotation and analysis has not yet been achieved. Here, we report a self-supervised feature extraction (Selfee) convolutional neural network with multiple downstream applications to process video frames of animal behavior in an end-to-end way. Visualization and classification of the extracted features (Meta-representations) validate that Selfee processes animal behaviors in a way similar to human perception. We demonstrate that Meta-representations can be efficiently used to detect anomalous behaviors that are indiscernible to human observation and to suggest targets for in-depth analysis. Furthermore, time-series analyses of Meta-representations reveal the temporal dynamics of animal behaviors. In conclusion, we present a self-supervised learning approach to extract comprehensive and discriminative features directly from raw video recordings of animal behaviors and demonstrate its potential usage for various downstream applications.
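One common way to turn learned frame embeddings into an anomaly detector, in the spirit of the Meta-representation analysis described above, is to score each frame by its distance to its nearest neighbors in feature space: frames whose embeddings are isolated are flagged as unusual. A minimal sketch on random toy embeddings (not Selfee's actual pipeline; the cluster and outlier here are fabricated for illustration):

```python
import numpy as np

def anomaly_scores(features, k=3):
    """Score each embedding by its mean distance to its k nearest
    neighbors; isolated (anomalous) frames receive high scores."""
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # ignore self-distance
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

# 20 embeddings from one behavior cluster plus one very different frame.
rng = np.random.default_rng(0)
feats = np.vstack([rng.normal(0.0, 0.1, (20, 8)), np.full((1, 8), 5.0)])
scores = anomaly_scores(feats)
print(int(np.argmax(scores)))  # 20 (the odd frame stands out)
```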
Affiliation(s)
- Yinjun Jia
- School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
- Tsinghua-Peking Center for Life Sciences, Beijing, China
- Shuaishuai Li
- Institute of Neuroscience, State Key Laboratory of Neuroscience, Chinese Academy of Sciences Center for Excellence in Brain Science and Intelligence Technology, Shanghai, China
- Shanghai Center for Brain Science and Brain-Inspired Intelligence Technology, Shanghai, China
- Xuan Guo
- School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
- Tsinghua-Peking Center for Life Sciences, Beijing, China
- Bo Lei
- School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
- Tsinghua-Peking Center for Life Sciences, Beijing, China
- Junqiang Hu
- School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
- Xiao-Hong Xu
- Institute of Neuroscience, State Key Laboratory of Neuroscience, Chinese Academy of Sciences Center for Excellence in Brain Science and Intelligence Technology, Shanghai, China
- Shanghai Center for Brain Science and Brain-Inspired Intelligence Technology, Shanghai, China
- Wei Zhang
- School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
- Tsinghua-Peking Center for Life Sciences, Beijing, China
14
Grieco F, Bernstein BJ, Biemans B, Bikovski L, Burnett CJ, Cushman JD, van Dam EA, Fry SA, Richmond-Hacham B, Homberg JR, Kas MJH, Kessels HW, Koopmans B, Krashes MJ, Krishnan V, Logan S, Loos M, McCann KE, Parduzi Q, Pick CG, Prevot TD, Riedel G, Robinson L, Sadighi M, Smit AB, Sonntag W, Roelofs RF, Tegelenbosch RAJ, Noldus LPJJ. Measuring Behavior in the Home Cage: Study Design, Applications, Challenges, and Perspectives. Front Behav Neurosci 2021; 15:735387. [PMID: 34630052] [PMCID: PMC8498589] [DOI: 10.3389/fnbeh.2021.735387]
Abstract
The reproducibility crisis (or replication crisis) in biomedical research is a particularly existential and under-addressed issue in the field of behavioral neuroscience, where, in spite of efforts to standardize testing and assay protocols, several known and unknown sources of confounding environmental factors add to variance. Human interference, along with novelty-induced anxiety, is a major contributor to variability both within and across laboratories. Attempts to reduce human interference and to measure more "natural" behaviors in subjects have led to the development of automated home-cage monitoring systems. These systems enable prolonged and longitudinal recordings, and provide large continuous measures of spontaneous behavior that can be analyzed across multiple time scales. In this review, a diverse team of neuroscientists and product developers share their experiences using such an automated monitoring system that combines Noldus PhenoTyper® home-cages and the video-based tracking software, EthoVision® XT, to extract digital biomarkers of motor, emotional, social and cognitive behavior. After presenting our working definition of a "home-cage", we compare home-cage testing with more conventional out-of-cage tests (e.g., the open field) and outline the various advantages of the former, including opportunities for within-subject analyses and assessments of circadian and ultradian activity. Next, we address technical issues pertaining to the acquisition of behavioral data, such as the fine-tuning of the tracking software and the potential for integration with biotelemetry and optogenetics. Finally, we provide guidance on which behavioral measures to emphasize, how to filter, segment, and analyze behavior, and how to use analysis scripts. We summarize how the PhenoTyper has applications to study neuropharmacology as well as animal models of neurodegenerative and neuropsychiatric illness.
Looking forward, we examine current challenges and the impact of new developments. Examples include the automated recognition of specific behaviors, unambiguous tracking of individuals in a social context, the development of more animal-centered measures of behavior and ways of dealing with large datasets. Together, we advocate that by embracing standardized home-cage monitoring platforms like the PhenoTyper, we are poised to directly assess issues pertaining to reproducibility, and more importantly, measure features of rodent behavior under more ethologically relevant scenarios.
Affiliation(s)
- Briana J Bernstein
- Neurobiology Laboratory, National Institute of Environmental Health Sciences, National Institutes of Health, Research Triangle Park, NC, United States
- Lior Bikovski
- Myers Neuro-Behavioral Core Facility, Sackler Faculty of Medicine, Tel Aviv University, Tel Aviv, Israel
- School of Behavioral Sciences, Netanya Academic College, Netanya, Israel
- C Joseph Burnett
- Nash Family Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, United States
- Jesse D Cushman
- Neurobiology Laboratory, National Institute of Environmental Health Sciences, National Institutes of Health, Research Triangle Park, NC, United States
- Sydney A Fry
- Neurobiology Laboratory, National Institute of Environmental Health Sciences, National Institutes of Health, Research Triangle Park, NC, United States
- Bar Richmond-Hacham
- Department of Anatomy and Anthropology, Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel
- Judith R Homberg
- Department of Cognitive Neuroscience, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands
- Martien J H Kas
- Groningen Institute for Evolutionary Life Sciences, University of Groningen, Groningen, Netherlands
- Helmut W Kessels
- Swammerdam Institute for Life Sciences, University of Amsterdam, Amsterdam, Netherlands
- Michael J Krashes
- National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD, United States
- Vaishnav Krishnan
- Laboratory of Epilepsy and Emotional Behavior, Baylor Comprehensive Epilepsy Center, Departments of Neurology, Neuroscience, and Psychiatry & Behavioral Sciences, Baylor College of Medicine, Houston, TX, United States
- Sreemathi Logan
- Department of Rehabilitation Sciences, College of Allied Health, University of Oklahoma Health Sciences Center, Oklahoma City, OK, United States
- Maarten Loos
- Sylics (Synaptologics BV), Amsterdam, Netherlands
- Katharine E McCann
- Neurobiology Laboratory, National Institute of Environmental Health Sciences, National Institutes of Health, Research Triangle Park, NC, United States
- Chaim G Pick
- Department of Anatomy and Anthropology, Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- The Dr. Miriam and Sheldon G. Adelson Chair and Center for the Biology of Addictive Diseases, Tel Aviv University, Tel Aviv, Israel
- Thomas D Prevot
- Centre for Addiction and Mental Health and Department of Psychiatry, University of Toronto, Toronto, ON, Canada
- Gernot Riedel
- Institute of Medical Sciences, University of Aberdeen, Aberdeen, United Kingdom
- Lianne Robinson
- Institute of Medical Sciences, University of Aberdeen, Aberdeen, United Kingdom
- Mina Sadighi
- Department of Cognitive Neuroscience, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands
- August B Smit
- Department of Molecular and Cellular Neurobiology, Center for Neurogenomics and Cognitive Research, VU University Amsterdam, Amsterdam, Netherlands
- William Sonntag
- Department of Biochemistry & Molecular Biology, Center for Geroscience, University of Oklahoma Health Sciences Center, Oklahoma City, OK, United States
- Lucas P J J Noldus
- Noldus Information Technology BV, Wageningen, Netherlands
- Department of Biophysics, Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
15
Let's get wild: A review of free-ranging rat assays as context-enriched supplements to traditional laboratory models. J Neurosci Methods 2021; 362:109303. [PMID: 34352335] [DOI: 10.1016/j.jneumeth.2021.109303]
Abstract
More than 24,000 rodent studies are published annually, with the vast majority of these studies focused on genetically undiverse animals in highly-controlled laboratory settings. However, findings from the laboratory have become increasingly unreliable for predicting outcomes in field and clinical settings, leading to a perceived crisis in translational research. One cause of this disparity might be that most human societies, in contrast to laboratory rodents, are genetically diverse and live in super-enriched environments. Methods for importing wild rats into the laboratory, and also exporting laboratory-style chambers into natural environments are not well-known outside their respective disciplines. Therefore, we have reviewed the current status of supplements to the laboratory rodent assay. We progress logically from highly-controlled experiments with natural breeding colonies to purely naturalistic approaches with free-ranging rats. We then highlight a number of approaches that allow genetically-diverse wild rats to be utilized in context-enriched paradigms. While considering the benefits and shortcomings of each available approach, we detail protocols for random sampling, remote-sensing, and deployment of laboratory chambers in the field. As supplements to standardized laboratory trials, some of these assays could offer key insights to help unify outcomes between laboratory and field studies. However, we note several outstanding questions that must be addressed such as: the trade-off between control and context, possible reductions in sample size, ramifications for the 'standardization fallacy', and ethical dilemmas of working with wild animals. Given these challenges, further innovation will be required before supplemental assays can be made broadly-accessible and thus, transferrable across disciplines.
16
Defensor EB, Lim MA, Schaevitz LR. Biomonitoring and Digital Data Technology as an Opportunity for Enhancing Animal Study Translation. ILAR J 2021; 62:223-231. [PMID: 34097730] [DOI: 10.1093/ilar/ilab018]
Abstract
The failure of animal studies to translate to effective clinical therapeutics has driven efforts to identify underlying causes and develop solutions that improve the reproducibility and translatability of preclinical research. Common issues revolve around study design, analysis, and reporting as well as standardization between preclinical and clinical endpoints. To address these needs, recent advancements in digital technology, including biomonitoring of digital biomarkers, development of software systems and database technologies, as well as application of artificial intelligence to preclinical datasets, can be used to increase the translational relevance of preclinical animal research. In this review, we will describe how a number of innovative digital technologies are being applied to overcome recurring challenges in study design, execution, and data sharing as well as improving scientific outcome measures. Examples of how these technologies are applied to specific therapeutic areas are provided. Digital technologies can enhance the quality of preclinical research and encourage scientific collaboration, thus accelerating the development of novel therapeutics.
17
Improved 3D tracking and automated classification of rodents' behavioral activity using depth-sensing cameras. Behav Res Methods 2021; 52:2156-2167. [PMID: 32232737] [DOI: 10.3758/s13428-020-01381-9]
Abstract
Analysis of rodents' behavior/activity is of fundamental importance in many research fields. However, many behavioral experiments still rely on manual scoring, with obvious problems in reproducibility. Despite important advances in video-analysis systems and computational ethology, automated behavior quantification is still a challenge. The need for large training datasets, background stability requirements, and reduction to two-dimensional analysis (impairing full posture characterization), limit their use. Here we present a novel integrated solution for behavioral analysis of individual rats, combining video segmentation, tracking of body parts, and automated classification of behaviors, using machine learning and computer vision methods. Low-cost depth cameras (RGB-D) are used to enable three-dimensional tracking and classification in dark conditions and absence of color contrast. Our solution automatically tracks five anatomical landmarks in dynamic environments and recognizes seven distinct behaviors, within the accuracy range of human annotations. The developed free software was validated in experiments where behavioral differences between Wistar Kyoto and Wistar rats were automatically quantified. The results reveal the capability for effective automated phenotyping. An extended annotated RGB-D dataset is also made publicly available. The proposed solution is an easy-to-use tool, with low-cost setup and powerful 3D segmentation methods (in static/dynamic environments). The ability to work in dark conditions means that natural animal behavior is not affected by recording lights. Furthermore, automated classification is possible with only ~30 minutes of annotated videos. By creating conditions for high-throughput analysis and reproducible quantitative measurements of animal behavior experiments, we believe this contribution can greatly improve behavioral analysis research.
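The depth-based segmentation underlying this kind of RGB-D pipeline rests on a simple idea: given a calibrated depth image of the empty arena, any pixel sufficiently closer to the sensor than that background belongs to the animal. A toy sketch of that step only (depth values and the height threshold are illustrative, not the paper's calibration):

```python
import numpy as np

def segment_foreground(depth_mm, background_mm, min_height_mm=10):
    """Foreground mask from a depth camera: pixels at least
    min_height_mm closer to the sensor than the empty-arena background."""
    return (background_mm - depth_mm) > min_height_mm

# Toy 4x4 arena: flat floor at 1000 mm, an "animal" raising a 2x2 patch.
background = np.full((4, 4), 1000.0)
frame = background.copy()
frame[1:3, 1:3] = 950.0          # 50 mm above the floor
mask = segment_foreground(frame, background)
print(int(mask.sum()))  # 4 foreground pixels
```

Because this works on depth rather than color contrast, it behaves the same in the dark, which is the property the paper exploits.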
18
von Ziegler L, Sturman O, Bohacek J. Big behavior: challenges and opportunities in a new era of deep behavior profiling. Neuropsychopharmacology 2021; 46:33-44. [PMID: 32599604] [PMCID: PMC7688651] [DOI: 10.1038/s41386-020-0751-7]
Abstract
The assessment of rodent behavior forms a cornerstone of preclinical assessment in neuroscience research. Nonetheless, the true and almost limitless potential of behavioral analysis has been inaccessible to scientists until very recently. Now, in the age of machine vision and deep learning, it is possible to extract and quantify almost infinite numbers of behavioral variables, to break behaviors down into subcategories and even into small behavioral units, syllables or motifs. However, the rapidly growing field of behavioral neuroethology is experiencing birthing pains. The community has not yet consolidated its methods, and new algorithms transfer poorly between labs. Benchmarking experiments as well as the large, well-annotated behavior datasets required are missing. Meanwhile, big data problems have started arising and we currently lack platforms for sharing large datasets, akin to sequencing repositories in genomics. Additionally, the average behavioral research lab does not have access to the latest tools to extract and analyze behavior, as their implementation requires advanced computational skills. Even so, the field is brimming with excitement and boundless opportunity. This review aims to highlight the potential of recent developments in the field of behavioral analysis, whilst trying to guide a consensus on practical issues concerning data collection and data sharing.
Affiliation(s)
- Lukas von Ziegler
- Department of Health Sciences and Technology, ETH, Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Oliver Sturman
- Department of Health Sciences and Technology, ETH, Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Johannes Bohacek
- Department of Health Sciences and Technology, ETH, Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
19
Ahmed M, Ramzan M, Ullah Khan H, Iqbal S, Attique Khan M, Choi JI, Nam Y, Kadry S. Real-Time Violent Action Recognition Using Key Frames Extraction and Deep Learning. Computers, Materials & Continua 2021; 69:2217-2230. [DOI: 10.32604/cmc.2021.018103]
20
Sturman O, von Ziegler L, Schläppi C, Akyol F, Privitera M, Slominski D, Grimm C, Thieren L, Zerbi V, Grewe B, Bohacek J. Deep learning-based behavioral analysis reaches human accuracy and is capable of outperforming commercial solutions. Neuropsychopharmacology 2020; 45:1942-1952. [PMID: 32711402] [PMCID: PMC7608249] [DOI: 10.1038/s41386-020-0776-y]
Abstract
To study brain function, preclinical research heavily relies on animal monitoring and the subsequent analyses of behavior. Commercial platforms have enabled semi high-throughput behavioral analyses by automating animal tracking, yet they poorly recognize ethologically relevant behaviors and lack the flexibility to be employed in variable testing environments. Critical advances based on deep-learning and machine vision over the last couple of years now enable markerless tracking of individual body parts of freely moving rodents with high precision. Here, we compare the performance of commercially available platforms (EthoVision XT14, Noldus; TSE Multi-Conditioning System, TSE Systems) to cross-verified human annotation. We provide a set of videos, carefully annotated by several human raters, of three widely used behavioral tests (open field test, elevated plus maze, forced swim test). Using these data, we then deployed the pose estimation software DeepLabCut to extract skeletal mouse representations. Using simple post-analyses, we were able to track animals based on their skeletal representation in a range of classic behavioral tests at similar or greater accuracy than commercial behavioral tracking systems. We then developed supervised machine learning classifiers that integrate the skeletal representation with the manual annotations. This new combined approach allows us to score ethologically relevant behaviors with similar accuracy to humans, the current gold standard, while outperforming commercial solutions. Finally, we show that the resulting machine learning approach eliminates variation both within and between human annotators. In summary, our approach helps to improve the quality and accuracy of behavioral data, while outperforming commercial systems at a fraction of the cost.
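The combined approach described here, per-frame features derived from a skeletal representation plus human labels feeding a supervised classifier, can be illustrated with a deliberately minimal stand-in. The two features (body length, nose speed), the class labels, and the nearest-centroid rule are hypothetical simplifications for illustration, not the authors' classifiers:

```python
import numpy as np

def fit_centroids(X, y):
    """Per-class mean of the training feature vectors."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy annotated frames: [body length (mm), nose speed (mm/s)]
X = np.array([[20.0, 1.0], [21.0, 1.5], [40.0, 8.0], [39.0, 7.0]])
y = np.array([0, 0, 1, 1])          # 0 = rearing, 1 = walking
centroids = fit_centroids(X, y)
print(predict(centroids, np.array([38.0, 7.5])))  # 1 (walking)
```

A real pipeline would use many more keypoint-derived features and a stronger classifier, but the structure, features from pose estimation plus labels from human annotation, is the same.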
Grants
- ETH Zurich, ETH Project Grant ETH-20 19-1, the SNSF Grant CRSII5-173721, Swiss Data Science Center C17-18, Neuroscience Center Zurich Project Grants Oxford/McGill/Zurich Partnership.
- ETH Zurich, ETH Project Grant ETH-20 19-1, the SNSF Grant 310030_172889/1, Forschungskredit of the University of Zurich FK-15-035, Vontobel-Foundation, Novartis Foundation for Medical Biological Research, EMDO-Foundation, Olga Mayenfisch Foundation, Betty and David Koetser Foundation for Brain Research, Neuroscience Center Zurich Project Grants Oxford/McGill/Zurich Partnership
Affiliation(s)
- Oliver Sturman
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Lukas von Ziegler
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Christa Schläppi
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Furkan Akyol
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Mattia Privitera
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Daria Slominski
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Christina Grimm
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Neural Control of Movement Lab, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Laetitia Thieren
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Experimental Imaging and Neuroenergetics, Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- Valerio Zerbi
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Neural Control of Movement Lab, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Benjamin Grewe
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Department of Information Technology and Electrical Engineering, ETH Zurich, Zurich, Switzerland
- Johannes Bohacek
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
21
Riedel G, Grant R, Sullivan M, Spink A. Preface: Special issue on Measuring Behaviour 2018. J Neurosci Methods 2020; 337:108681. [PMID: 32145226] [DOI: 10.1016/j.jneumeth.2020.108681]
22
Abstract
Neuropharmacological interventions in preclinical translational models of impulsivity have tremendously contributed to a better understanding of the neurochemistry and neural basis of impulsive behaviour. In this regard, much progress has been made over the last years, also due to the introduction of novel techniques in behavioural neuroscience such as optogenetics and chemogenetics. In this chapter, we will provide an update of how the behavioural pharmacology field has progressed and built upon existing data since an earlier review we wrote in 2008. To this aim, we will first give a brief background on preclinical translational models of impulsivity. Next, recent interesting evidence of monoaminergic modulation of impulsivity will be highlighted with a focus on the neurotransmitters dopamine and noradrenaline. Finally, we will close the chapter by discussing some novel directions and drug leads in the neuropharmacological modulation of impulsivity.
Affiliation(s)
- Tommy Pattij
- Department of Anatomy and Neurosciences, Amsterdam Neuroscience, Amsterdam University Medical Centers, VU University Medical Center, Amsterdam, The Netherlands
- Louk J M J Vanderschuren
- Division of Behavioural Neuroscience, Department of Animals in Science and Society, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
23
A Robust Real-Time Detecting and Tracking Framework for Multiple Kinds of Unmarked Object. Sensors 2019; 20:s20010002. [PMID: 31861254] [PMCID: PMC6982905] [DOI: 10.3390/s20010002]
Abstract
A rodent real-time tracking framework is proposed to automatically detect and track multiple objects in real time and output the coordinates of each object; it combines deep learning (YOLO v3: You Only Look Once, v3), the Kalman Filter, an improved Hungarian algorithm, and a nine-point position correction algorithm. A Rat-YOLO model is trained in our experiment. The Kalman Filter is established with an acceleration model to predict the position of the rat in the next frame. The predicted data are used to fill in a rat's position when Rat-YOLO fails on the current frame, and to associate IDs between the last frame and the current frame. The Hungarian assignment algorithm is used to relate the objects of the last frame to the objects of the current frame and match their IDs. The nine-point position correction algorithm is presented to adjust the correctness of the Rat-YOLO results and the predicted results. Because training deep networks requires more data than our experiment provides, and manual marking is time-consuming, automatic software for generating labeled datasets under a fixed scene is proposed, and the labeled datasets are manually verified for correctness. In addition, in an off-line experiment, a mask is presented to remove highlights. In this experiment, we select 500 frames of the data as the training datasets and label these images with the automatic label-generating software. A video (of 2892 frames) is tested with the trained Rat-YOLO model: the accuracy of detecting all three rats is around 72.545%; however, combining Rat-YOLO with the Kalman Filter and the nine-point position correction algorithm improved the accuracy to 95.194%.
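The Kalman prediction step this framework uses to bridge missed detections can be sketched for a single coordinate. This is a generic constant-acceleration state-transition model with toy values, not the paper's tuned filter (the full filter also carries a covariance and an update step):

```python
import numpy as np

# State: [position, velocity, acceleration]. One predict step of a
# constant-acceleration Kalman filter, used to fill a missed detection.
dt = 1.0  # one frame
F = np.array([[1.0, dt, 0.5 * dt**2],   # x' = x + v*dt + 0.5*a*dt^2
              [0.0, 1.0, dt],           # v' = v + a*dt
              [0.0, 0.0, 1.0]])         # a' = a
x = np.array([100.0, 4.0, 2.0])         # px, px/frame, px/frame^2
x_pred = F @ x
print(x_pred[0])  # 105.0
```

When the detector does fire, the Hungarian assignment then matches predicted positions to detections to carry identities across frames.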
24
Gulinello M, Mitchell HA, Chang Q, Timothy O'Brien W, Zhou Z, Abel T, Wang L, Corbin JG, Veeraragavan S, Samaco RC, Andrews NA, Fagiolini M, Cole TB, Burbacher TM, Crawley JN. Rigor and reproducibility in rodent behavioral research. Neurobiol Learn Mem 2019; 165:106780. [PMID: 29307548] [PMCID: PMC6034984] [DOI: 10.1016/j.nlm.2018.01.001]
Abstract
Behavioral neuroscience research incorporates the identical high level of meticulous methodologies and exacting attention to detail as all other scientific disciplines. To achieve maximal rigor and reproducibility of findings, well-trained investigators employ a variety of established best practices. Here we explicate some of the requirements for rigorous experimental design and accurate data analysis in conducting mouse and rat behavioral tests. Novel object recognition is used as an example of a cognitive assay which has been conducted successfully with a range of methods, all based on common principles of appropriate procedures, controls, and statistics. Directors of Rodent Core facilities within Intellectual and Developmental Disabilities Research Centers contribute key aspects of their own novel object recognition protocols, offering insights into essential similarities and less-critical differences. Literature cited in this review article will lead the interested reader to source papers that provide step-by-step protocols which illustrate optimized methods for many standard rodent behavioral assays. Adhering to best practices in behavioral neuroscience will enhance the value of animal models for the multiple goals of understanding biological mechanisms, evaluating consequences of genetic mutations, and discovering efficacious therapeutics.
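For the novel object recognition assay discussed above, a commonly reported statistic is the discrimination index. A minimal sketch, assuming the (novel - familiar) / total form; protocols vary in the exact variant they report, and the times below are toy values:

```python
def discrimination_index(t_novel, t_familiar):
    """(novel - familiar) exploration time over total exploration time.
    0 means no preference; values near +1 mean strong novelty preference."""
    return (t_novel - t_familiar) / (t_novel + t_familiar)

# Toy session: 30 s exploring the novel object, 10 s the familiar one.
print(discrimination_index(30.0, 10.0))  # 0.5
```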
Affiliation(s)
- Maria Gulinello, IDDRC Behavioral Core Facility, Neuroscience Department, Albert Einstein College of Medicine, Bronx, NY 10461, USA
- Heather A Mitchell, IDD Models Core, Waisman Center, University of Wisconsin Madison, Madison, WI 53705, USA
- Qiang Chang, IDD Models Core, Waisman Center, University of Wisconsin Madison, Madison, WI 53705, USA
- W Timothy O'Brien, IDDRC Preclinical Models Core, Children's Hospital of Philadelphia and University of Pennsylvania Perelman School of Medicine, Philadelphia, PA 19104, USA
- Zhaolan Zhou, IDDRC Preclinical Models Core, Children's Hospital of Philadelphia and University of Pennsylvania Perelman School of Medicine, Philadelphia, PA 19104, USA
- Ted Abel, IDDRC Preclinical Models Core, Children's Hospital of Philadelphia and University of Pennsylvania Perelman School of Medicine, Philadelphia, PA 19104, USA; current affiliation: Iowa Neuroscience Institute, University of Iowa, Iowa City, IA 52242, USA
- Li Wang, IDDRC Neurobehavioral Core, Center for Neuroscience Research, Children's National Health System, Washington, DC 20010, USA
- Joshua G Corbin, IDDRC Neurobehavioral Core, Center for Neuroscience Research, Children's National Health System, Washington, DC 20010, USA
- Surabi Veeraragavan, IDDRC Neurobehavioral Core, Baylor College of Medicine, Houston, TX 77030, USA
- Rodney C Samaco, IDDRC Neurobehavioral Core, Baylor College of Medicine, Houston, TX 77030, USA
- Nick A Andrews, IDDRC Neurodevelopmental Behavior Core, Boston Children's Hospital, Boston, MA 02115, USA
- Michela Fagiolini, IDDRC Neurodevelopmental Behavior Core, Boston Children's Hospital, Boston, MA 02115, USA
- Toby B Cole, IDDRC Rodent Behavior Laboratory, Center on Human Development and Disability, University of Washington, Seattle, WA 98195, USA
- Thomas M Burbacher, IDDRC Rodent Behavior Laboratory, Center on Human Development and Disability, University of Washington, Seattle, WA 98195, USA
- Jacqueline N Crawley, IDDRC Rodent Behavior Core, MIND Institute, University of California Davis School of Medicine, Sacramento, CA 95817, USA
|
25
|
Preface: Special issue on measuring behaviour 2016. J Neurosci Methods 2019; 300:1-3. [PMID: 29606274 DOI: 10.1016/j.jneumeth.2018.03.003] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022]
|
26
|
Guo B, Luo G, Weng Z, Zhu Y. Annular Sector Model for tracking multiple indistinguishable and deformable objects in occlusions. Neurocomputing 2019. [DOI: 10.1016/j.neucom.2018.12.054] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
27
|
Wang Z, Mirbozorgi SA, Ghovanloo M. An automated behavior analysis system for freely moving rodents using depth image. Med Biol Eng Comput 2018; 56:1807-1821. [PMID: 29560548 DOI: 10.1007/s11517-018-1816-1] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2017] [Accepted: 03/08/2018] [Indexed: 11/26/2022]
Abstract
A rodent behavior analysis system is presented, capable of automated tracking, pose estimation, and recognition of nine behaviors in freely moving animals. The system tracks three key points on the rodent body (nose, center of body, and base of tail) to estimate its pose and head rotation angle in real time. A support vector machine (SVM)-based model, including label optimization steps, is trained to classify on a frame-by-frame basis: resting, walking, bending, grooming, sniffing, rearing supported, rearing unsupported, micro-movements, and "other" behaviors. Compared to conventional red-green-blue (RGB) camera-based methods, the proposed system operates on 3D depth images provided by the Kinect infrared (IR) camera, enabling stable performance regardless of lighting conditions and animal color contrast with the background. This is particularly beneficial for monitoring nocturnal animals' behavior. 3D features are designed to be extracted directly from the depth stream and combined with contour-based 2D features to further improve recognition accuracies. The system is validated on three freely behaving rats for 168 min in total. The behavior recognition model achieved a cross-validation accuracy of 86.8% on the rat used for training and accuracies of 82.1% and 83% on the other two "testing" rats. The automated head angle estimation aided by behavior recognition resulted in a 0.76 correlation with human expert annotation.
Graphical abstract: Top view of a rat freely behaving in a standard homecage, captured by Kinect-v2 sensors. The depth image is used for constructing a 3D topography of the animal for pose estimation, behavior recognition, and head angle calculation. Results of the processed data are displayed on the user interface in various forms.
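The frame-by-frame SVM classification described in this abstract can be sketched in a few lines with scikit-learn. This is a minimal illustration only, not the authors' Kinect pipeline: the per-frame features, the three toy behavior classes, and the synthetic data are all placeholders standing in for the depth-derived 3D and contour-based 2D features used in the paper.

```python
# Hedged sketch of frame-by-frame behavior classification with an SVM.
# Features and labels are synthetic placeholders, not real tracking data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic per-frame feature vectors (e.g. body length, area, speed, angle).
n_frames, n_features = 600, 4
X = rng.normal(size=(n_frames, n_features))
# Three toy behavior classes, made separable by shifting the class means.
y = np.repeat([0, 1, 2], n_frames // 3)   # "resting", "walking", "rearing"
X[y == 1] += 2.0
X[y == 2] -= 2.0

# Standardize features, then fit an RBF-kernel SVM, one label per frame.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X[::2], y[::2])                   # train on even-numbered frames
acc = clf.score(X[1::2], y[1::2])         # evaluate on held-out odd frames
print(f"held-out frame accuracy: {acc:.2f}")
```

In practice a real pipeline would add the label-smoothing/optimization steps the abstract mentions, since raw per-frame predictions tend to flicker between adjacent behaviors.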
Affiliation(s)
- Zheyuan Wang, GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30308, USA
- S Abdollah Mirbozorgi, GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30308, USA
- Maysam Ghovanloo, GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30308, USA
|