1
Marcolan JA, Marino-Neto J. EthoWatcher OS: improving the reproducibility and quality of categorical and morphologic/kinematic data from behavioral recordings in laboratory animals. Med Biol Eng Comput 2024. PMID: 39397193. DOI: 10.1007/s11517-024-03212-x.
Abstract
Behavioral recordings annotated by human observers (HOs) from video are a fundamental component of preclinical animal behavioral models of neurobiological diseases, yet these models are often criticized for their vulnerability to reproducibility issues. Here we present EthoWatcher-Open Source (EW-OS), which provides tools and procedures for blind-to-condition categorical transcription performed simultaneously with tracking, for assessing HOs' intra- and inter-observer reliability during training and data collection, and for producing video clips of sample behavioral categories that are useful for observer training. These tools can inform and optimize observer performance, thus favoring the reproducibility of the data obtained. Categorical and machine-vision-derived outputs are presented in an open data format for increased interoperability with other applications, with behavioral categories associated frame by frame with tracking, morphological, and kinematic attributes of the animal's image. Recorded descriptors include the center of mass (X and Y pixel coordinates), the animal's area in square millimeters, its length and width in millimeters, and its angle in degrees. EW-OS also assesses the frame-to-frame variation in each morphological descriptor to produce kinematic descriptors. Although the initial measurements are in pixels, they are converted to millimeters using a scale calibrated by the user via the graphical user interface. This process enables the creation of databases suitable for machine learning processing and behavioral pharmacology studies. EW-OS is built for continued collaborative development and is available through an open-source platform, supporting initiatives toward the adoption of good scientific practices in behavioral analysis, including data-quality evaluation tools that can alleviate the problems associated with low reproducibility in the behavioral sciences.
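The pixel-to-millimeter conversion and the derivation of kinematic descriptors described above can be sketched as follows. This is an illustrative sketch only, not EW-OS's actual code; the function names and the reference-object values are assumptions.

```python
# Hypothetical sketch of user-calibrated pixel-to-millimeter scaling and of
# deriving a kinematic descriptor from frame-to-frame variation.
# Not EW-OS's actual API; names and numbers are illustrative.

def calibrate_scale(known_length_mm, measured_length_px):
    """Millimeters per pixel, from a reference object the user measures on screen."""
    return known_length_mm / measured_length_px

def px_to_mm(value_px, mm_per_px):
    return value_px * mm_per_px

def kinematic_descriptor(series, fps):
    """Frame-to-frame change of a morphological descriptor, per second."""
    return [(b - a) * fps for a, b in zip(series, series[1:])]

mm_per_px = calibrate_scale(100.0, 250.0)      # a 100 mm ruler spans 250 px
body_length_mm = px_to_mm(90.0, mm_per_px)     # 90 px converted to mm
length_change = kinematic_descriptor([36.0, 36.5, 37.5], fps=30)
```

The same per-frame conversion would apply to area (via the squared scale factor) and to the other length descriptors.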
Affiliation(s)
- João Antônio Marcolan
- Laboratory of Computational Neuroscience, Institute of Biomedical Engineering, IEB-UFSC, EEL-CTC, Federal University of Santa Catarina, Florianópolis, SC, 88040-900, Brazil
- José Marino-Neto
- Laboratory of Computational Neuroscience, Institute of Biomedical Engineering, IEB-UFSC, EEL-CTC, Federal University of Santa Catarina, Florianópolis, SC, 88040-900, Brazil
2
Hobkirk ER, Twiss SD. Domestication constrains the ability of dogs to convey emotions via facial expressions in comparison to their wolf ancestors. Sci Rep 2024; 14:10491. PMID: 38714729. PMCID: PMC11076640. DOI: 10.1038/s41598-024-61110-6.
Abstract
Dogs (Canis lupus familiaris) are the domestically bred descendants of wolves (Canis lupus). However, selective breeding has profoundly altered the facial morphologies of dogs compared with their wolf ancestors. We demonstrate that these morphological differences limit the ability of dogs to produce the same affective facial expressions as wolves. We decoded the facial movements of captive wolves during social interactions involving nine separate affective states and used linear discriminant analyses to predict affective states from combinations of facial movements. The resulting confusion matrix demonstrates that specific combinations of facial movements predict nine distinct affective states in wolves, the first assessment of this many affective facial expressions in wolves. However, comparative analyses with kennelled rescue dogs revealed a reduced ability to predict affective states. Critically, predictive power was very low for specific affective states, with confusion occurring between negative and positive states, such as Friendly and Fear. We show that the varying facial morphologies of dogs (specifically, non-wolf-like morphologies) limit their ability to produce the same range of affective facial expressions as wolves. Confusion between positive and negative states could be detrimental to human-dog interactions, although our analyses also suggest that dogs likely use vocalisations to compensate for limitations in facial communication.
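The analysis described above, a linear discriminant over facial-movement combinations evaluated via a confusion matrix, can be pictured with a toy two-class Fisher discriminant. The data, feature columns, and regularization below are invented for illustration and are not the authors' dataset or pipeline.

```python
import numpy as np

# Two invented facial-movement features per observation; two affective states.
X = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
y = np.array(["Friendly", "Friendly", "Fear", "Fear"])

def fit_fisher_lda(X, y, pos, neg, eps=1e-3):
    """Two-class Fisher discriminant: w = Sw^-1 (m_pos - m_neg), midpoint threshold."""
    Xp, Xn = X[y == pos], X[y == neg]
    mp, mn = Xp.mean(axis=0), Xn.mean(axis=0)
    # Within-class scatter, ridge-regularized so tiny samples stay invertible.
    Sw = (Xp - mp).T @ (Xp - mp) + (Xn - mn).T @ (Xn - mn) + eps * np.eye(X.shape[1])
    w = np.linalg.solve(Sw, mp - mn)
    threshold = w @ (mp + mn) / 2
    return lambda x: pos if w @ x > threshold else neg

predict = fit_fisher_lda(X, y, "Friendly", "Fear")
pred = np.array([predict(x) for x in X])

# Confusion matrix: rows = true state, columns = predicted state;
# off-diagonal counts show which states the model confuses.
states = ["Fear", "Friendly"]
cm = np.array([[np.sum((y == t) & (pred == p)) for p in states] for t in states])
```

With nine states, as in the paper, the same idea generalizes to a multi-class discriminant and a 9x9 confusion matrix.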
Affiliation(s)
- Elana R Hobkirk
- Department of Biosciences, Durham University, Durham, DH1 3LE, UK
- Sean D Twiss
- Department of Biosciences, Durham University, Durham, DH1 3LE, UK
3
Hardin A, Schlupp I. Using machine learning and DeepLabCut in animal behavior. Acta Ethol 2022. DOI: 10.1007/s10211-022-00397-y.
4
McDermott-Rouse A, Minga E, Barlow I, Feriani L, Harlow PH, Flemming AJ, Brown AEX. Behavioral fingerprints predict insecticide and anthelmintic mode of action. Mol Syst Biol 2021; 17:e10267. PMID: 34031985. PMCID: PMC8144879. DOI: 10.15252/msb.202110267.
Abstract
Novel invertebrate-killing compounds are required in agriculture and medicine to overcome resistance to existing treatments. Because insecticides and anthelmintics are discovered in phenotypic screens, a crucial step in the discovery process is determining the mode of action of hits. Visible whole-organism symptoms are combined with molecular and physiological data to determine mode of action. However, manual symptomology is laborious and requires symptoms that are strong enough to see by eye. Here, we use high-throughput imaging and quantitative phenotyping to measure Caenorhabditis elegans behavioral responses to compounds and train a classifier that predicts mode of action with an accuracy of 88% for a set of ten common modes of action. We also classify compounds within each mode of action to discover substructure that is not captured in broad mode-of-action labels. High-throughput imaging and automated phenotyping could therefore accelerate mode-of-action discovery in invertebrate-targeting compound development and help to refine mode-of-action categories.
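One way to picture the fingerprint idea above is a nearest-centroid sketch: each compound's behavioral feature vector is assigned the mode of action whose training centroid it is closest to. This is a deliberate simplification with invented labels and only two features; the paper trains a proper classifier on many behavioral features.

```python
import numpy as np

# Invented two-feature "behavioral fingerprints" per mode of action (MoA).
train = {
    "moa_A": np.array([[1.0, 0.1], [0.9, 0.2]]),
    "moa_B": np.array([[0.1, 1.0], [0.2, 0.9]]),
}
centroids = {moa: vecs.mean(axis=0) for moa, vecs in train.items()}

def predict_moa(fingerprint):
    """Return the MoA whose centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda m: np.linalg.norm(fingerprint - centroids[m]))

label = predict_moa(np.array([0.95, 0.15]))
```

Clustering the fingerprints within one MoA class would, in the same spirit, expose the substructure that broad MoA labels miss.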
Affiliation(s)
- Adam McDermott-Rouse
- MRC London Institute of Medical Sciences, London, UK
- Faculty of Medicine, Institute of Clinical Sciences, Imperial College London, London, UK
- Eleni Minga
- MRC London Institute of Medical Sciences, London, UK
- Faculty of Medicine, Institute of Clinical Sciences, Imperial College London, London, UK
- Ida Barlow
- MRC London Institute of Medical Sciences, London, UK
- Faculty of Medicine, Institute of Clinical Sciences, Imperial College London, London, UK
- Luigi Feriani
- MRC London Institute of Medical Sciences, London, UK
- Faculty of Medicine, Institute of Clinical Sciences, Imperial College London, London, UK
- André E X Brown
- MRC London Institute of Medical Sciences, London, UK
- Faculty of Medicine, Institute of Clinical Sciences, Imperial College London, London, UK
5
von Ziegler L, Sturman O, Bohacek J. Big behavior: challenges and opportunities in a new era of deep behavior profiling. Neuropsychopharmacology 2021; 46:33-44. PMID: 32599604. PMCID: PMC7688651. DOI: 10.1038/s41386-020-0751-7.
Abstract
The assessment of rodent behavior forms a cornerstone of preclinical assessment in neuroscience research. Nonetheless, the true and almost limitless potential of behavioral analysis has been inaccessible to scientists until very recently. Now, in the age of machine vision and deep learning, it is possible to extract and quantify almost infinite numbers of behavioral variables, to break behaviors down into subcategories and even into small behavioral units, syllables or motifs. However, the rapidly growing field of behavioral neuroethology is experiencing birthing pains. The community has not yet consolidated its methods, and new algorithms transfer poorly between labs. Benchmarking experiments, as well as the large, well-annotated behavior datasets they require, are missing. Meanwhile, big data problems have started arising, and we currently lack platforms for sharing large datasets, akin to sequencing repositories in genomics. Additionally, the average behavioral research lab does not have access to the latest tools to extract and analyze behavior, as their implementation requires advanced computational skills. Even so, the field is brimming with excitement and boundless opportunity. This review aims to highlight the potential of recent developments in the field of behavioral analysis, whilst trying to guide a consensus on practical issues concerning data collection and data sharing.
Affiliation(s)
- Lukas von Ziegler
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Oliver Sturman
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Johannes Bohacek
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
6
Leng X, Wohl M, Ishii K, Nayak P, Asahina K. Quantifying influence of human choice on the automated detection of Drosophila behavior by a supervised machine learning algorithm. PLoS One 2020; 15:e0241696. PMID: 33326445. PMCID: PMC7743940. DOI: 10.1371/journal.pone.0241696.
Abstract
Automated quantification of behavior is increasingly prevalent in neuroscience research. Human judgments can influence machine-learning-based behavior classification at multiple steps in the process, for both supervised and unsupervised approaches. Such steps include the design of the algorithm for machine learning, the methods used for animal tracking, the choice of training images, and the benchmarking of classification outcomes. However, how these design choices contribute to the interpretation of automated behavioral classifications has not been extensively characterized. Here, we quantify the effects of experimenter choices on the outputs of automated classifiers of Drosophila social behaviors. Drosophila behaviors contain a considerable degree of variability, which was reflected in the confidence levels associated with both human and computer classifications. We found that a diversity of sex combinations and tracking features was important for robust performance of the automated classifiers. In particular, features concerning the relative position of flies contained useful information for training a machine-learning algorithm. These observations shed light on the importance of human influence on tracking algorithms, the selection of training images, and the quality of annotated sample images used to benchmark the performance of a classifier (the ‘ground truth’). Evaluation of these factors is necessary for researchers to accurately interpret behavioral data quantified by a machine-learning algorithm and to further improve automated classifications.
Affiliation(s)
- Xubo Leng
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, California, United States of America
- Margot Wohl
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Neuroscience Graduate Program, University of California, San Diego, La Jolla, California, United States of America
- Kenichi Ishii
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Pavan Nayak
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Kenta Asahina
- Salk Institute for Biological Studies, La Jolla, California, United States of America
7
Sturman O, von Ziegler L, Schläppi C, Akyol F, Privitera M, Slominski D, Grimm C, Thieren L, Zerbi V, Grewe B, Bohacek J. Deep learning-based behavioral analysis reaches human accuracy and is capable of outperforming commercial solutions. Neuropsychopharmacology 2020; 45:1942-1952. PMID: 32711402. PMCID: PMC7608249. DOI: 10.1038/s41386-020-0776-y.
Abstract
To study brain function, preclinical research heavily relies on animal monitoring and the subsequent analyses of behavior. Commercial platforms have enabled semi-high-throughput behavioral analyses by automating animal tracking, yet they poorly recognize ethologically relevant behaviors and lack the flexibility to be employed in variable testing environments. Critical advances based on deep learning and machine vision over the last couple of years now enable markerless tracking of individual body parts of freely moving rodents with high precision. Here, we compare the performance of commercially available platforms (EthoVision XT14, Noldus; TSE Multi-Conditioning System, TSE Systems) to cross-verified human annotation. We provide a set of videos, carefully annotated by several human raters, of three widely used behavioral tests (open field test, elevated plus maze, forced swim test). Using these data, we then deployed the pose estimation software DeepLabCut to extract skeletal mouse representations. Using simple post-analyses, we were able to track animals based on their skeletal representation in a range of classic behavioral tests at similar or greater accuracy than commercial behavioral tracking systems. We then developed supervised machine learning classifiers that integrate the skeletal representation with the manual annotations. This new combined approach allows us to score ethologically relevant behaviors with similar accuracy to humans, the current gold standard, while outperforming commercial solutions. Finally, we show that the resulting machine learning approach eliminates variation both within and between human annotators. In summary, our approach helps to improve the quality and accuracy of behavioral data, while outperforming commercial systems at a fraction of the cost.
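The "simple post-analyses" on a skeletal representation can be pictured as deriving scalar features (e.g., body length, keypoint speed) from tracked keypoints; such features can then feed a supervised classifier. The keypoint names and numbers below are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def frame_features(nose, tailbase, prev_nose, fps):
    """Body length and nose speed from two tracked keypoints (pose-estimation output)."""
    body_length = np.linalg.norm(nose - tailbase)
    nose_speed = np.linalg.norm(nose - prev_nose) * fps
    return body_length, nose_speed

# Hypothetical keypoint coordinates for two consecutive frames.
nose = np.array([12.0, 5.0])
tailbase = np.array([4.0, -1.0])
prev_nose = np.array([11.0, 5.0])
length, speed = frame_features(nose, tailbase, prev_nose, fps=25)
```

Stacking such per-frame features across a video yields the feature matrix a supervised classifier would be trained on against the human annotations.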
Grants
- ETH Zurich, ETH Project Grant ETH-20 19-1, the SNSF Grant CRSII5-173721, Swiss Data Science Center C17-18, Neuroscience Center Zurich Project Grants Oxford/McGill/Zurich Partnership.
- ETH Zurich, ETH Project Grant ETH-20 19-1, the SNSF Grant 310030_172889/1, Forschungskredit of the University of Zurich FK-15-035, Vontobel-Foundation, Novartis Foundation for Medical Biological Research, EMDO-Foundation, Olga Mayenfisch Foundation, Betty and David Koetser Foundation for Brain Research, Neuroscience Center Zurich Project Grants Oxford/McGill/Zurich Partnership
Affiliation(s)
- Oliver Sturman
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Lukas von Ziegler
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Christa Schläppi
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Furkan Akyol
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Mattia Privitera
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Daria Slominski
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Christina Grimm
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Neural Control of Movement Lab, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Laetitia Thieren
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Experimental Imaging and Neuroenergetics, Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
- Valerio Zerbi
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Neural Control of Movement Lab, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Benjamin Grewe
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Department of Information Technology and Electrical Engineering, ETH Zurich, Zurich, Switzerland
- Johannes Bohacek
- Laboratory of Molecular and Behavioral Neuroscience, Institute for Neuroscience, Department of Health Sciences and Technology, ETH Zurich, Zurich, Switzerland
- Neuroscience Center Zurich, ETH Zurich and University of Zurich, Zurich, Switzerland
8
Wiltschko AB, Johnson MJ, Iurilli G, Peterson RE, Katon JM, Pashkovski SL, Abraira VE, Adams RP, Datta SR. Mapping Sub-Second Structure in Mouse Behavior. Neuron 2015; 88:1121-1135. PMID: 26687221. PMCID: PMC4708087. DOI: 10.1016/j.neuron.2015.11.031.
Abstract
Complex animal behaviors are likely built from simpler modules, but their systematic identification in mammals remains a significant challenge. Here we use depth imaging to show that 3D mouse pose dynamics are structured at the sub-second timescale. Computational modeling of these fast dynamics effectively describes mouse behavior as a series of reused and stereotyped modules with defined transition probabilities. We demonstrate that this combined 3D imaging and machine learning method can be used to unmask potential strategies employed by the brain to adapt to the environment, to capture both predicted and previously hidden phenotypes caused by genetic or neural manipulations, and to systematically expose the global structure of behavior within an experiment. This work reveals that mouse body language is built from identifiable components and is organized in a predictable fashion; deciphering this language establishes an objective framework for characterizing the influence of environmental cues, genes, and neural activity on behavior.
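The notion of stereotyped modules with defined transition probabilities can be illustrated by estimating an empirical transition table from a labeled module sequence. This toy sketch is not the authors' generative model of pose dynamics; the module names and the sequence are invented.

```python
from collections import Counter

# Invented sequence of behavioral module labels, one per time bin.
modules = ["rear", "walk", "walk", "pause", "walk", "pause", "walk"]

# Count observed transitions and outgoing occurrences of each module.
pair_counts = Counter(zip(modules, modules[1:]))
out_counts = Counter(modules[:-1])

# P[(a, b)]: empirical probability that module a is immediately followed by b.
P = {(a, b): n / out_counts[a] for (a, b), n in pair_counts.items()}
```

Differences in such transition structure between experimental groups are one way a modular description can expose phenotypes that raw tracking misses.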
Affiliation(s)
- Alexander B Wiltschko
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA; School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138, USA
- Matthew J Johnson
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA; School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138, USA
- Giuliano Iurilli
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Ralph E Peterson
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Jesse M Katon
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Stan L Pashkovski
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Victoria E Abraira
- Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
- Ryan P Adams
- School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138, USA