1
Liao Y, Qin C, Zhang X, Ye J, Xu Z, Zong H, Hu N, Zhang D. A dual-mode, image-enhanced, miniaturized microscopy system for incubator-compatible monitoring of live cells. Talanta 2024; 278:126537. [PMID: 38996561 DOI: 10.1016/j.talanta.2024.126537]
Abstract
Imaging live cells under stable culture conditions is essential to investigate cell physiological activities and proliferation. Typically, achieving this requires incorporating a specialized incubation chamber that creates the desired culture conditions into a microscopy system. However, such imaging systems are generally large and costly, hampering their wide application. Recent advances in miniaturized microscopy systems have enabled cell monitoring inside incubators, providing a hospitable environment for live cells. Although these systems are more cost-effective, they are usually limited in imaging modalities and spatiotemporal resolution. Here, we present a dual-mode, image-enhanced, miniaturized microscopy system (termed MiniCube) for direct monitoring of live cells inside incubators. MiniCube enables both bright-field imaging and fluorescence imaging with single-cell spatial resolution and sub-second temporal resolution. Moreover, this system can perform cell monitoring inside the incubator on tunable time scales ranging from a few seconds to days. Meanwhile, automatic cell segmentation and image enhancement are realized by the system's data analysis pipeline, and the signal-to-noise ratio (SNR) of acquired data is significantly improved using a deep-learning-based image denoising algorithm. Image data can be acquired with 5 times lower light exposure while maintaining comparable SNR. The versatility of this miniaturized microscopy system lends itself to various applications in biological studies, providing a practical platform and method for studying live-cell dynamics within the incubator.
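The reported 5-fold reduction in light exposure at comparable SNR can be made concrete with a standard SNR definition. A minimal sketch, not the authors' pipeline; the decibel metric and the `snr_db` helper here are illustrative assumptions:

```python
import numpy as np

def snr_db(image: np.ndarray, background: np.ndarray) -> float:
    """SNR in decibels: mean foreground intensity over background noise std.
    One common definition; the paper's exact metric may differ."""
    signal = float(np.mean(image))
    noise = float(np.std(background))
    return 20.0 * np.log10(signal / noise)

# Toy example: a denoiser that halves background noise raises SNR by ~6 dB.
rng = np.random.default_rng(0)
fg = 100.0 + rng.normal(0.0, 10.0, size=1000)   # synthetic cell pixels
bg_raw = rng.normal(0.0, 10.0, size=1000)       # synthetic background noise
bg_denoised = bg_raw * 0.5                      # noise std halved by denoising
gain = snr_db(fg, bg_denoised) - snr_db(fg, bg_raw)  # ~6.02 dB
```

Because the signal term cancels, the gain depends only on the noise ratio: halving the noise yields exactly 20·log10(2) ≈ 6.02 dB.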
Affiliation(s)
- Yuheng Liao
- Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Laboratory, Hangzhou, 311121, China
- Chunlian Qin
- Department of Chemistry, Zhejiang-Israel Joint Laboratory of Self-Assembling Functional Materials, ZJU-Hangzhou Global Scientific and Technological Innovation Center, Zhejiang University, Hangzhou, 310058, China; General Surgery Department, Children's Hospital, Zhejiang University School of Medicine, National Clinical Research Center for Children's Health, Hangzhou, 310052, China
- Xiaoyu Zhang
- Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Laboratory, Hangzhou, 311121, China
- Jing Ye
- Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Laboratory, Hangzhou, 311121, China
- Zhongyuan Xu
- Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Laboratory, Hangzhou, 311121, China
- Haotian Zong
- Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Laboratory, Hangzhou, 311121, China
- Ning Hu
- Department of Chemistry, Zhejiang-Israel Joint Laboratory of Self-Assembling Functional Materials, ZJU-Hangzhou Global Scientific and Technological Innovation Center, Zhejiang University, Hangzhou, 310058, China; General Surgery Department, Children's Hospital, Zhejiang University School of Medicine, National Clinical Research Center for Children's Health, Hangzhou, 310052, China
- Diming Zhang
- Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Laboratory, Hangzhou, 311121, China
2
Ryu J, Nejatbakhsh A, Torkashvand M, Gangadharan S, Seyedolmohadesin M, Kim J, Paninski L, Venkatachalam V. Versatile multiple object tracking in sparse 2D/3D videos via deformable image registration. PLoS Comput Biol 2024; 20:e1012075. [PMID: 38768230 PMCID: PMC11142724 DOI: 10.1371/journal.pcbi.1012075]
Abstract
Tracking body parts in behaving animals, extracting fluorescence signals from cells embedded in deforming tissue, and analyzing cell migration patterns during development all require tracking objects with partially correlated motion. As dataset sizes increase, manual tracking of objects becomes prohibitively inefficient and slow, necessitating automated and semi-automated computational tools. Unfortunately, existing methods for multiple object tracking (MOT) are either developed for specific datasets and hence do not generalize well to other datasets, or require large amounts of training data that are not readily available. This is further exacerbated when tracking fluorescent sources in moving and deforming tissues, where the lack of unique features and sparsely populated images create a challenging environment, especially for modern deep learning techniques. By leveraging technology recently developed for spatial transformer networks, we propose ZephIR, an image registration framework for semi-supervised MOT in 2D and 3D videos. ZephIR can generalize to a wide range of biological systems by incorporating adjustable parameters that encode spatial (sparsity, texture, rigidity) and temporal priors of a given data class. We demonstrate the accuracy and versatility of our approach in a variety of applications, including tracking the body parts of a behaving mouse and neurons in the brain of a freely moving C. elegans. We provide an open-source package along with a web-based graphical user interface that allows users to provide small numbers of annotations to interactively improve tracking results.
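The core registration idea, re-locating features by aligning image patches across frames, can be illustrated with a toy sum-of-squared-differences matcher. This is a deliberately simplified stand-in, not ZephIR itself, which uses differentiable registration with spatial and temporal priors:

```python
import numpy as np

def track_keypoint(prev: np.ndarray, curr: np.ndarray,
                   pt: tuple, patch: int = 3, search: int = 5) -> tuple:
    """Re-locate a keypoint in `curr` by matching the patch around `pt`
    in `prev` within a local search window (sum of squared differences)."""
    r, c = pt
    tmpl = prev[r - patch:r + patch + 1, c - patch:c + patch + 1]
    best, best_pt = np.inf, pt
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            win = curr[rr - patch:rr + patch + 1, cc - patch:cc + patch + 1]
            if win.shape != tmpl.shape:
                continue  # search window fell off the image edge
            score = float(np.sum((win - tmpl) ** 2))
            if score < best:
                best, best_pt = score, (rr, cc)
    return best_pt

# A bright blob centered at (20, 20), shifted by (+2, +3) in the next frame.
frame0 = np.zeros((40, 40)); frame0[19:22, 19:22] = 1.0
frame1 = np.zeros((40, 40)); frame1[21:24, 22:25] = 1.0
found = track_keypoint(frame0, frame1, (20, 20))  # → (22, 23)
```

Real registration-based trackers replace the brute-force search with gradient-based optimization over smooth deformation fields, which is what makes priors on sparsity, texture, and rigidity expressible.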
Affiliation(s)
- James Ryu
- Department of Physics, Northeastern University, Boston, Massachusetts, United States of America
- Amin Nejatbakhsh
- Department of Neuroscience, Columbia University, New York, New York, United States of America
- Mahdi Torkashvand
- Department of Physics, Northeastern University, Boston, Massachusetts, United States of America
- Sahana Gangadharan
- Department of Physics, Northeastern University, Boston, Massachusetts, United States of America
- Maedeh Seyedolmohadesin
- Department of Physics, Northeastern University, Boston, Massachusetts, United States of America
- Jinmahn Kim
- Department of Physics, Northeastern University, Boston, Massachusetts, United States of America
- Liam Paninski
- Department of Neuroscience, Columbia University, New York, New York, United States of America
- Vivek Venkatachalam
- Department of Physics, Northeastern University, Boston, Massachusetts, United States of America
3
Yurimoto T, Kumita W, Sato K, Kikuchi R, Oka G, Shibuki Y, Hashimoto R, Kamioka M, Hayasegawa Y, Yamazaki E, Kurotaki Y, Goda N, Kitakami J, Fujita T, Inoue T, Sasaki E. Development of a 3D tracking system for multiple marmosets under free-moving conditions. Commun Biol 2024; 7:216. [PMID: 38383741 PMCID: PMC10881507 DOI: 10.1038/s42003-024-05864-9]
Abstract
Assessment of social interactions and behavioral changes in nonhuman primates is useful for understanding brain function changes during life events and pathogenesis of neurological diseases. The common marmoset (Callithrix jacchus), which lives in a nuclear family like humans, is a useful model, but longitudinal automated behavioral observation of multiple animals has not been achieved. Here, we developed a Full Monitoring and Animal Identification (FulMAI) system for longitudinal detection of three-dimensional (3D) trajectories of each individual among multiple marmosets under free-moving conditions by combining video tracking, Light Detection and Ranging, and deep learning. Using this system, identification of each animal was more than 97% accurate. Location preferences and inter-individual distance could be calculated, and deep learning could detect grooming behavior. The FulMAI system allows us to analyze the natural behavior of individuals in a family over their lifetime and, together with other data, understand how behavior changes in response to life events.
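The inter-individual distance measure mentioned above reduces, per frame, to pairwise Euclidean distances between tracked 3D positions. A minimal sketch; the array layout and the `pairwise_distances` helper are illustrative assumptions, not the FulMAI code:

```python
import numpy as np

def pairwise_distances(positions: np.ndarray) -> np.ndarray:
    """Inter-individual distances per frame.
    positions: (frames, animals, 3) array of 3-D coordinates.
    Returns a (frames, animals, animals) array of Euclidean distances."""
    diff = positions[:, :, None, :] - positions[:, None, :, :]
    return np.linalg.norm(diff, axis=-1)

# Two animals over two frames: 3 units apart, then 5 units apart.
pos = np.array([[[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]],
                [[0.0, 0.0, 0.0], [3.0, 4.0, 0.0]]])
d = pairwise_distances(pos)  # d[0, 0, 1] == 3.0, d[1, 0, 1] == 5.0
```

Thresholding or averaging these matrices over time then gives proximity-based social measures such as time spent within a given distance.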
Affiliation(s)
- Terumi Yurimoto
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Wakako Kumita
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Kenya Sato
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Rika Kikuchi
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Gohei Oka
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Yusuke Shibuki
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Rino Hashimoto
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Michiko Kamioka
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Yumi Hayasegawa
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Eiko Yamazaki
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Yoko Kurotaki
- Center of Basic Technology in Marmoset, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Norio Goda
- Public Digital Transformation Department, Hitachi, Ltd., Shinagawa, 140-8512, Japan
- Junichi Kitakami
- Vision AI Solution Design Department, Hitachi Solutions Technology, Ltd., Tachikawa, 190-0014, Japan
- Tatsuya Fujita
- Engineering Department, Eastern Japan Division, Totec Amenity Limited, Shinjuku, 163-0417, Japan
- Takashi Inoue
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
- Erika Sasaki
- Department of Marmoset Biology and Medicine, Central Institute for Experimental Medicine and Life Science, Kawasaki, 210-0821, Japan
4
Le VA, Sterley TL, Cheng N, Bains JS, Murari K. Markerless Mouse Tracking for Social Experiments. eNeuro 2024; 11:ENEURO.0154-22.2023. [PMID: 38233144 PMCID: PMC10901195 DOI: 10.1523/eneuro.0154-22.2023]
Abstract
Automated behavior quantification in socially interacting animals requires accurate tracking. While many existing methods are successful and highly generalizable across settings, mistaken identities and loss of information on key anatomical features remain common, and alleviating them typically demands increased human effort in training or post-processing. We propose a markerless, video-based tool that simultaneously tracks two interacting mice of the same appearance in controlled settings, quantifying behaviors such as different types of sniffing, touching, and locomotion while improving tracking accuracy in these settings without increased human effort. It incorporates conventional handcrafted tracking and deep-learning-based techniques. The tool is trained on a small number of manually annotated images from a basic experimental setup and outputs body masks and coordinates of the snout and tail-base for each mouse. The method was tested under several commonly used experimental conditions, including bedding in the cage and fiberoptic or headstage implants on the mice. Results obtained without any human corrections after the automated analysis showed a near elimination of identity switches and a ∼15% improvement in tracking accuracy over purely deep-learning-based pose estimation approaches. Our approach can optionally be ensembled with such techniques for further improvement. Finally, we demonstrated an application of this approach in studies of social behavior of mice by quantifying and comparing interactions between pairs of mice in which some lack olfaction. Together, these results suggest that our approach could be valuable for studying group behaviors in rodents, such as social interactions.
Affiliation(s)
- Van Anh Le
- Electrical and Software Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
- Toni-Lee Sterley
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Ning Cheng
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Faculty of Veterinary Medicine, University of Calgary, Calgary, AB T2N 1N4, Canada
- Alberta Children's Hospital Research Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Jaideep S Bains
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Kartikeya Murari
- Electrical and Software Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
- Hotchkiss Brain Institute, University of Calgary, Calgary, AB T2N 1N4, Canada
- Biomedical Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
5
Jankowski MM, Polterovich A, Kazakov A, Niediek J, Nelken I. An automated, low-latency environment for studying the neural basis of behavior in freely moving rats. BMC Biol 2023; 21:172. [PMID: 37568111 PMCID: PMC10416379 DOI: 10.1186/s12915-023-01660-9]
Abstract
BACKGROUND: Behavior consists of the interaction between an organism and its environment, and is controlled by the brain. Brain activity varies at sub-second time scales, but behavioral measures are usually coarse (often consisting of only binary trial outcomes).
RESULTS: To overcome this mismatch, we developed the Rat Interactive Foraging Facility (RIFF): a programmable interactive arena for freely moving rats with multiple feeding areas, multiple sound sources, high-resolution behavioral tracking, and simultaneous electrophysiological recordings. The paper provides detailed information about the construction of the RIFF and the software used to control it. To illustrate the flexibility of the RIFF, we describe two complex tasks implemented in the RIFF, a foraging task and a sound localization task. Rats quickly learned to obtain rewards in both tasks. Neurons in the auditory cortex as well as neurons in the auditory field in the posterior insula had sound-driven activity during behavior. Remarkably, neurons in both structures also showed sensitivity to non-auditory parameters such as location in the arena and head-to-body angle.
CONCLUSIONS: The RIFF provides insights into the cognitive capabilities and learning mechanisms of rats and opens the way to a better understanding of how brains control behavior. The ability to do so depends crucially on the combination of wireless electrophysiology and detailed behavioral documentation available in the RIFF.
Affiliation(s)
- Maciej M Jankowski
- The Edmond and Lily Safra Center for Brain Sciences and the Department of Neurobiology, Silberman Institute of Life Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
- BioTechMed Center, Multimedia Systems Department, Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, Gdansk, Poland
- Ana Polterovich
- The Edmond and Lily Safra Center for Brain Sciences and the Department of Neurobiology, Silberman Institute of Life Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
- Alex Kazakov
- The Edmond and Lily Safra Center for Brain Sciences and the Department of Neurobiology, Silberman Institute of Life Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
- Johannes Niediek
- The Edmond and Lily Safra Center for Brain Sciences and the Department of Neurobiology, Silberman Institute of Life Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
- Israel Nelken
- The Edmond and Lily Safra Center for Brain Sciences and the Department of Neurobiology, Silberman Institute of Life Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel
6
Jabarin R, Netser S, Wagner S. Beyond the three-chamber test: toward a multimodal and objective assessment of social behavior in rodents. Mol Autism 2022; 13:41. [PMID: 36284353 PMCID: PMC9598038 DOI: 10.1186/s13229-022-00521-6]
Abstract
MAIN: In recent years, substantial advances in social neuroscience have been realized, including the generation of numerous rodent models of autism spectrum disorder. Still, it can be argued that those methods currently being used to analyze animal social behavior create a bottleneck that significantly slows down progress in this field. Indeed, the bulk of research still relies on a small number of simple behavioral paradigms, the results of which are assessed without considering behavioral dynamics. Moreover, only few variables are examined in each paradigm, thus overlooking a significant portion of the complexity that characterizes social interaction between two conspecifics, subsequently hindering our understanding of the neural mechanisms governing different aspects of social behavior. We further demonstrate these constraints by discussing the most commonly used paradigm for assessing rodent social behavior, the three-chamber test. We also point to the fact that although emotions greatly influence human social behavior, we lack reliable means for assessing the emotional state of animals during social tasks. As such, we also discuss current evidence supporting the existence of pro-social emotions and emotional cognition in animal models. We further suggest that adequate social behavior analysis requires a novel multimodal approach that employs automated and simultaneous measurements of multiple behavioral and physiological variables at high temporal resolution in socially interacting animals. We accordingly describe several computerized systems and computational tools for acquiring and analyzing such measurements. Finally, we address several behavioral and physiological variables that can be used to assess socio-emotional states in animal models and thus elucidate intricacies of social behavior so as to attain deeper insight into the brain mechanisms that mediate such behaviors. 
CONCLUSIONS: In summary, we suggest that combining automated multimodal measurements with machine-learning algorithms will help define socio-emotional states and determine their dynamics during various types of social tasks, thus enabling a more thorough understanding of the complexity of social behavior.
Affiliation(s)
- Renad Jabarin
- Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
- Shai Netser
- Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
- Shlomo Wagner
- Sagol Department of Neurobiology, Faculty of Natural Sciences, University of Haifa, Haifa, Israel
7
Marshall JD, Li T, Wu JH, Dunn TW. Leaving flatland: Advances in 3D behavioral measurement. Curr Opin Neurobiol 2022; 73:102522. [PMID: 35453000 DOI: 10.1016/j.conb.2022.02.002]
Abstract
Animals move in three dimensions (3D). Thus, 3D measurement is necessary to report the true kinematics of animal movement. Existing 3D measurement techniques draw on specialized hardware, such as motion capture or depth cameras, as well as deep multi-view and monocular computer vision. Continued advances at the intersection of deep learning and computer vision will facilitate 3D tracking across more anatomical features, with less training data, in additional species, and within more natural, occlusive environments. 3D behavioral measurement enables unique applications in phenotyping, investigating the neural basis of behavior, and designing artificial agents capable of imitating animal behavior.
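Multi-view 3D measurement ultimately rests on triangulation: given camera projection matrices and matched 2D detections, the 3D point is recovered by solving a small linear system. A minimal two-view direct linear transform (DLT) sketch with toy cameras, not any specific tracker's implementation:

```python
import numpy as np

def triangulate(P1: np.ndarray, P2: np.ndarray,
                x1: np.ndarray, x2: np.ndarray) -> np.ndarray:
    """Linear (DLT) triangulation of one 3-D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coords.
    Builds the homogeneous system A X = 0 and solves it via SVD."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null vector = homogeneous 3-D point
    return X[:3] / X[3]

# Two toy cameras looking down z, one shifted in x; recover (1, 2, 5).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
Xtrue = np.array([1.0, 2.0, 5.0])
x1 = P1 @ np.append(Xtrue, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(Xtrue, 1.0); x2 = x2[:2] / x2[2]
Xhat = triangulate(P1, P2, x1, x2)
```

With noisy real detections the same least-squares machinery applies, and more views simply add rows to A, which is why multi-camera rigs improve robustness to occlusion.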
Affiliation(s)
- Jesse D Marshall
- Harvard University, Department of Organismic and Evolutionary Biology, Cambridge, MA 02138, USA
- Tianqing Li
- Duke University, Pratt School of Engineering, Department of Biomedical Engineering, Durham, NC 27708, USA (https://twitter.com/tianqingxli)
- Joshua H Wu
- Duke University, Pratt School of Engineering, Department of Biomedical Engineering, Durham, NC 27708, USA
- Timothy W Dunn
- Duke University, Pratt School of Engineering, Department of Biomedical Engineering, Durham, NC 27708, USA
8
Klein CJMI, Budiman T, Homberg JR, Verma D, Keijer J, van Schothorst EM. Measuring Locomotor Activity and Behavioral Aspects of Rodents Living in the Home-Cage. Front Behav Neurosci 2022; 16:877323. [PMID: 35464142 PMCID: PMC9021872 DOI: 10.3389/fnbeh.2022.877323]
Abstract
Automatization and technological advances have led to a larger number of methods and systems to monitor and measure locomotor activity and more specific behavior of a wide variety of animal species in various environmental conditions in laboratory settings. In rodents, the majority of these systems require the animals to be temporarily taken away from their home-cage into separate observation cage environments, which requires manual handling, evokes distress for the animal, and may alter behavioral responses. An automated high-throughput approach can overcome this problem. Therefore, this review describes existing automated methods and technologies which enable the measurement of locomotor activity and behavioral aspects of rodents in their most meaningful and stress-free laboratory environment: the home-cage. In line with the Directive 2010/63/EU and the 3R principles (replacement, reduction, refinement), this review furthermore assesses their suitability and potential for group-housed conditions as a refinement strategy, highlighting their current technological and practical limitations. It covers electrical capacitance technology and radio-frequency identification (RFID), which focus mainly on voluntary locomotor activity in single and multiple rodents, respectively. Infrared beams and force plates expand the detection beyond locomotor activity toward basic behavioral traits but realize their full potential only in individually housed rodents. Despite the great promise of these approaches for behavioral pattern recognition, more sophisticated methods, such as (RFID-assisted) video tracking technology, need to be applied to enable the automated analysis of advanced behavioral aspects of individual animals in social housing conditions.
Affiliation(s)
- Christian J. M. I. Klein
- Human and Animal Physiology, Wageningen University and Research, Wageningen, Netherlands
- TSE Systems GmbH, Berlin, Germany
- Judith R. Homberg
- Department of Cognitive Neuroscience, Donders Institute for Brain, Cognition and Behavior, Radboud University Medical Center, Nijmegen, Netherlands
- Jaap Keijer
- Human and Animal Physiology, Wageningen University and Research, Wageningen, Netherlands
9
Ebbesen CL, Froemke RC. Automatic mapping of multiplexed social receptive fields by deep learning and GPU-accelerated 3D videography. Nat Commun 2022; 13:593. [PMID: 35105858 PMCID: PMC8807631 DOI: 10.1038/s41467-022-28153-7]
Abstract
Social interactions powerfully impact the brain and the body, but high-resolution descriptions of these important physical interactions and their neural correlates are lacking. Currently, most studies rely on labor-intensive methods such as manual annotation. Scalable and objective tracking methods are required to understand the neural circuits underlying social behavior. Here we describe a hardware/software system and analysis pipeline that combines 3D videography, deep learning, physical modeling, and GPU-accelerated robust optimization, with automatic analysis of neuronal receptive fields recorded in interacting mice. Our system ("3DDD Social Mouse Tracker") is capable of fully automatic multi-animal tracking with minimal errors (including in complete darkness) during complex, spontaneous social encounters, together with simultaneous electrophysiological recordings. We capture posture dynamics of multiple unmarked mice with high spatiotemporal precision (~2 mm, 60 frames/s). A statistical model that relates 3D behavior and neural activity reveals multiplexed 'social receptive fields' of neurons in barrel cortex. Our approach could be broadly useful for neurobehavioral studies of multiple animals interacting in complex low-light environments.
Affiliation(s)
- Christian L Ebbesen
- Skirball Institute of Biomolecular Medicine, New York University School of Medicine, New York, NY, 10016, USA
- Neuroscience Institute, New York University School of Medicine, New York, NY, 10016, USA
- Department of Otolaryngology, New York University School of Medicine, New York, NY, 10016, USA
- Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA
- Center for Neural Science, New York University, New York, NY, 10003, USA
- Robert C Froemke
- Skirball Institute of Biomolecular Medicine, New York University School of Medicine, New York, NY, 10016, USA
- Neuroscience Institute, New York University School of Medicine, New York, NY, 10016, USA
- Department of Otolaryngology, New York University School of Medicine, New York, NY, 10016, USA
- Department of Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA
- Center for Neural Science, New York University, New York, NY, 10003, USA
10
MouseVenue3D: A Markerless Three-Dimension Behavioral Tracking System for Matching Two-Photon Brain Imaging in Free-Moving Mice. Neurosci Bull 2021; 38:303-317. [PMID: 34637091 PMCID: PMC8975979 DOI: 10.1007/s12264-021-00778-6]
Abstract
Understanding the connection between brain and behavior in animals requires precise monitoring of their behaviors in three-dimensional (3-D) space. However, there is no available three-dimensional behavior capture system that focuses on rodents. Here, we present MouseVenue3D, an automated and low-cost system for the efficient capture of 3-D skeleton trajectories in markerless rodents. We improved the most time-consuming step in 3-D behavior capturing by developing an automatic calibration module. Then, we validated this process in behavior recognition tasks, and showed that 3-D behavioral data achieved higher accuracy than 2-D data. Subsequently, MouseVenue3D was combined with fast high-resolution miniature two-photon microscopy for synchronous neural recording and behavioral tracking in the freely-moving mouse. Finally, we successfully decoded spontaneous neuronal activity from the 3-D behavior of mice. Our findings reveal that subtle, spontaneous behavior modules are strongly correlated with spontaneous neuronal activity patterns.
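Decoding neural activity from 3D behavior can be illustrated, in its simplest linear form, as regression from pose features to a neural signal. A toy sketch on synthetic data; the paper's actual decoder and features are not specified here, and this is not their method:

```python
import numpy as np

# Toy linear decoding: predict a neural signal from 3-D pose features.
rng = np.random.default_rng(1)
pose = rng.normal(size=(200, 6))            # frames x pose features (synthetic)
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0, -1.0])
neural = pose @ w_true                      # noiseless synthetic "activity"

# Least-squares fit of decoding weights; with noiseless, full-rank data
# the estimate recovers w_true exactly (up to floating-point precision).
w_hat, *_ = np.linalg.lstsq(pose, neural, rcond=None)
```

Real decoding analyses add noise, regularization, and cross-validated evaluation, but the frames-by-features regression structure is the same.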
11
Mimura K, Nagai Y, Inoue KI, Matsumoto J, Hori Y, Sato C, Kimura K, Okauchi T, Hirabayashi T, Nishijo H, Yahata N, Takada M, Suhara T, Higuchi M, Minamimoto T. Chemogenetic activation of nigrostriatal dopamine neurons in freely moving common marmosets. iScience 2021; 24:103066. [PMID: 34568790 PMCID: PMC8449082 DOI: 10.1016/j.isci.2021.103066]
Abstract
To interrogate particular neuronal pathways in nonhuman primates under natural and stress-free conditions, we applied designer receptors exclusively activated by designer drugs (DREADDs) technology to common marmosets. We injected adeno-associated virus vectors expressing the excitatory DREADD hM3Dq into the unilateral substantia nigra (SN) in four marmosets. Using multi-tracer positron emission tomography imaging, we detected DREADD expression in vivo, which was confirmed in nigrostriatal dopamine neurons by immunohistochemistry, as well as by assessing activation of the SN following agonist administration. The marmosets rotated in a contralateral direction relative to the activated side 30-90 min after consuming food containing the highly potent DREADD agonist deschloroclozapine (DCZ), but not on the following days without DCZ. These results indicate that non-invasive and reversible DREADD manipulation will extend the utility of marmosets as a primate model for linking neuronal activity and natural behavior in various contexts.
Affiliation(s)
- Koki Mimura
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Yuji Nagai
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Ken-ichi Inoue
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, Inuyama, Aichi 484-8506, Japan
- Jumpei Matsumoto
- Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-8555, Japan
- Research Center for Idling Brain Science, University of Toyama, Toyama 930-8555, Japan
- Yukiko Hori
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Chika Sato
- Quantum Life Informatics Group, Institute for Quantum Life Science, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Applied MRI Research, Department of Molecular Imaging and Theranostics, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Kei Kimura
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, Inuyama, Aichi 484-8506, Japan
- Takashi Okauchi
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Toshiyuki Hirabayashi
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Hisao Nishijo
- Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-8555, Japan
- Research Center for Idling Brain Science, University of Toyama, Toyama 930-8555, Japan
- Noriaki Yahata
- Quantum Life Informatics Group, Institute for Quantum Life Science, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Applied MRI Research, Department of Molecular Imaging and Theranostics, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Masahiko Takada
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, Inuyama, Aichi 484-8506, Japan
- Tetsuya Suhara
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
- Makoto Higuchi
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555, Japan
| | - Takafumi Minamimoto
- Department of Functional Brain Imaging, National Institutes for Quantum and Radiological Science and Technology, Chiba 263-8555 Japan
| |
Collapse
12
Jiang Z, Zhou F, Zhao A, Li X, Li L, Tao D, Li X, Zhou H. Multi-View Mouse Social Behaviour Recognition With Deep Graphic Model. IEEE Trans Image Process 2021; 30:5490-5504. PMID: 34048344; DOI: 10.1109/tip.2021.3083079.
Abstract
Home-cage social behaviour analysis of mice is an invaluable tool to assess therapeutic efficacy in neurodegenerative diseases. Despite tremendous efforts within the research community, such analysis still relies mainly on single-camera video recordings. Because of their potential to create rich descriptions of mouse social behaviours, multi-view video recordings for rodent observation are receiving increasing attention. However, identifying social behaviours from multiple views remains challenging due to the lack of correspondence across data sources. To address this problem, we propose a novel multi-view latent-attention and dynamic discriminative model that jointly learns view-specific and view-shared sub-structures, where the former captures the unique dynamics of each view whilst the latter encodes the interaction between views. Furthermore, a novel multi-view latent-attention variational autoencoder model is introduced to learn the acquired features, enabling us to learn discriminative features in each view. Experimental results on the standard CRIM13 dataset and our multi-view Parkinson's Disease Mouse Behaviour (PDMB) dataset demonstrate that our proposed model outperforms other state-of-the-art technologies, has lower computational cost than competing graphical models, and effectively deals with the imbalanced-data problem.
13
Ebbesen CL, Froemke RC. Body language signals for rodent social communication. Curr Opin Neurobiol 2021; 68:91-106. PMID: 33582455; PMCID: PMC8243782; DOI: 10.1016/j.conb.2021.01.008.
Abstract
Integration of social cues to initiate adaptive emotional and behavioral responses is a fundamental aspect of animal and human behavior. In humans, social communication includes prominent nonverbal components, such as social touch, gestures and facial expressions. Comparative studies investigating the neural basis of social communication in rodents have historically centered on olfactory signals and vocalizations, with relatively little focus on nonverbal social cues. Here, we outline two exciting research directions. First, we review recent observations pointing to a role for social facial expressions in rodents. Second, we review observations that point to a role for 'non-canonical' rodent body language: body posture signals beyond the stereotyped displays of aggressive and sexual behavior. In both sections, we outline how social neuroscience can build on recent advances in machine learning, robotics and micro-engineering to push these research directions forward towards a holistic systems neurobiology of rodent body language.
Affiliation(s)
- Christian L Ebbesen: Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA; Center for Neural Science, New York University, New York, NY, 10003, USA
- Robert C Froemke: Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA; Center for Neural Science, New York University, New York, NY, 10003, USA; Howard Hughes Medical Institute Faculty Scholar, USA
14
Liu X, Yu SY, Flierman NA, Loyola S, Kamermans M, Hoogland TM, De Zeeuw CI. OptiFlex: Multi-Frame Animal Pose Estimation Combining Deep Learning With Optical Flow. Front Cell Neurosci 2021; 15:621252. PMID: 34122011; PMCID: PMC8194069; DOI: 10.3389/fncel.2021.621252.
Abstract
Animal pose estimation tools based on deep learning have greatly improved animal behaviour quantification. These tools perform pose estimation on individual video frames but do not account for variability of animal body shape in their prediction and evaluation. Here, we introduce a novel multi-frame animal pose estimation framework, referred to as OptiFlex. This framework integrates a flexible base model (FlexibleBaseline), which accounts for variability in animal body shape, with an OpticalFlow model that incorporates temporal context from nearby video frames. Pose estimation can be optimised using multi-view information to leverage all four dimensions (3D space and time). We evaluate FlexibleBaseline using datasets of four different lab animal species (mouse, fruit fly, zebrafish, and monkey) and introduce an intuitive evaluation metric, the adjusted percentage of correct key points (aPCK). Our analyses show that OptiFlex provides prediction accuracy that outperforms current deep-learning-based tools, highlighting its potential for studying a wide range of behaviours across different animal species.
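The aPCK metric named above can be illustrated with a minimal sketch (an assumed simplification: the paper's exact adjustment term is not reproduced here, and the function name and `alpha` scaling are ours). A predicted keypoint counts as correct when it falls within a tolerance proportional to the animal's apparent body size:

```python
import math

def apck(predictions, ground_truth, body_size, alpha=0.1):
    """Adjusted percentage of correct keypoints (illustrative sketch).

    A predicted keypoint is 'correct' when its Euclidean distance to the
    ground-truth keypoint is at most alpha * body_size, so the tolerance
    scales with the animal's apparent size in the frame.
    """
    correct = 0
    for (px, py), (gx, gy) in zip(predictions, ground_truth):
        if math.hypot(px - gx, py - gy) <= alpha * body_size:
            correct += 1
    return correct / len(predictions)
```

With `body_size=100` and the default `alpha`, a prediction 1 px off counts as correct while one 30 px off does not, so two such keypoints score 0.5.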
Affiliation(s)
- XiaoLe Liu: Faculty of Mathematics, University of Waterloo, Waterloo, ON, Canada
- Si-yang Yu: Department of Neuroscience, Erasmus MC, Rotterdam, Netherlands
- Nico A. Flierman: Department of Neuroscience, Erasmus MC, Rotterdam, Netherlands; Netherlands Institute for Neuroscience, Royal Academy of Arts and Sciences, Amsterdam, Netherlands
- Sebastián Loyola: Netherlands Institute for Neuroscience, Royal Academy of Arts and Sciences, Amsterdam, Netherlands
- Maarten Kamermans: Netherlands Institute for Neuroscience, Royal Academy of Arts and Sciences, Amsterdam, Netherlands; Department of Biomedical Physics and Biomedical Photonics, Amsterdam UMC location AMC, University of Amsterdam, Amsterdam, Netherlands
- Tycho M. Hoogland: Department of Neuroscience, Erasmus MC, Rotterdam, Netherlands; Netherlands Institute for Neuroscience, Royal Academy of Arts and Sciences, Amsterdam, Netherlands
- Chris I. De Zeeuw: Department of Neuroscience, Erasmus MC, Rotterdam, Netherlands; Netherlands Institute for Neuroscience, Royal Academy of Arts and Sciences, Amsterdam, Netherlands
15
Huang K, Han Y, Chen K, Pan H, Zhao G, Yi W, Li X, Liu S, Wei P, Wang L. A hierarchical 3D-motion learning framework for animal spontaneous behavior mapping. Nat Commun 2021; 12:2784. PMID: 33986265; PMCID: PMC8119960; DOI: 10.1038/s41467-021-22970-y.
Abstract
Animal behavior usually has a hierarchical structure and dynamics. Therefore, to understand how the neural system coordinates with behaviors, neuroscientists need a quantitative description of the hierarchical dynamics of different behaviors. However, recent end-to-end machine-learning-based methods for behavior analysis mostly focus on recognizing behavioral identities on a static timescale or from limited observations, and thereby lose rich dynamic information on cross-scale behaviors. Here, inspired by the natural structure of animal behaviors, we address this challenge by proposing a parallel and multi-layered framework that learns the hierarchical dynamics and generates an objective metric for mapping behavior into a feature space. In addition, we characterize animal 3D kinematics with our low-cost and efficient multi-view 3D animal motion-capture system. Finally, we demonstrate that this framework can monitor spontaneous behavior and automatically identify the behavioral phenotypes of a transgenic animal disease model. Extensive experimental results suggest that our framework has a wide range of applications, including phenotyping animal disease models and modeling the relationships between neural circuits and behavior.
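As a toy illustration of the hierarchical idea (not the paper's framework; the function and behavior labels are invented for this sketch), per-frame behavior labels can be collapsed into bouts, the first rung of a multi-layered description of behavioral dynamics:

```python
def to_bouts(frame_labels):
    """Collapse a per-frame label sequence into (label, duration) bouts.

    Consecutive identical frame labels merge into a single bout, turning
    a fine-timescale sequence into a coarser dynamic description.
    """
    bouts = []
    for lab in frame_labels:
        if bouts and bouts[-1][0] == lab:
            bouts[-1] = (lab, bouts[-1][1] + 1)  # extend the current bout
        else:
            bouts.append((lab, 1))               # start a new bout
    return bouts
```

Applying the same merge to bout labels would yield still coarser "phrases", which is the sense in which such descriptions are hierarchical.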
Grants
- This work was supported in part by Key Area R&D Program of Guangdong Province (2018B030338001 P.W., 2018B030331001 L.W.), National Key R&D Program of China (2018YFA0701403 P.W.), National Natural Science Foundation of China (NSFC 31500861 P.W., NSFC 31630031 L.W., NSFC 91732304 L.W., NSFC 31930047 L.W.), Chang Jiang Scholars Program (L.W.), the International Big Science Program Cultivating Project of CAS (172644KYS820170004 L.W.), the Strategic Priority Research Program of Chinese Academy of Science (XDB32030100, L.W.), the Youth Innovation Promotion Association of the Chinese Academy of Sciences (2017413 P.W.), CAS Key Laboratory of Brain Connectome and Manipulation (2019DP173024), Shenzhen Government Basic Research Grants (JCYJ20170411140807570 P.W., JCYJ20170413164535041 L.W.), Science, Technology and Innovation Commission of Shenzhen Municipality (JCYJ20160429185235132 K.H.), Helmholtz-CAS joint research grant (GJHZ1508 L.W.), Guangdong Provincial Key Laboratory of Brain Connectome and Behavior (2017B030301017 L.W.), the Ten Thousand Talent Program (L.W.), the Guangdong Special Support Program (L.W.), Key Laboratory of SIAT (2019DP173024 L.W.), Shenzhen Key Science and Technology Infrastructure Planning Project (ZDKJ20190204002 L.W.).
Affiliation(s)
- Kang Huang, Yaning Han, Ke Chen, Gaoyang Zhao, Wenling Yi, Xiaoxi Li, Pengfei Wei, Liping Wang: Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; University of Chinese Academy of Sciences, Beijing, China
- Hongli Pan: Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Siyuan Liu: Pennsylvania State University, University Park, PA, USA
16
Yoshida T, Yamagata A, Imai A, Kim J, Izumi H, Nakashima S, Shiroshima T, Maeda A, Iwasawa-Okamoto S, Azechi K, Osaka F, Saitoh T, Maenaka K, Shimada T, Fukata Y, Fukata M, Matsumoto J, Nishijo H, Takao K, Tanaka S, Okabe S, Tabuchi K, Uemura T, Mishina M, Mori H, Fukai S. Canonical versus non-canonical transsynaptic signaling of neuroligin 3 tunes development of sociality in mice. Nat Commun 2021; 12:1848. PMID: 33758193; PMCID: PMC7988105; DOI: 10.1038/s41467-021-22059-6.
Abstract
Neuroligin 3 (NLGN3) and neurexins (NRXNs) constitute a canonical transsynaptic cell-adhesion pair that has been implicated in autism. In autism spectrum disorder (ASD), the development of sociality can be impaired. However, the molecular mechanism underlying NLGN3-mediated social development is unclear. Here, we identify non-canonical interactions between NLGN3 and protein tyrosine phosphatase δ (PTPδ) splice variants that compete with NRXN binding. The NLGN3-PTPδ complex structure revealed a splicing-dependent interaction mode and a competition mechanism between PTPδ and NRXNs. Mice carrying an NLGN3 mutation that selectively impairs the NLGN3-NRXN interaction show increased sociability, whereas mice in which the NLGN3-PTPδ interaction is impaired exhibit impaired social behavior and enhanced motor learning, with an imbalance in excitatory/inhibitory synaptic protein expression, as reported in the Nlgn3 R451C autism model. At the neuronal level, the autism-related Nlgn3 R451C mutation causes selective impairment of the non-canonical pathway. Our findings suggest that the canonical and non-canonical NLGN3 pathways compete to regulate the development of sociality.
Affiliation(s)
- Tomoyuki Yoshida: Department of Molecular Neuroscience, Faculty of Medicine, University of Toyama, Toyama, Japan; Research Center for Idling Brain Science, University of Toyama, Toyama, Japan; JST PRESTO, Saitama, Japan
- Ayako Imai: Department of Molecular Neuroscience, Faculty of Medicine, University of Toyama, Toyama, Japan
- Juhyon Kim: Division of Bio-Information Engineering, Faculty of Engineering, University of Toyama, Toyama, Japan
- Hironori Izumi: Department of Molecular Neuroscience, Faculty of Medicine, University of Toyama, Toyama, Japan
- Shogo Nakashima: Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama, Japan
- Tomoko Shiroshima: Department of Anatomy, Kitasato University School of Medicine, Kanagawa, Japan
- Asami Maeda: Research Institute for Diseases of Old Age, Juntendo University Graduate School of Medicine, Tokyo, Japan
- Shiho Iwasawa-Okamoto: Department of Molecular Neuroscience, Faculty of Medicine, University of Toyama, Toyama, Japan
- Kenji Azechi: Department of Molecular Neuroscience, Faculty of Medicine, University of Toyama, Toyama, Japan
- Fumina Osaka: Center for Research and Education on Drug Discovery, Faculty of Pharmaceutical Sciences, Hokkaido University, Sapporo, Japan
- Takashi Saitoh: Center for Research and Education on Drug Discovery, Faculty of Pharmaceutical Sciences, Hokkaido University, Sapporo, Japan
- Katsumi Maenaka: Center for Research and Education on Drug Discovery, Faculty of Pharmaceutical Sciences, Hokkaido University, Sapporo, Japan; Laboratory of Biomolecular Science, Faculty of Pharmaceutical Sciences, Hokkaido University, Sapporo, Japan
- Takashi Shimada: SHIMADZU Bioscience Research Partnership, Innovation Center, Shimadzu Scientific Instruments, Bothell, WA, USA
- Yuko Fukata: Division of Membrane Physiology, Department of Molecular and Cellular Physiology, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Aichi, Japan
- Masaki Fukata: Division of Membrane Physiology, Department of Molecular and Cellular Physiology, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Aichi, Japan
- Jumpei Matsumoto: Research Center for Idling Brain Science, University of Toyama, Toyama, Japan; Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama, Japan
- Hisao Nishijo: Research Center for Idling Brain Science, University of Toyama, Toyama, Japan; Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama, Japan
- Keizo Takao: Research Center for Idling Brain Science, University of Toyama, Toyama, Japan; Life Science Research Center, University of Toyama, Toyama, Japan
- Shinji Tanaka: Department of Cellular Neurobiology, Graduate School of Medicine, The University of Tokyo, Tokyo, Japan
- Shigeo Okabe: Department of Cellular Neurobiology, Graduate School of Medicine, The University of Tokyo, Tokyo, Japan
- Katsuhiko Tabuchi: JST PRESTO, Saitama, Japan; Department of Molecular and Cellular Physiology, Institute of Medicine, Academic Assembly, Shinshu University, Nagano, Japan; Institute for Biomedical Sciences, Interdisciplinary Cluster for Cutting Edge Research, Shinshu University, Nagano, Japan
- Takeshi Uemura: Institute for Biomedical Sciences, Interdisciplinary Cluster for Cutting Edge Research, Shinshu University, Nagano, Japan; Division of Gene Research, Research Center for Supports to Advanced Science, Shinshu University, Nagano, Japan
- Masayoshi Mishina: Brain Science Laboratory, Research Organization of Science and Technology, Ritsumeikan University, Shiga, Japan
- Hisashi Mori: Department of Molecular Neuroscience, Faculty of Medicine, University of Toyama, Toyama, Japan; Research Center for Idling Brain Science, University of Toyama, Toyama, Japan
- Shuya Fukai: Department of Chemistry, Graduate School of Science, Kyoto University, Kyoto, Japan
17
Improved 3D tracking and automated classification of rodents' behavioral activity using depth-sensing cameras. Behav Res Methods 2021; 52:2156-2167. PMID: 32232737; DOI: 10.3758/s13428-020-01381-9.
Abstract
Analysis of rodents' behavior/activity is of fundamental importance in many research fields. However, many behavioral experiments still rely on manual scoring, with obvious problems for reproducibility. Despite important advances in video-analysis systems and computational ethology, automated behavior quantification is still a challenge. The need for large training datasets, background-stability requirements, and reduction to two-dimensional analysis (impairing full posture characterization) limit their use. Here we present a novel integrated solution for the behavioral analysis of individual rats, combining video segmentation, tracking of body parts, and automated classification of behaviors, using machine learning and computer vision methods. Low-cost depth cameras (RGB-D) are used to enable three-dimensional tracking and classification in dark conditions and in the absence of color contrast. Our solution automatically tracks five anatomical landmarks in dynamic environments and recognizes seven distinct behaviors within the accuracy range of human annotations. The developed free software was validated in experiments where behavioral differences between Wistar Kyoto and Wistar rats were automatically quantified. The results reveal its capability for effective automated phenotyping. An extended annotated RGB-D dataset is also made publicly available. The proposed solution is an easy-to-use tool with a low-cost setup and powerful 3D segmentation methods (in static and dynamic environments). The ability to work in dark conditions means that natural animal behavior is not affected by recording lights. Furthermore, automated classification is possible with only ~30 minutes of annotated video. By creating conditions for high-throughput analysis and reproducible quantitative measurement of animal behavior experiments, we believe this contribution can greatly improve behavioral analysis research.
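The core advantage of an overhead depth camera can be sketched in a few lines (a minimal illustration under our own assumptions, not the paper's pipeline): subtracting a static empty-cage depth background yields a foreground mask and a trackable centroid even in darkness and without color contrast:

```python
def segment_foreground(depth_frame, background, min_height=5):
    """Segment pixels rising above a static depth background.

    depth_frame and background are 2D lists of camera-to-surface
    distances (e.g. mm); a pixel belongs to the animal when it sits at
    least min_height closer to the camera than the empty-cage background.
    """
    mask = []
    for row_f, row_b in zip(depth_frame, background):
        mask.append([1 if (b - f) >= min_height else 0
                     for f, b in zip(row_f, row_b)])
    return mask

def centroid(mask):
    """Centroid (row, col) of the foreground mask, or None if empty."""
    pts = [(r, c) for r, row in enumerate(mask)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(r for r, _ in pts) / len(pts),
            sum(c for _, c in pts) / len(pts))
```

Because the signal is geometric rather than photometric, the same two functions work identically under visible light or in the dark.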
18
Otabi H, Okayama T, Toyoda A. Assessment of nest building and social interaction behavior in mice exposed to acute social defeat stress using a three-dimensional depth camera. Anim Sci J 2020; 91:e13447. PMID: 32902039; DOI: 10.1111/asj.13447.
Abstract
Nest building is an instinctive behavior that serves protection from predators, body-temperature regulation, and courtship. Previously, we discovered that acute and chronic social defeat stress suppresses the onset of nest-building behavior in male mice (C57BL/6J). Here, we analyzed nest building and other behavioral deficits induced by acute social defeat stress (ASDS). We utilized a customized cage and purpose-built observational programs for nest building, social avoidance, and other behaviors, using an infrared depth camera to acquire three-dimensional (3D) data of animal behavior (the Negura system). We determined the volume of nesting materials from these 3D depth images. Mice exposed to ASDS showed increased spontaneous activity, decreased rearing, and delayed nest building; however, nest-building activity gradually recovered during the dark period of the 24 hr observation interval. At the endpoint after 24 hr, the ASDS and control groups showed no difference in nest volume. Furthermore, we observed the time courses of both nest-building and social avoidance behaviors and their relationship using the Negura system. Our data demonstrated a weak positive correlation between nest-building delay and social avoidance in ASDS mice. The Negura system can observe various behaviors that reflect the effects of social defeat stress.
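Estimating nest-material volume from overhead depth images can be sketched as follows (an illustrative calculation under our own assumptions; the Negura system's actual processing is not described at this level of detail in the abstract). Each pixel's material height is the drop in measured distance relative to the empty cage, and summing height times pixel footprint approximates volume:

```python
def nest_volume_mm3(depth_empty, depth_now, pixel_area_mm2=1.0):
    """Approximate nesting-material volume from two overhead depth images.

    depth_empty and depth_now are 2D lists of camera-to-surface distances
    in mm. Material height per pixel is the reduction in distance versus
    the empty cage; negative differences (sensor noise) are clipped to 0.
    """
    volume = 0.0
    for row_e, row_n in zip(depth_empty, depth_now):
        for e, n in zip(row_e, row_n):
            volume += max(e - n, 0.0) * pixel_area_mm2
    return volume
```

Tracking this quantity over time gives the kind of nest-building time course the study correlates with social avoidance.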
Affiliation(s)
- Hikari Otabi: College of Agriculture, Ibaraki University, Ami, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu, Japan
- Tsuyoshi Okayama: College of Agriculture, Ibaraki University, Ami, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami, Japan
- Atsushi Toyoda: College of Agriculture, Ibaraki University, Ami, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami, Japan
19
Nourizonoz A, Zimmermann R, Ho CLA, Pellat S, Ormen Y, Prévost-Solié C, Reymond G, Pifferi F, Aujard F, Herrel A, Huber D. EthoLoop: automated closed-loop neuroethology in naturalistic environments. Nat Methods 2020; 17:1052-1059. PMID: 32994566; DOI: 10.1038/s41592-020-0961-2.
Abstract
Accurate tracking and analysis of animal behavior is crucial for modern systems neuroscience. However, following freely moving animals in naturalistic, three-dimensional (3D) or nocturnal environments remains a major challenge. Here, we present EthoLoop, a framework for studying the neuroethology of freely roaming animals. Combining real-time optical tracking and behavioral analysis with remote-controlled stimulus-reward boxes, this system allows direct interactions with animals in their habitat. EthoLoop continuously provides close-up views of the tracked individuals and thus allows high-resolution behavioral analysis using deep-learning methods. The behaviors detected on the fly can be automatically reinforced either by classical conditioning or by optogenetic stimulation via wirelessly controlled portable devices. Finally, by combining 3D tracking with wireless neurophysiology we demonstrate the existence of place-cell-like activity in the hippocampus of freely moving primates. Taken together, we show that the EthoLoop framework enables interactive, well-controlled and reproducible neuroethological studies in large-field naturalistic settings.
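The closed-loop principle described above can be reduced to a one-step sketch (purely illustrative; this is not EthoLoop's API, and `classify`, `deliver_reward`, and the "rearing" label are invented for the example): each tracked close-up frame is classified on the fly, and detection of the target behavior immediately triggers reinforcement:

```python
def closed_loop_step(observation, classify, deliver_reward, target="rearing"):
    """One iteration of a closed-loop conditioning sketch.

    classify maps the current close-up observation to a behavior label;
    when it matches the target, reinforcement is delivered at once
    (e.g. opening a remote-controlled stimulus-reward box).
    """
    behavior = classify(observation)
    if behavior == target:
        deliver_reward()
        return True
    return False
```

In a real system the loop would run at frame rate on the tracked individual, which is why low classification latency matters for conditioning.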
Affiliation(s)
- Ali Nourizonoz: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
- Robert Zimmermann: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
- Chun Lum Andy Ho: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
- Sebastien Pellat: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
- Yannick Ormen: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
- Gilles Reymond: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
- Fabien Pifferi: Musée National d'Histoire Naturelle, Adaptive Mechanisms and Evolution, UMR7179-CNRS, Paris, France
- Fabienne Aujard: Musée National d'Histoire Naturelle, Adaptive Mechanisms and Evolution, UMR7179-CNRS, Paris, France
- Anthony Herrel: Musée National d'Histoire Naturelle, Adaptive Mechanisms and Evolution, UMR7179-CNRS, Paris, France
- Daniel Huber: University of Geneva, Department of Basic Neurosciences, Geneva, Switzerland
20
Automated Measures of Force and Motion Can Improve Our Understanding of Infants’ Motor Persistence. Journal of Motor Learning and Development 2020. DOI: 10.1123/jmld.2019-0010.
Abstract
Every day, young learners are confronted with challenges. The degree to which they persist in overcoming those challenges, and the different ways they persist, provides critical insights into the various cognitive, motoric, and affective processes that drive behavior. Here, we present a systematic overview of the methodologies that have been traditionally used to study persistence, and offer suggestions for new approaches to the study of persistence that will make strides in moving the field forward. We argue that automated measures of force and motion, which have long been used in the study of infants’ motoric behavior, can provide a means to unravel the psychological processes that guide infants’ trying behavior. To illustrate this, we present a case study that highlights the novel lessons to be learned by the use of automated measures of force and motion regarding infants’ persistence, along with an analysis of the benefits and drawbacks of this approach, as well as detailed instructions for application. In sum, we conclude that these measures, when used in conjunction with more traditional approaches, will provide creative new insights into the nature and development of early persistence.
21
Ryait H, Bermudez-Contreras E, Harvey M, Faraji J, Mirza Agha B, Gomez-Palacio Schjetnan A, Gruber A, Doan J, Mohajerani M, Metz GAS, Whishaw IQ, Luczak A. Data-driven analyses of motor impairments in animal models of neurological disorders. PLoS Biol 2019; 17:e3000516. [PMID: 31751328] [PMCID: PMC6871764] [DOI: 10.1371/journal.pbio.3000516]
Abstract
Behavior provides important insights into neuronal processes. For example, analysis of reaching movements can give a reliable indication of the degree of impairment in neurological disorders such as stroke, Parkinson disease, or Huntington disease. The analysis of such movement abnormalities is notoriously difficult and requires a trained evaluator. Here, we show that a deep neural network is able to score behavioral impairments with expert accuracy in rodent models of stroke. The same network was also trained to successfully score movements in a variety of other behavioral tasks. The neural network also uncovered novel movement alterations related to stroke, which had higher predictive power for stroke volume than the movement components defined by human experts. Moreover, when the regression network was trained only on categorical information (control = 0; stroke = 1), it generated predictions with intermediate values between 0 and 1 that matched the human expert scores of stroke severity. The network thus offers a new data-driven approach to automatically deriving ratings of motor impairments. Altogether, this network can provide a reliable neurological assessment and can assist the design of behavioral indices to diagnose and monitor neurological disorders.
Affiliation(s)
- Hardeep Ryait: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Edgar Bermudez-Contreras: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Matthew Harvey: Coastline Automation, San Jose, California, United States of America
- Jamshid Faraji: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada; Faculty of Nursing & Midwifery, Golestan University of Medical Sciences, Gorgan, Iran
- Behroo Mirza Agha: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Aaron Gruber: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Jon Doan: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Majid Mohajerani: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Gerlinde A. S. Metz: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Ian Q. Whishaw: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
- Artur Luczak: Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Alberta, Canada
22
Rodent Activity Detector (RAD), an Open Source Device for Measuring Activity in Rodent Home Cages. eNeuro 2019; 6:ENEURO.0160-19.2019. [PMID: 31235468] [PMCID: PMC6620392] [DOI: 10.1523/eneuro.0160-19.2019]
Abstract
Physical activity is a critical behavioral variable in many research studies and is, therefore, important to quantify. However, existing methods for measuring physical activity have limitations, including high expense, specialized caging or equipment, and high computational overhead. To address these limitations, we present an open-source, cost-effective device for measuring rodent activity. Our device is battery powered and designed to be placed in vivarium home cages to enable high-throughput, long-term operation with minimal investigator intervention. The primary aim of this study was to assess the feasibility of using passive infrared (PIR) sensors and microcontroller-based dataloggers in rodent home cages to collect physical activity records. To this end, we developed an open-source PIR-based data-logging device called the rodent activity detector (RAD). We publish the design files and code so others can readily build the RAD in their own labs. To demonstrate its utility, we used the RAD to collect physical activity data from 40 individually housed mice for up to 10 weeks. This dataset demonstrates the ability of the RAD to (1) operate in a high-throughput installation, (2) detect high-fat diet (HFD)-induced changes in physical activity, and (3) quantify circadian rhythms in individual animals. We further validated the data output of the RAD with simultaneous video tracking of mice in multiple caging configurations to determine the features of physical activity that it detects. The RAD is easy to build, economical, and fits in vivarium caging. The scalability of such devices will enable high-throughput studies of physical activity in research studies.
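As an illustration of the kind of circadian analysis such PIR event logs support, the sketch below bins event timestamps into an hour-of-day profile and compares an assumed dark phase against the rest. This is not the published RAD code; the function names and the dark-phase hours are hypothetical.

```python
def hourly_profile(events):
    """Bin PIR activity events (timestamps in seconds from midnight)
    into a 24-slot hour-of-day activity profile."""
    profile = [0] * 24
    for t in events:
        profile[int(t // 3600) % 24] += 1
    return profile

def dark_light_ratio(profile, dark_hours=range(12, 24)):
    """Ratio of dark-phase to light-phase event counts; the dark-phase
    hours here are an arbitrary illustrative choice."""
    dark = sum(profile[h] for h in dark_hours)
    light = sum(profile) - dark
    return dark / light if light else float("inf")
```

For example, events at 0 s, 10 s, 3600 s, 7200 s, and 7300 s land in hours 0, 0, 1, 2, and 2 of the profile.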
23
Nath T, Mathis A, Chen AC, Patel A, Bethge M, Mathis MW. Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nat Protoc 2019; 14:2152-2176. [PMID: 31227823] [DOI: 10.1038/s41596-019-0176-0]
Abstract
Noninvasive behavioral tracking of animals during experiments is critical to many scientific pursuits. Extracting the poses of animals without using markers is often essential to measuring behavioral effects in biomechanics, genetics, ethology, and neuroscience. However, extracting detailed poses without markers in dynamically changing backgrounds has been challenging. We recently introduced an open-source toolbox called DeepLabCut that builds on a state-of-the-art human pose-estimation algorithm to allow a user to train a deep neural network with limited training data to precisely track user-defined features that match human labeling accuracy. Here, we provide an updated toolbox, developed as a Python package, that includes new features such as graphical user interfaces (GUIs), performance improvements, and active-learning-based network refinement. We provide a step-by-step procedure for using DeepLabCut that guides the user in creating a tailored, reusable analysis pipeline with a graphical processing unit (GPU) in 1-12 h (depending on frame size). Additionally, we provide Docker environments and Jupyter Notebooks that can be run on cloud resources such as Google Colaboratory.
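Tracking accuracy of the kind referenced above ("comparable to human accuracy") is commonly reported as the mean Euclidean distance, in pixels, between predicted and human-labeled keypoints. A minimal, self-contained sketch of that metric (not part of the DeepLabCut package):

```python
import math

def mean_pixel_error(pred, truth):
    """Mean Euclidean distance (pixels) between predicted keypoints
    and human-labeled ground truth, given as (x, y) pairs."""
    dists = [math.hypot(px - tx, py - ty)
             for (px, py), (tx, ty) in zip(pred, truth)]
    return sum(dists) / len(dists)
```

For instance, predictions `[(0, 0), (3, 4)]` against labels `[(0, 0), (0, 0)]` give errors 0 and 5, so a mean of 2.5 pixels.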
Affiliation(s)
- Tanmay Nath: Rowland Institute at Harvard, Harvard University, Cambridge, MA, USA
- Alexander Mathis: Rowland Institute at Harvard, Harvard University, Cambridge, MA, USA; Department of Molecular & Cellular Biology, Harvard University, Cambridge, MA, USA
- An Chi Chen: Department of Electrical Engineering, University of Cape Town, Cape Town, South Africa
- Amir Patel: Department of Electrical Engineering, University of Cape Town, Cape Town, South Africa
- Matthias Bethge: Tübingen AI Center & Centre for Integrative Neuroscience, Eberhard Karls Universität Tübingen, Tübingen, Germany
24
SEXRAT MALE: A smartphone and tablet application to annotate and process live sexual behavior in male rodents. J Neurosci Methods 2019; 320:9-15. [DOI: 10.1016/j.jneumeth.2019.03.001]
25
Aharoni D, Hoogland TM. Circuit Investigations With Open-Source Miniaturized Microscopes: Past, Present and Future. Front Cell Neurosci 2019; 13:141. [PMID: 31024265] [PMCID: PMC6461004] [DOI: 10.3389/fncel.2019.00141]
Abstract
The ability to simultaneously image the spatiotemporal activity signatures of many neurons during unrestrained vertebrate behaviors has become possible through the development of miniaturized fluorescence microscopes, or miniscopes, sufficiently light to be carried by small animals such as bats, birds and rodents. Miniscopes have permitted the study of circuits underlying song vocalization, action sequencing, head-direction tuning, spatial memory encoding and sleep, to name a few. The foundation for these microscopes has been laid over the last two decades through academic research, with some of this work resulting in commercialization. More recently, open-source initiatives have led to an even broader adoption of miniscopes in the neuroscience community. Open-source designs allow for rapid modification and extension of their function, which has resulted in a new generation of miniscopes that now permit wire-free or wireless recording, concurrent electrophysiology and imaging, two-color fluorescence detection, simultaneous optical actuation and read-out, as well as wide-field and volumetric light-field imaging. These novel miniscopes will further expand the toolset of those seeking affordable methods to probe neural circuit function during naturalistic behaviors. Here, we discuss the early development, present use and future potential of miniscopes.
Affiliation(s)
- Daniel Aharoni: Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, United States
- Tycho M Hoogland: Department of Neuroscience, Erasmus Medical Center, Rotterdam, Netherlands; Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences, Amsterdam, Netherlands
26
DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat Neurosci 2018; 21:1281-1289. [PMID: 30127430] [DOI: 10.1038/s41593-018-0209-y]
Abstract
Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
27
Rachinas-Lopes P, Ribeiro R, dos Santos ME, Costa RM. D-Track: a semi-automatic 3D video-tracking technique to analyse movements and routines of aquatic animals with application to captive dolphins. PLoS One 2018; 13:e0201614. [PMID: 30114265] [PMCID: PMC6095516] [DOI: 10.1371/journal.pone.0201614]
Abstract
Scoring and tracking animal movements manually is a time-consuming and subjective process, susceptible to errors due to fatigue. Automated and semi-automated video-based tracking methods have been developed to overcome the errors and biases of manual analyses. In this manuscript we present D-Track, an open-source semi-automatic tracking system able to quantify the 3D trajectories of dolphins, non-invasively, in the water. This software produces a three-dimensional reconstruction of the pool and tracks the animal at different depths, using standard cameras. D-Track allows the determination of spatial preferences of the animals, their speed and its variations, and the identification of behavioural routines. We tested the system with two captive dolphins during different periods of the day. Both animals spent around 85% of the time at the surface of the Deep Area of their pool (5-meter depth). Both dolphins showed a stable average speed throughout 31 sessions, with slow speeds predominant (maximum 1.7 m/s). Circular swimming was highly variable, with significant differences in the size and duration of the “circles” between animals, within animals, and across sessions. The D-Track system is a novel tool to study the behaviour of aquatic animals, and it represents a convenient and inexpensive solution for laboratories and marine parks to monitor the preferences and routines of their animals.
Affiliation(s)
- Patrícia Rachinas-Lopes: Champalimaud Neuroscience Programme, Champalimaud Center for the Unknown, Lisboa, Portugal; MARE – Marine and Environmental Sciences Centre, ISPA – Instituto Universitário, Lisboa, Portugal
- Ricardo Ribeiro: Champalimaud Neuroscience Programme, Champalimaud Center for the Unknown, Lisboa, Portugal
- Manuel E. dos Santos: MARE – Marine and Environmental Sciences Centre, ISPA – Instituto Universitário, Lisboa, Portugal
- Rui M. Costa: Champalimaud Neuroscience Programme, Champalimaud Center for the Unknown, Lisboa, Portugal
28
Wang Z, Mirbozorgi SA, Ghovanloo M. An automated behavior analysis system for freely moving rodents using depth image. Med Biol Eng Comput 2018; 56:1807-1821. [PMID: 29560548] [DOI: 10.1007/s11517-018-1816-1]
Abstract
A rodent behavior analysis system is presented, capable of automated tracking, pose estimation, and recognition of nine behaviors in freely moving animals. The system tracks three key points on the rodent body (nose, center of body, and base of tail) to estimate its pose and head rotation angle in real time. A support vector machine (SVM)-based model, including label optimization steps, is trained to classify on a frame-by-frame basis: resting, walking, bending, grooming, sniffing, rearing supported, rearing unsupported, micro-movements, and "other" behaviors. Compared to conventional red-green-blue (RGB) camera-based methods, the proposed system operates on 3D depth images provided by the Kinect infrared (IR) camera, enabling stable performance regardless of lighting conditions and animal color contrast with the background. This is particularly beneficial for monitoring nocturnal animals' behavior. 3D features are designed to be extracted directly from the depth stream and combined with contour-based 2D features to further improve recognition accuracies. The system is validated on three freely behaving rats for 168 min in total. The behavior recognition model achieved a cross-validation accuracy of 86.8% on the rat used for training and accuracies of 82.1% and 83% on the other two "testing" rats. The automated head angle estimation aided by behavior recognition resulted in a 0.76 correlation with human expert annotation.
Graphical abstract: Top view of a rat freely behaving in a standard homecage, captured by Kinect-v2 sensors. The depth image is used for constructing a 3D topography of the animal for pose estimation, behavior recognition, and head angle calculation. Results of the processed data are displayed on the user interface in various forms.
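One common way to derive a head-rotation angle from three tracked key points (nose, body center, tail base) is the signed angle between the body-to-nose and body-to-tail vectors. The sketch below illustrates that geometry only; it is an assumption about the computation, not the authors' implementation.

```python
import math

def head_angle(nose, body, tail):
    """Signed angle (degrees) between the body->nose and body->tail
    vectors, a simple proxy for head rotation from 2D key points."""
    ax, ay = nose[0] - body[0], nose[1] - body[1]   # body -> nose
    bx, by = tail[0] - body[0], tail[1] - body[1]   # body -> tail
    dot = ax * bx + ay * by
    cross = ax * by - ay * bx
    return math.degrees(math.atan2(cross, dot))
```

A straight posture (nose and tail on opposite sides of the body center) gives 180 degrees; a right-angle head turn gives 90.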
Affiliation(s)
- Zheyuan Wang: GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, 30308, USA
- S Abdollah Mirbozorgi: GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, 30308, USA
- Maysam Ghovanloo: GT-Bionics Lab, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, 30308, USA
29
Integrated Change Detection and Classification in Urban Areas Based on Airborne Laser Scanning Point Clouds. Sensors 2018; 18:448. [PMID: 29401656] [PMCID: PMC5855963] [DOI: 10.3390/s18020448]
Abstract
This paper suggests a new approach for change detection (CD) in 3D point clouds. It combines classification and CD in one step using machine learning. The point cloud data of both epochs are merged for computing features of four types: features describing the point distribution, a feature relating to relative terrain elevation, features specific to the multi-target capability of laser scanning, and features combining the point clouds of both epochs to identify change. All these features are attached to the points, and training samples are then acquired to create the model for supervised classification, which is applied to the whole study area. The final results reach an overall accuracy of over 90% for both epochs across eight classes: lost tree, new tree, lost building, new building, changed ground, unchanged building, unchanged tree, and unchanged ground.
30
A Markerless 3D Computerized Motion Capture System Incorporating a Skeleton Model for Monkeys. PLoS One 2016; 11:e0166154. [PMID: 27812205] [PMCID: PMC5094601] [DOI: 10.1371/journal.pone.0166154]
Abstract
In this study, we propose a novel markerless motion capture system (MCS) for monkeys, in which 3D surface images of monkeys were reconstructed by integrating data from four depth cameras, and a skeleton model of the monkey was fitted onto the 3D images in each frame of the video. To validate the MCS, first, estimated 3D positions of body parts were compared between MCS-assisted estimation and manual estimation based on visual inspection while a monkey performed a shuttling behavior in which it had to avoid obstacles in various positions. The mean estimation errors of the positions of body parts (3-14 cm) and of head rotation (35-43°) between the MCS-assisted and manual estimation were comparable to the errors between two different experimenters performing manual estimation. Furthermore, the MCS could identify specific monkey actions, and there were no false positive or false negative detections of actions compared with manual estimation. Second, to check the reproducibility of MCS-assisted estimation, the same analyses of the above experiments were repeated by a different user. The estimation errors of positions of most body parts between the two experimenters were significantly smaller in the MCS-assisted estimation than in the manual estimation. Third, effects of methamphetamine (MAP) administration on the spontaneous behaviors of four monkeys were analyzed using the MCS. MAP significantly increased head movements, tended to decrease locomotion speed, and had no significant effect on total path length. The results were comparable to previous human clinical data. Furthermore, estimated data following MAP injection (total path length, walking speed, and speed of head rotation) correlated significantly between the two experimenters in the MCS-assisted estimation (r = 0.863 to 0.999).
The results suggest that the presented MCS in monkeys is useful in investigating neural mechanisms underlying various psychiatric disorders and developing pharmacological interventions.
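The inter-rater agreement reported here (r between two experimenters' session estimates) is a Pearson correlation. A minimal stdlib sketch of that statistic, with hypothetical data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two raters' per-session estimates
    (e.g., total path length scored by two experimenters)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```

Two raters whose scores are perfectly proportional give r = 1.0; perfectly opposed rankings give r = -1.0.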
31
Matsumoto J, Nishimaru H, Takamura Y, Urakawa S, Ono T, Nishijo H. Amygdalar Auditory Neurons Contribute to Self-Other Distinction during Ultrasonic Social Vocalization in Rats. Front Neurosci 2016; 10:399. [PMID: 27703429] [PMCID: PMC5028407] [DOI: 10.3389/fnins.2016.00399]
Abstract
Although clinical studies have reported hyperactivation of the auditory system and amygdala in patients with auditory hallucinations (hearing others' but not one's own voice, independent of any external stimulus), the neural mechanisms of self/other attribution are not well understood. We recorded neuronal responses in the dorsal amygdala, including the lateral amygdaloid nucleus, to ultrasonic vocalizations (USVs) emitted by subjects and conspecifics during free social interaction in 16 adult male rats. The animals emitting the USVs were identified by EMG recordings. One-quarter of the amygdalar neurons (15/60) responded to 50 kHz calls by the subject and/or conspecifics. Among the responsive neurons, most (Type-Other neurons; 73%, 11/15) responded only to calls by conspecifics but not subjects. Two Type-Self neurons (13%, 2/15) responded to calls by the subject but not those by conspecifics, although their response selectivity to subjects vs. conspecifics was lower than that of Type-Other neurons. The remaining two neurons (13%) responded to calls by both the subject and conspecifics. Furthermore, population coding of the amygdalar neurons represented the distinction between subject and conspecific calls. The present results provide the first neurophysiological evidence that the amygdala discriminately represents affective social calls by subject and conspecifics. These findings suggest that the amygdala is an important brain region for self/other attribution. Furthermore, pathological activation of the amygdala, where Type-Other neurons predominate, could induce external misattribution of percepts of vocalization.
Affiliation(s)
- Jumpei Matsumoto: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Toyama, Japan
- Hiroshi Nishimaru: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Toyama, Japan
- Yusaku Takamura: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Toyama, Japan
- Susumu Urakawa: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Toyama, Japan
- Taketoshi Ono: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Toyama, Japan
- Hisao Nishijo: System Emotional Science, Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, Toyama, Japan
32
Novel approach to automatically classify rat social behavior using a video tracking system. J Neurosci Methods 2016; 268:163-170. [DOI: 10.1016/j.jneumeth.2016.02.020]
33
Barnard S, Calderara S, Pistocchi S, Cucchiara R, Podaliri-Vulpiani M, Messori S, Ferri N. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour. PLoS One 2016; 11:e0158748. [PMID: 27415814] [PMCID: PMC4944961] [DOI: 10.1371/journal.pone.0158748]
Abstract
Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming, and the accuracy and reliability of the output rely on the experience and background of the observers. Recent advances in video technology and computer image processing provide the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data using structured machine-learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can later be labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals can be assessed. The software's accuracy in correctly detecting the dogs’ behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals’ quality of life in confinement as well as save time and resources. This 3D framework was designed to be invariant to the dog’s shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing.
The computer vision technique applied to this software is innovative in non-human animal behaviour science. Further improvements and validation are needed, and future applications and limitations are discussed.
Affiliation(s)
- Shanis Barnard: Istituto Zooprofilattico Sperimentale dell’Abruzzo e del Molise, Teramo, Italy
- Simone Calderara: Engineering Department “Enzo Ferrari”, University of Modena and Reggio Emilia, Modena, Italy
- Simone Pistocchi: Engineering Department “Enzo Ferrari”, University of Modena and Reggio Emilia, Modena, Italy
- Rita Cucchiara: Engineering Department “Enzo Ferrari”, University of Modena and Reggio Emilia, Modena, Italy
- Stefano Messori: Istituto Zooprofilattico Sperimentale dell’Abruzzo e del Molise, Teramo, Italy
- Nicola Ferri: Istituto Zooprofilattico Sperimentale dell’Abruzzo e del Molise, Teramo, Italy
34
Goto T, Tomonaga S, Okayama T, Toyoda A. Murine Depression Model and its Potential Applications for Discovering Foods and Farm Products with Antidepressant-Like Effects. Front Neurosci 2016; 10:72. [PMID: 26973450] [PMCID: PMC4771721] [DOI: 10.3389/fnins.2016.00072]
Abstract
Advanced societies face increased health problems related to various stresses. Chronic psychological stress is a major risk factor for psychiatric disorders such as depression. Although therapeutic agents reduce several symptoms of depression, most have side effects in a broad range of the population. Furthermore, some victims of depression do not show significant improvement with any drugs, so alternative approaches are needed. Good dietary habits may potentially reduce depressive symptoms, but there is little scientific evidence thus far. Murine depression models are useful to test nutritional approaches in vivo. Our model mice subjected to a subchronic mild social defeat stress (sCSDS) paradigm show several alterations in physiological parameters and social behavior. These stress-induced symptoms in sCSDS mice can be used as cues to identify antidepressant-like natural resources, including foods and farm products. We previously found that changes in dietary conditions can make sCSDS mice more vulnerable to social stress. In addition, we developed a more objective system for analyzing mouse behavior using a 3D depth-sensing camera to understand relationships between diet and behavior. The combination of sCSDS mice with 3D behavioral analysis is a powerful method for screening ingredients in foods and farm products for antidepressant-like effects.
Affiliation(s)
- Tatsuhiko Goto: Department of Biological Production Science, College of Agriculture, Ibaraki University, Ami, Ibaraki, Japan; Department of Biological Production Science, Ibaraki University Cooperation between Agriculture and Medical Science, Ami, Ibaraki, Japan
- Shozo Tomonaga: Graduate School of Agriculture, Kyoto University, Kyoto, Japan
- Tsuyoshi Okayama: Department of Biological Production Science, College of Agriculture, Ibaraki University, Ami, Ibaraki, Japan; Department of Biological Production Science, Ibaraki University Cooperation between Agriculture and Medical Science, Ami, Ibaraki, Japan; Department of Biological Production Science, United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu, Japan
- Atsushi Toyoda: Department of Biological Production Science, College of Agriculture, Ibaraki University, Ami, Ibaraki, Japan; Department of Biological Production Science, Ibaraki University Cooperation between Agriculture and Medical Science, Ami, Ibaraki, Japan; Department of Biological Production Science, United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu, Japan
35
36
Hong W, Kennedy A, Burgos-Artizzu XP, Zelikowsky M, Navonne SG, Perona P, Anderson DJ. Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning. Proc Natl Acad Sci U S A 2015; 112:E5351-E5360. [PMID: 26354123] [PMCID: PMC4586844] [DOI: 10.1073/pnas.1515982112]
Abstract
A lack of automated, quantitative, and accurate assessment of social behaviors in mammalian animal models has limited progress toward understanding mechanisms underlying social interactions and their disorders such as autism. Here we present a new integrated hardware and software system that combines video tracking, depth sensing, and machine learning for automatic detection and quantification of social behaviors involving close and dynamic interactions between two mice of different coat colors in their home cage. We designed a hardware setup that integrates traditional video cameras with a depth camera, developed computer vision tools to extract the body "pose" of individual animals in a social context, and used a supervised learning algorithm to classify several well-described social behaviors. We validated the robustness of the automated classifiers in various experimental settings and used them to examine how genetic background, such as that of Black and Tan Brachyury (BTBR) mice (a previously reported autism model), influences social behavior. Our integrated approach allows for rapid, automated measurement of social behaviors across diverse experimental designs and also affords the ability to develop new, objective behavioral metrics.
Affiliation(s)
- Weizhe Hong, Division of Biology and Biological Engineering 156-29, Howard Hughes Medical Institute, California Institute of Technology, Pasadena, CA 91125
- Ann Kennedy, Division of Biology and Biological Engineering 156-29, Howard Hughes Medical Institute, California Institute of Technology, Pasadena, CA 91125
- Xavier P Burgos-Artizzu, Division of Engineering and Applied Sciences 136-93, California Institute of Technology, Pasadena, CA 91125
- Moriel Zelikowsky, Division of Biology and Biological Engineering 156-29, Howard Hughes Medical Institute, California Institute of Technology, Pasadena, CA 91125
- Santiago G Navonne, Division of Engineering and Applied Sciences 136-93, California Institute of Technology, Pasadena, CA 91125
- Pietro Perona, Division of Engineering and Applied Sciences 136-93, California Institute of Technology, Pasadena, CA 91125
- David J Anderson, Division of Biology and Biological Engineering 156-29, Howard Hughes Medical Institute, California Institute of Technology, Pasadena, CA 91125
37
Goto T, Okayama T, Toyoda A. Strain differences in temporal changes of nesting behaviors in C57BL/6N, DBA/2N, and their F1 hybrid mice assessed by a three-dimensional monitoring system. Behav Processes 2015. PMID: 26220275. DOI: 10.1016/j.beproc.2015.07.007.
Abstract
Nest building is one of the innate behaviors widely observed throughout the animal kingdom. Previous studies have reported specific brain regions and genetic loci associated with nest building in mice, but these studies mainly evaluated the nest structure without observing the nesting process. In this study, we evaluated the effects of strain and learning on the nesting process of mice using a 3D depth camera. To determine the quality of the nest structure, a conventional scoring method (Deacon scores 1-5) was applied to the recorded depth images. The final score of the nest, the latency to start nesting behavior, and the latencies to reach Deacon scores 3-5 were determined for three genetically different mouse strains: C57BL/6NCrl (B6), DBA/2NCrlCrlj (DBA), and B6D2F1/Crl (B6D2F1). The final score of the DBA nest was significantly lower than that of the B6D2F1 nest, and DBA mice showed significantly longer latency to start nest building than the other two strains in the first trial. By observing the time course of nest building, we confirmed that DBA mice took significantly longer to build their nests than B6 and B6D2F1 mice. Although we did not find any significant differences between DBA and B6 mice in the final assessment of the nest based on the Deacon method, overnight monitoring of nesting behavior using a 3D depth camera elucidated clear differences in the amount of time spent nesting between DBA and B6 mice. In addition, the learning effect was more evident in DBA mice than in B6 mice in terms of latencies to reach Deacon scores 3-5 across five repeated trials: DBA mice showed a gradual decrease in latency to build, whereas the nesting behaviors of B6 mice were relatively consistent throughout the five trials. Our 3D depth-image method therefore provides higher-resolution structural information regarding the nesting process in mice. Future genetic analyses using the 3D assessment system will provide novel insights into the complex genetic basis for nesting and other behaviors in animals.
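The latency measures used above (time until the nest first reaches a given Deacon score) reduce to a first-crossing computation over the scored time series. The helper below is a hypothetical sketch of that reduction, not the authors' code; the sampling interval and score scale are assumptions.

```python
def latency_to_score(scores, threshold):
    """Return the index of the first observation at which the nest score
    reaches `threshold` (e.g. a Deacon score of 3, 4, or 5), or None if
    the nest never reaches it within the recording.

    `scores` is a time-ordered sequence of nest scores, one per sampled
    depth frame; the latency in real time is the index times the sampling
    interval."""
    for t, s in enumerate(scores):
        if s >= threshold:
            return t
    return None
```

With hourly scores, for example, a returned index of 3 for threshold 3 would mean the nest first reached Deacon score 3 after three hours.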
Affiliation(s)
- Tatsuhiko Goto, College of Agriculture, Ibaraki University, Ami, Ibaraki 300-0393, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami, Ibaraki 300-0393, Japan
- Tsuyoshi Okayama, College of Agriculture, Ibaraki University, Ami, Ibaraki 300-0393, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami, Ibaraki 300-0393, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu-city, Tokyo 183-8509, Japan
- Atsushi Toyoda, College of Agriculture, Ibaraki University, Ami, Ibaraki 300-0393, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami, Ibaraki 300-0393, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu-city, Tokyo 183-8509, Japan
38
Okayama T, Goto T, Toyoda A. Assessing nest-building behavior of mice using a 3D depth camera. J Neurosci Methods 2015; 251:151-7. PMID: 26051553. DOI: 10.1016/j.jneumeth.2015.05.019.
Abstract
We developed a novel method to evaluate the nest-building behavior of mice using an inexpensive depth camera. The depth camera clearly captured nest-building behavior. Using three-dimensional information from the depth camera, we obtained objective features for assessing nest-building behavior: "volume," "radius," and "mean height." The "volume" represents the change in volume of the nesting material, a pressed cotton square that a mouse shreds and untangles in order to build its nest; during the nest-building process, the total volume of the cotton fragments increases. The "radius" refers to the radius of the circle enclosing the fragments of cotton and describes the extent of nesting-material dispersion; it averaged approximately 60 mm when a nest was built. The "mean height" represents the change in the mean height of objects; if the nest walls were high, the "mean height" was also high. These features provided useful information for the assessment of nest-building behavior, comparable to conventional assessment methods. Moreover, using the novel method, we found that JF1 mice built nests with higher walls than B6 mice, and that B6 mice built nests faster than JF1 mice. Thus, our novel method can evaluate differences in nest-building behavior that cannot be detected or quantified by conventional methods. In future studies, we will evaluate the nest-building behaviors of genetically modified and several inbred mouse strains with various nesting materials.
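The three depth-derived features described above can be sketched directly from a single overhead depth frame. This is a minimal illustration under assumed conventions (heights in millimeters, square pixels, a known camera-to-floor distance); the function and parameter names are hypothetical and do not reproduce the authors' processing pipeline.

```python
import numpy as np

def nest_metrics(depth_mm, floor_mm, min_height_mm=5.0, px_mm=1.0):
    """Compute nest "volume", "mean height", and "radius" from one frame.

    depth_mm:  2-D array of overhead depth-camera readings (mm to surface)
    floor_mm:  distance from the camera to the empty cage floor (mm)
    min_height_mm: threshold separating nesting material from the floor
    px_mm:     side length of one pixel on the floor plane (mm)
    """
    height = floor_mm - np.asarray(depth_mm, float)  # material height map
    mask = height > min_height_mm                    # pixels holding material
    if not mask.any():
        return 0.0, 0.0, 0.0
    # "volume": integrated height of material over the occupied pixels.
    volume = float(height[mask].sum()) * px_mm * px_mm
    # "mean height": average material height; tall nest walls raise it.
    mean_height = float(height[mask].mean())
    # "radius": radius of the circle about the material centroid enclosing
    # all occupied pixels, i.e. how far the cotton fragments are dispersed.
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    radius = float(np.hypot(ys - cy, xs - cx).max()) * px_mm
    return volume, mean_height, radius
```

Tracking these three numbers frame by frame gives the time courses of shredding (volume), dispersion (radius), and wall building (mean height) that the abstract describes.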
Affiliation(s)
- Tsuyoshi Okayama, College of Agriculture, Ibaraki University, Ami 300-0393, Ibaraki, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu-city 183-8509, Tokyo, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami 300-0393, Ibaraki, Japan
- Tatsuhiko Goto, College of Agriculture, Ibaraki University, Ami 300-0393, Ibaraki, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami 300-0393, Ibaraki, Japan
- Atsushi Toyoda, College of Agriculture, Ibaraki University, Ami 300-0393, Ibaraki, Japan; United Graduate School of Agricultural Science, Tokyo University of Agriculture and Technology, Fuchu-city 183-8509, Tokyo, Japan; Ibaraki University Cooperation between Agriculture and Medical Science (IUCAM), Ami 300-0393, Ibaraki, Japan
39
Peters SM, Pothuizen HHJ, Spruijt BM. Ethological concepts enhance the translational value of animal models. Eur J Pharmacol 2015; 759:42-50. PMID: 25823814. DOI: 10.1016/j.ejphar.2015.03.043.
Abstract
The translational value of animal models is an issue of ongoing discussion. We argue that 'Refinement' of animal experiments is needed and can be achieved by adopting an ethological approach when setting up and conducting experiments. Ethology aims to assess the functional meaning of behavioral changes, due to experimental manipulation or treatment, in animal models. Although the use of ethological concepts is particularly important for studies involving the measurement of animal behavior (as is the case for most studies on neuro-psychiatric conditions), it will also substantially benefit other disciplines, such as those investigating the immune system or inflammatory response. An ethological approach also entails employing testing conditions that have biological relevance to the animal. Moreover, a more biologically relevant analysis of the data will help to clarify the functional meaning of the modeled readout (e.g., whether it is psychopathological or adaptive in nature). We advocate, for instance, that more behavioral studies should use group-housed animals, including the recording of their ultrasonic vocalizations, because (1) social behavior is an essential feature of animal models for human 'social' psychopathologies, such as autism and schizophrenia, and (2) social conditions are indispensable for appropriate behavioral studies in social species, such as the rat. Only by taking these elements into account can the validity of animal experiments, and thus the translational value of animal models, be enhanced.
Affiliation(s)
- Suzanne M Peters, Faculty of Science, Utrecht University, Padualaan 8, NL-3584 CH Utrecht, The Netherlands; Delta Phenomics B.V., Nistelrooisebaan 3, NL-5374 RE Schaijk, The Netherlands
- Helen H J Pothuizen, Delta Phenomics B.V., Nistelrooisebaan 3, NL-5374 RE Schaijk, The Netherlands
- Berry M Spruijt, Faculty of Science, Utrecht University, Padualaan 8, NL-3584 CH Utrecht, The Netherlands
40
Matsumoto J, Uehara T, Urakawa S, Takamura Y, Sumiyoshi T, Suzuki M, Ono T, Nishijo H. 3D video analysis of the novel object recognition test in rats. Behav Brain Res 2014; 272:16-24. PMID: 24991752. DOI: 10.1016/j.bbr.2014.06.047.
Abstract
The novel object recognition (NOR) test has been widely used to test memory function. We developed a 3D computerized video analysis system that estimates nose contact with an object in Long Evans rats to analyze object exploration during NOR tests. The results indicate that the 3D system reproducibly and accurately scores the NOR test. Furthermore, the 3D system captures the 3D trajectory of the nose during object exploration, enabling detailed analyses of spatiotemporal patterns of object exploration. The 3D trajectory analysis revealed a specific pattern of object exploration in the sample phase of the NOR test: normal rats first explored the lower parts of objects and then gradually explored the upper parts. Systemic injection of MK-801 suppressed changes in these exploration patterns. The results, along with those of previous studies, suggest that the changes in exploration patterns reflect neophobia to a novel object and/or a shift from spatial learning to object learning. These results demonstrate that the 3D tracking system is useful not only for detailed scoring of animal behaviors but also for investigating characteristic spatiotemporal patterns of object exploration. The system has the potential to facilitate future investigation of the neural mechanisms underlying object exploration that result from dynamic and complex brain activity.
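The lower-to-upper exploration pattern described above can be summarized, in toy form, as the fraction of nose samples below a height threshold on the object. The function, its threshold, and its inputs are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

def height_preference(nose_z, object_top_z, split=0.5):
    """Split object-exploration samples into lower- and upper-part visits.

    nose_z:       sequence of nose heights (same units as object_top_z),
                  sampled only while the nose is in contact with the object
    object_top_z: height of the top of the object
    split:        fraction of the object height dividing "lower" from "upper"

    Returns (lower_fraction, upper_fraction) of exploration samples.
    """
    z = np.asarray(nose_z, float)
    lower = float(np.mean(z < split * object_top_z))
    return lower, 1.0 - lower
```

Computed separately for early and late portions of the sample phase, such fractions would show the shift from lower-part to upper-part exploration that the 3D trajectory analysis revealed.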
Affiliation(s)
- Jumpei Matsumoto, System Emotional Science, University of Toyama, Toyama, Toyama, Japan
- Takashi Uehara, Department of Neuropsychiatry, Kanazawa Medical University, Uchinada-cho, Kahoku, Ishikawa, Japan
- Susumu Urakawa, System Emotional Science, University of Toyama, Toyama, Toyama, Japan
- Yusaku Takamura, System Emotional Science, University of Toyama, Toyama, Toyama, Japan
- Tomiki Sumiyoshi, Department of Clinical Research Promotion, National Center Hospital, National Center of Neurology and Psychiatry, Kodaira, Tokyo, Japan
- Michio Suzuki, Department of Neuropsychiatry, University of Toyama, Toyama, Toyama, Japan
- Taketoshi Ono, System Emotional Science, University of Toyama, Toyama, Toyama, Japan
- Hisao Nishijo, System Emotional Science, University of Toyama, Toyama, Toyama, Japan
41
Dell AI, Bender JA, Branson K, Couzin ID, de Polavieja GG, Noldus LPJJ, Pérez-Escudero A, Perona P, Straw AD, Wikelski M, Brose U. Automated image-based tracking and its application in ecology. Trends Ecol Evol 2014; 29:417-28. PMID: 24908439. DOI: 10.1016/j.tree.2014.05.004.
Abstract
The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offer opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling a better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.
Affiliation(s)
- Anthony I Dell, Systemic Conservation Biology, Department of Biology, Georg-August University Göttingen, Göttingen, Germany
- Kristin Branson, Howard Hughes Medical Institute, Janelia Farm Research Campus, Ashburn, VA, USA
- Iain D Couzin, Department of Ecology and Evolutionary Biology, Princeton University, Princeton, NJ, USA
- Lucas P J J Noldus, Noldus Information Technology BV, Nieuwe Kanaal 5, 6709 PA Wageningen, The Netherlands
- Pietro Perona, Computation and Neural Systems Program, California Institute of Technology, Pasadena, CA, USA
- Andrew D Straw, Research Institute of Molecular Pathology (IMP), Vienna, Austria
- Martin Wikelski, Max Planck Institute for Ornithology, Radolfzell, Germany; Biology Department, University of Konstanz, Konstanz, Germany
- Ulrich Brose, Systemic Conservation Biology, Department of Biology, Georg-August University Göttingen, Göttingen, Germany
42
Ballesta S, Reymond G, Pozzobon M, Duhamel JR. A real-time 3D video tracking system for monitoring primate groups. J Neurosci Methods 2014; 234:147-52. PMID: 24875622. DOI: 10.1016/j.jneumeth.2014.05.022.
Abstract
To date, assessing the solitary and social behaviors of laboratory primate colonies has relied on time-consuming manual scoring methods. Here, we describe a real-time multi-camera 3D tracking system developed to measure the behavior of socially housed primates. Animal positions are identified using non-invasive color markers such as plastic collars, which also makes it possible to track colored objects and measure their usage. Compared to traditional manual ethological scoring, we show that this system can reliably evaluate solitary behaviors (foraging, solitary resting, toy usage, locomotion) as well as spatial proximity with peers, which is considered a good proxy of social motivation. Compared to existing video-based commercial systems for measuring animal activity, this system offers many possibilities (real-time data, large volume coverage, multiple-animal tracking) at a lower hardware cost. Quantitative behavioral data on animal groups can now be obtained automatically over very long periods of time, opening new perspectives, in particular for studying the neuroethology of social behavior in primates.
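The color-marker localization step described above can be sketched as a per-camera color threshold followed by a centroid computation. A real system would typically threshold in HSV space and triangulate the per-camera centroids into 3D across cameras; treat this NumPy fragment, and all of its names, as an illustrative assumption rather than the authors' implementation.

```python
import numpy as np

def marker_centroid(rgb, lo, hi):
    """Locate one animal's collar marker in a single camera frame.

    rgb: HxWx3 image array; lo, hi: per-channel inclusive color bounds
    for the marker. Returns the (row, col) centroid of pixels whose
    color falls inside [lo, hi], or None if the marker is not visible.
    """
    mask = np.all((rgb >= lo) & (rgb <= hi), axis=2)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()
```

Running this per camera and per marker color, then intersecting the calibrated viewing rays through the centroids, yields the real-time 3D positions the system uses to score proximity and object usage.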
Affiliation(s)
- S Ballesta, Centre de Neuroscience Cognitive, Centre National de la Recherche Scientifique, 69675 Bron, France; Département de Biologie Humaine, Université Lyon 1, 69622 Villeurbanne, France
- G Reymond, Centre de Neuroscience Cognitive, Centre National de la Recherche Scientifique, 69675 Bron, France; Département de Biologie Humaine, Université Lyon 1, 69622 Villeurbanne, France
- M Pozzobon, Centre de Neuroscience Cognitive, Centre National de la Recherche Scientifique, 69675 Bron, France; Département de Biologie Humaine, Université Lyon 1, 69622 Villeurbanne, France
- J-R Duhamel, Centre de Neuroscience Cognitive, Centre National de la Recherche Scientifique, 69675 Bron, France; Département de Biologie Humaine, Université Lyon 1, 69622 Villeurbanne, France