1
Goodwin NL, Choong JJ, Hwang S, Pitts K, Bloom L, Islam A, Zhang YY, Szelenyi ER, Tong X, Newman EL, Miczek K, Wright HR, McLaughlin RJ, Norville ZC, Eshel N, Heshmati M, Nilsson SRO, Golden SA. Simple Behavioral Analysis (SimBA) as a platform for explainable machine learning in behavioral neuroscience. Nat Neurosci 2024; 27:1411-1424. [PMID: 38778146] [PMCID: PMC11268425] [DOI: 10.1038/s41593-024-01649-9]
Abstract
The study of complex behaviors is often challenging when using manual annotation due to the absence of quantifiable behavioral definitions and the subjective nature of behavioral annotation. Integration of supervised machine learning approaches mitigates some of these issues through the inclusion of accessible and explainable model interpretation. To decrease barriers to access, and with an emphasis on accessible model explainability, we developed the open-source Simple Behavioral Analysis (SimBA) platform for behavioral neuroscientists. SimBA introduces several machine learning interpretability tools, including SHapley Additive exPlanation (SHAP) scores, that aid in creating explainable and transparent behavioral classifiers. Here we show how the addition of explainability metrics allows for quantifiable comparisons of aggressive social behavior across research groups and species, reconceptualizing behavior as a sharable reagent and providing an open-source framework. We provide an open-source, graphical user interface (GUI)-driven, well-documented package to facilitate the movement toward improved automation and sharing of behavioral classification tools across laboratories.
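The SHAP scores used above assign each feature its average marginal contribution to a classifier's prediction. For a handful of features the values can be computed exactly by enumerating feature subsets; the sketch below is a toy illustration of that definition in plain Python (the toy model, baseline, and feature values are invented for the example and are independent of SimBA's implementation, which applies the `shap` library to its classifiers):

```python
from itertools import combinations
from math import factorial

# Exact Shapley values for a toy 3-feature "classifier".
# v(S) = model output when only the features in S are known
# (absent features are replaced by a baseline of 0).

def model(x):
    # invented score: two additive terms plus one interaction
    return 2.0 * x[0] + 1.0 * x[1] + 0.5 * x[0] * x[2]

x = [1.0, 1.0, 1.0]        # instance to explain
baseline = [0.0, 0.0, 0.0]

def v(S):
    z = [x[i] if i in S else baseline[i] for i in range(3)]
    return model(z)

def shapley(i, n=3):
    others = [j for j in range(n) if j != i]
    total = 0.0
    for k in range(n):
        for S in combinations(others, k):
            # weight of each subset in the Shapley formula
            wgt = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
            total += wgt * (v(set(S) | {i}) - v(set(S)))
    return total

phi = [round(shapley(i), 3) for i in range(3)]
print(phi)  # [2.25, 1.0, 0.25]: the interaction term is split evenly
```

Summed over features, the attributions recover the gap between the explained prediction and the baseline prediction, which is the property that makes SHAP-based classifiers comparable across laboratories.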
Affiliation(s)
- Nastacia L Goodwin
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Graduate Program in Neuroscience, University of Washington, Seattle, WA, USA
- Center of Excellence in Neurobiology of Addiction, Pain and Emotion (NAPE), University of Washington, Seattle, WA, USA
- Jia J Choong
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, USA
- Sophia Hwang
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Kayla Pitts
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Liana Bloom
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Aasiya Islam
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Yizhe Y Zhang
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Graduate Program in Neuroscience, University of Washington, Seattle, WA, USA
- Center of Excellence in Neurobiology of Addiction, Pain and Emotion (NAPE), University of Washington, Seattle, WA, USA
- Eric R Szelenyi
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Center of Excellence in Neurobiology of Addiction, Pain and Emotion (NAPE), University of Washington, Seattle, WA, USA
- Xiaoyu Tong
- New York University Neuroscience Institute, New York, NY, USA
- Emily L Newman
- Department of Psychiatry, Harvard Medical School McLean Hospital, Belmont, MA, USA
- Klaus Miczek
- Department of Psychology, Tufts University, Medford, MA, USA
- Hayden R Wright
- Department of Integrative Physiology and Neuroscience, Washington State University, Pullman, WA, USA
- Graduate Program in Neuroscience, Washington State University, Pullman, WA, USA
- Ryan J McLaughlin
- Department of Integrative Physiology and Neuroscience, Washington State University, Pullman, WA, USA
- Graduate Program in Neuroscience, Washington State University, Pullman, WA, USA
- Neir Eshel
- Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford, CA, USA
- Mitra Heshmati
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Graduate Program in Neuroscience, University of Washington, Seattle, WA, USA
- Center of Excellence in Neurobiology of Addiction, Pain and Emotion (NAPE), University of Washington, Seattle, WA, USA
- Department of Anesthesiology and Pain Medicine, University of Washington, Seattle, WA, USA
- Simon R O Nilsson
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Sam A Golden
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Graduate Program in Neuroscience, University of Washington, Seattle, WA, USA
- Center of Excellence in Neurobiology of Addiction, Pain and Emotion (NAPE), University of Washington, Seattle, WA, USA
2
Syeda A, Zhong L, Tung R, Long W, Pachitariu M, Stringer C. Facemap: a framework for modeling neural activity based on orofacial tracking. Nat Neurosci 2024; 27:187-195. [PMID: 37985801] [PMCID: PMC10774130] [DOI: 10.1038/s41593-023-01490-6]
Abstract
Recent studies in mice have shown that orofacial behaviors drive a large fraction of neural activity across the brain. To understand the nature and function of these signals, we need better computational models to characterize the behaviors and relate them to neural activity. Here we developed Facemap, a framework consisting of a keypoint tracker and a deep neural network encoder for predicting neural activity. Our algorithm for tracking mouse orofacial behaviors was more accurate than existing pose estimation tools, while the processing speed was several times faster, making it a powerful tool for real-time experimental interventions. The Facemap tracker was easy to adapt to data from new labs, requiring as few as 10 annotated frames for near-optimal performance. We used the keypoints as inputs to a deep neural network that predicts the activity of ~50,000 simultaneously recorded neurons, and in visual cortex we doubled the amount of explained variance compared to previous methods. Using this model, we found that the neuronal activity clusters that were well predicted from behavior were more spatially spread out across cortex. We also found that the deep behavioral features from the model had stereotypical, sequential dynamics that were not reversible in time. In summary, Facemap provides a stepping stone toward understanding the function of brain-wide neural signals and their relation to behavior.
Affiliation(s)
- Atika Syeda
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Lin Zhong
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Renee Tung
- HHMI Janelia Research Campus, Ashburn, VA, USA
- Will Long
- HHMI Janelia Research Campus, Ashburn, VA, USA
3
Yadav RSP, Ansari F, Bera N, Kent C, Agrawal P. Lessons from lonely flies: Molecular and neuronal mechanisms underlying social isolation. Neurosci Biobehav Rev 2024; 156:105504. [PMID: 38061597] [DOI: 10.1016/j.neubiorev.2023.105504]
Abstract
Animals respond to changes in the environment which affect their internal state by adapting their behaviors. Social isolation is a form of passive environmental stressor that alters behaviors across the animal kingdom, including in humans, rodents, and fruit flies. Social isolation is known to increase violence, disrupt sleep, and increase depression, leading to poor mental and physical health. Recent evidence from several model organisms suggests that social isolation leads to remodeling of the transcriptional and epigenetic landscape, which alters behavioral outcomes. In this review, we explore how manipulating the social experience of the fruit fly Drosophila melanogaster can shed light on the molecular and neuronal mechanisms underlying isolation-driven behaviors. We discuss the recent advances made using the powerful genetic toolkit and behavioral assays in Drosophila to uncover the roles of neuromodulators, sensory modalities, pheromones, neuronal circuits, and molecular mechanisms in mediating social isolation. The insights gained from these studies could be crucial for developing effective therapeutic interventions in the future.
Affiliation(s)
- R Sai Prathap Yadav
- Centre for Molecular Neurosciences, Kasturba Medical College, Manipal Academy of Higher Education, Karnataka 576104, India
- Faizah Ansari
- Centre for Molecular Neurosciences, Kasturba Medical College, Manipal Academy of Higher Education, Karnataka 576104, India
- Neha Bera
- Centre for Molecular Neurosciences, Kasturba Medical College, Manipal Academy of Higher Education, Karnataka 576104, India
- Clement Kent
- Department of Biology, York University, Toronto, ON M3J 1P3, Canada
- Pavan Agrawal
- Centre for Molecular Neurosciences, Kasturba Medical College, Manipal Academy of Higher Education, Karnataka 576104, India
4
Nuñez KM, Catalano JL, Scaplen KM, Kaun KR. Ethanol Behavioral Responses in Drosophila. Cold Spring Harb Protoc 2023; 2023:719-24. [PMID: 37019606] [PMCID: PMC10551053] [DOI: 10.1101/pdb.top107887]
Abstract
Drosophila melanogaster is a powerful genetic model for investigating the mechanisms underlying ethanol-induced behaviors, metabolism, and preference. Ethanol-induced locomotor activity, characterized by hyperlocomotion and subsequent sedation as exposure duration or concentration increases, is especially useful for understanding how ethanol acutely affects the brain and behavior. Locomotor activity is an efficient, easy, robust, and reproducible behavioral screening tool for identifying underlying genes and neuronal circuits as well as investigating genetic and molecular pathways. Here we present a detailed protocol for investigating how volatilized ethanol affects locomotor activity using the fly Group Activity Monitor (flyGrAM), covering installation, implementation, data collection, and subsequent data-analysis methods for volatilized stimuli. We also describe a procedure for optogenetically probing neuronal activity to identify the neural mechanisms underlying locomotor activity.
Affiliation(s)
- Kavin M Nuñez
- Molecular Pharmacology and Physiology Graduate Program, Brown University, Providence, Rhode Island 02912, USA
- Jamie L Catalano
- Molecular Pharmacology and Physiology Graduate Program, Brown University, Providence, Rhode Island 02912, USA
- Kristin M Scaplen
- Department of Psychology, Bryant University, Smithfield, Rhode Island 02917, USA
- Center for Health and Behavioral Sciences, Bryant University, Smithfield, Rhode Island 02917, USA
- Department of Neuroscience, Brown University, Providence, Rhode Island 02912, USA
- Karla R Kaun
- Department of Neuroscience, Brown University, Providence, Rhode Island 02912, USA
5
Nuñez KM, Catalano JL, Scaplen KM, Kaun KR. Methods for Exploring the Circuit Basis of Ethanol-Induced Changes in Drosophila Group Locomotor Activity. Cold Spring Harb Protoc 2023; 2023:108138. [PMID: 37019608] [PMCID: PMC10551048] [DOI: 10.1101/pdb.prot108138]
Abstract
Locomotion is a behavioral readout that can be used to understand responses to specific stimuli or perturbations. The fly Group Activity Monitor (flyGrAM) provides a high-throughput, high-content readout of the acute stimulatory and sedative effects of ethanol. The flyGrAM system is adaptable: it seamlessly incorporates thermogenetic or optogenetic stimulation to dissect the neural circuits underlying behavior, and it can test responses to other volatilized stimuli (humidified air, odorants, anesthetics, vaporized drugs of abuse, etc.). The automated quantification and readout of activity provide users with a real-time representation of the group activity within each chamber throughout the experiment, helping users quickly determine proper ethanol doses and durations, run behavioral screens, and plan follow-up experiments.
Affiliation(s)
- Kavin M Nuñez
- Molecular Pharmacology and Physiology Graduate Program, Brown University, Providence, Rhode Island 02912, USA
- Jamie L Catalano
- Molecular Pharmacology and Physiology Graduate Program, Brown University, Providence, Rhode Island 02912, USA
- Kristin M Scaplen
- Department of Psychology, Bryant University, Smithfield, Rhode Island 02917, USA
- Center for Health and Behavioral Sciences, Bryant University, Smithfield, Rhode Island 02917, USA
- Department of Neuroscience, Brown University, Providence, Rhode Island 02912, USA
- Karla R Kaun
- Department of Neuroscience, Brown University, Providence, Rhode Island 02912, USA
6
Stiemer LN, Thoma A, Braun C. MBT3D: Deep learning based multi-object tracker for bumblebee 3D flight path estimation. PLoS One 2023; 18:e0291415. [PMID: 37738269] [PMCID: PMC10516433] [DOI: 10.1371/journal.pone.0291415]
Abstract
This work presents the Multi-Bees-Tracker (MBT3D) algorithm, a Python framework implementing a deep association tracker for tracking-by-detection, to address the challenging task of tracking the flight paths of bumblebees in a social group. Existing bumblebee tracking algorithms often come with restrictive requirements, such as sufficient lighting, high contrast between the animal and the background, absence of occlusion, or significant user input. Tracking bumblebees in a social group is challenging: they adjust their movements abruptly, change appearance across wing-beat states, and exhibit significant similarities in their individual appearance. The MBT3D tracker, developed in this research, adapts an existing ant tracking algorithm to bumblebee tracking. It incorporates an offline-trained appearance descriptor along with a Kalman filter for appearance and motion matching. Different detector architectures for upstream detections (You Only Look Once (YOLOv5), Faster Region Proposal Convolutional Neural Network (Faster R-CNN), and RetinaNet) were investigated in a comparative study to optimize performance. The detection models were trained on a dataset containing 11,359 labeled bumblebee images. On the bumblebee validation dataset, which consists of 1,323 labeled bumblebee images, YOLOv5 reaches an average precision of AP = 53.8%, Faster R-CNN achieves AP = 45.3%, and RetinaNet AP = 38.4%. The tracker's appearance model is trained on 144 samples. The tracker (with Faster R-CNN detections) reaches a Multiple Object Tracking Accuracy of MOTA = 93.5% and a Multiple Object Tracking Precision of MOTP = 75.6% on a validation dataset containing 2,000 images, competing with state-of-the-art computer vision methods. The framework reliably tracks different bumblebees in the same video stream with rarely occurring identity switches (IDS): MBT3D has far fewer IDS than other commonly used algorithms and one of the lowest false-positive rates, competing with state-of-the-art animal tracking algorithms. The developed framework reconstructs the three-dimensional (3D) flight paths of the bumblebees by triangulation, and it can handle and compare two alternative stereo camera pairs if desired.
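The motion-matching half of the appearance-plus-motion association described above is conventionally a constant-velocity Kalman filter. The sketch below is a generic NumPy illustration of that standard building block, with invented noise settings, not MBT3D's code:

```python
import numpy as np

# Minimal constant-velocity Kalman filter for one 2D track,
# the usual motion model in tracking-by-detection.
# State x = [px, py, vx, vy]; only the position [px, py] is observed.

dt = 1.0  # one frame
F = np.array([[1, 0, dt, 0],   # state transition (constant velocity)
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],    # measurement model: position only
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)           # process noise (invented tuning)
R = 0.5 * np.eye(2)            # measurement noise (invented tuning)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    y = z - H @ x                    # innovation: detection minus prediction
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

# Track a simulated bee moving +1 px/frame in x with noisy detections.
x, P = np.zeros(4), np.eye(4)
rng = np.random.default_rng(0)
for t in range(1, 21):
    x, P = predict(x, P)
    z = np.array([t, 0.0]) + rng.normal(0, 0.3, 2)  # noisy detection
    x, P = update(x, P, z)

print(np.round(x, 2))  # estimated [px, py, vx, vy]; vx settles near 1
```

In a full tracker, the innovation (or its Mahalanobis distance under S) is combined with the appearance-descriptor distance to score each detection-to-track assignment.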
Affiliation(s)
- Luc Nicolas Stiemer
- Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
- Andreas Thoma
- Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
- Department of Aerospace Engineering, RMIT University, Melbourne, Victoria, Australia
- Carsten Braun
- Department of Aerospace Engineering, FH Aachen, Aachen, North Rhine-Westphalia, Germany
7
Zhou KC, Harfouche M, Cooke CL, Park J, Konda PC, Kreiss L, Kim K, Jönsson J, Doman T, Reamey P, Saliu V, Cook CB, Zheng M, Bechtel JP, Bègue A, McCarroll M, Bagwell J, Horstmeyer G, Bagnat M, Horstmeyer R. Parallelized computational 3D video microscopy of freely moving organisms at multiple gigapixels per second. Nat Photonics 2023; 17:442-450. [PMID: 37808252] [PMCID: PMC10552607] [DOI: 10.1038/s41566-023-01171-7]
Abstract
Wide field of view microscopy that can resolve 3D information at high speed and spatial resolution is highly desirable for studying the behaviour of freely moving model organisms. However, it is challenging to design an optical instrument that optimises all these properties simultaneously. Existing techniques typically require the acquisition of sequential image snapshots to observe large areas or measure 3D information, thus compromising on speed and throughput. Here, we present 3D-RAPID, a computational microscope based on a synchronized array of 54 cameras that can capture high-speed 3D topographic videos over an area of 135 cm², achieving up to 230 frames per second at spatiotemporal throughputs exceeding 5 gigapixels per second. 3D-RAPID employs a 3D reconstruction algorithm that, for each synchronized snapshot, fuses all 54 images into a composite that includes a co-registered 3D height map. The self-supervised 3D reconstruction algorithm trains a neural network to map raw photometric images to 3D topography using stereo overlap redundancy and ray-propagation physics as the only supervision mechanism. The resulting reconstruction process is thus robust to generalization errors and scales to arbitrarily long videos from arbitrarily sized camera arrays. We demonstrate the broad applicability of 3D-RAPID with collections of several freely behaving organisms, including ants, fruit flies, and zebrafish larvae.
Affiliation(s)
- Kevin C. Zhou
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Current affiliation: Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, CA, USA
- Mark Harfouche
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Colin L. Cooke
- Department of Electrical and Computer Engineering, Duke University, Durham, NC 27708, USA
- Jaehee Park
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Pavan C. Konda
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Lucas Kreiss
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Kanghyun Kim
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Joakim Jönsson
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Thomas Doman
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Paul Reamey
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Veton Saliu
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Clare B. Cook
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Maxwell Zheng
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Aurélien Bègue
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Matthew McCarroll
- Department of Pharmaceutical Chemistry, University of California, San Francisco, CA, USA
- Jennifer Bagwell
- Department of Cell Biology, Duke University, Durham, NC 27710, USA
- Michel Bagnat
- Department of Cell Biology, Duke University, Durham, NC 27710, USA
- Roarke Horstmeyer
- Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
- Ramona Optics Inc., 1000 W Main St., Durham, NC 27701, USA
- Department of Electrical and Computer Engineering, Duke University, Durham, NC 27708, USA
8
Burte V, Cointe M, Perez G, Mailleret L, Calcagno V. When complex movement yields simple dispersal: behavioural heterogeneity, spatial spread and parasitism in groups of micro-wasps. Mov Ecol 2023; 11:13. [PMID: 36859387] [PMCID: PMC9976481] [DOI: 10.1186/s40462-023-00371-8]
Abstract
BACKGROUND: Understanding how behavioural dynamics, inter-individual variability and individual interactions scale up to shape the spatial spread and dispersal of animal populations is a major challenge in ecology. For biocontrol agents such as the microscopic Trichogramma parasitic wasps, an understanding of movement strategies is also critical to predict pest-suppression performance in the field.
METHODS: We experimentally studied the spatial propagation of groups of parasitoids and their patterns of parasitism. We investigated whether population spread is density-dependent, how it is affected by the presence of hosts, and whether the spatial distribution of parasitism (dispersal kernel) can be predicted from the observed spread of individuals. Using a novel experimental device and high-throughput imaging techniques, we continuously tracked the spatial spread of groups of parasitoids over large temporal and spatial scales (8 h and 6 m, ca. 12,000 body lengths). We could thus study how population density, the presence of hosts and their spatial distribution impacted the rate of population spread, the spatial distribution of individuals during population expansion, the overall rate of parasitism and the dispersal kernel (position of parasitism events).
RESULTS: Higher population density accelerated population spread, but only transiently: the rate of spread reverted to low values after 4 h, in a "tortoise-hare" effect. Interestingly, the presence of hosts suppressed this transiency and permitted a sustained high rate of population spread. Importantly, we found that population spread did not obey classical diffusion but involved dynamical switches between resident and explorer movement modes. Population distribution was therefore not Gaussian, though surprisingly the distribution of parasitism (dispersal kernel) was.
CONCLUSIONS: Even homogeneous asexual groups of insects develop behavioural heterogeneities over a few hours, and these heterogeneities control patterns of population spread. Behavioural switching between resident and explorer states determined population distribution, density dependence and dispersal. The simple Gaussian dispersal kernel did not reflect classical diffusion but rather the interplay of several non-linearities at the individual level. These results highlight the need to take behaviour and inter-individual heterogeneity into account to understand population spread in animals.
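A Gaussian dispersal kernel like the one reported above can be fit directly to observed parasitism positions: the maximum-likelihood parameters are simply the sample mean and standard deviation of the distances from the release point. The sketch below uses simulated positions and is a generic illustration, not the authors' analysis:

```python
import math
import random

# Fit a 1D Gaussian dispersal kernel to parasitism positions.
random.seed(42)
positions = [random.gauss(0.0, 1.2) for _ in range(5000)]  # simulated events (m)

n = len(positions)
mu = sum(positions) / n                                      # ML mean
sigma = math.sqrt(sum((x - mu) ** 2 for x in positions) / n) # ML std dev

def gaussian_kernel(x, mu, sigma):
    """Probability density of a parasitism event at position x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

print(round(mu, 2), round(sigma, 2))  # recovers roughly 0 and 1.2
```

Comparing the fitted kernel against the empirical histogram of parasitism positions is how one would check whether the Gaussian shape holds despite the non-diffusive movement underneath it.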
Affiliation(s)
- Victor Burte
- Université Côte d'Azur, INRAE, CNRS, Institut Sophia Agrobiotech, Sophia Antipolis, France
- Melina Cointe
- Université Côte d'Azur, INRAE, CNRS, Institut Sophia Agrobiotech, Sophia Antipolis, France
- Guy Perez
- Université Côte d'Azur, INRAE, CNRS, Institut Sophia Agrobiotech, Sophia Antipolis, France
- Ludovic Mailleret
- Université Côte d'Azur, INRAE, CNRS, Institut Sophia Agrobiotech, Sophia Antipolis, France
- Université Côte d'Azur, Inria, INRAE, CNRS, Sorbonne Université, Biocore, Sophia Antipolis, France
- Vincent Calcagno
- Université Côte d'Azur, INRAE, CNRS, Institut Sophia Agrobiotech, Sophia Antipolis, France
9
Thurley K. Naturalistic neuroscience and virtual reality. Front Syst Neurosci 2022; 16:896251. [PMID: 36467978] [PMCID: PMC9712202] [DOI: 10.3389/fnsys.2022.896251]
Abstract
Virtual reality (VR) is one of the techniques that has become particularly popular in neuroscience over the past few decades. VR experiments feature a closed loop between sensory stimulation and behavior: participants interact with the stimuli rather than just passively perceiving them. Several senses can be stimulated at once, and large-scale environments as well as social interactions can be simulated. All of this makes VR experiences more natural than those in traditional lab paradigms. Compared to the situation in field research, a VR simulation is highly controllable and reproducible, as required of a laboratory technique used in the search for neural correlates of perception and behavior. VR is therefore considered a middle ground between ecological validity and experimental control. In this review, I explore the potential of VR for eliciting naturalistic perception and behavior in humans and non-human animals, and give an overview of recent virtual reality approaches used in neuroscientific research.
Affiliation(s)
- Kay Thurley
- Faculty of Biology, Ludwig-Maximilians-Universität München, Munich, Germany
- Bernstein Center for Computational Neuroscience Munich, Munich, Germany
10
Bresolin T, Ferreira R, Reyes F, Van Os J, Dórea J. Assessing optimal frequency for image acquisition in computer vision systems developed to monitor feeding behavior of group-housed Holstein heifers. J Dairy Sci 2022; 106:664-675. [DOI: 10.3168/jds.2022-22138]
11
Application of Challenging Learning Based on Human-Computer Interaction under Machine Vision in Vocational Undergraduate Colleges. Comput Intell Neurosci 2022; 2022:4667387. [PMID: 36268158] [PMCID: PMC9578859] [DOI: 10.1155/2022/4667387]
Abstract
As science and technology have progressed in recent years and talent education has deepened, a challenging learning method based on human-computer interaction has gradually emerged. Human-Computer Interaction (HCI) is the communication and interaction between humans and machines. This essay aims to apply challenging learning combined with HCI in vocational undergraduate colleges. The GMM (Gaussian mixture model) algorithm, commonly used in image and speech recognition, is proposed in this essay to recognize students' actions; the effect of HCI is achieved by feeding the recognized actions back to the system. This essay selects 200 students from a vocational undergraduate college for challenging learning (a comprehensive teaching method that aims at students' autonomous learning by stimulating their interest in learning). The challenging learning designed in this essay spans 15 weeks; the task chain contains a total of 196 tasks, and the learning time is 138 h. This essay analyzes the application effects for liberal arts and science students, male and female students, and different grades. The results show that the overall average completion rates of learning tasks for freshman, sophomore, and junior students are about 70%, 75%, and 85%, respectively, and the overall average scores for challenging learning are about 70, 78, and 83. The overall completion rates of weekly tasks for boys and girls are about 68% and 70%, with overall average scores of about 70 points and 75 points. The weekly task completion rate of liberal arts students is generally above 75%, with an overall average score of about 70 points; the overall completion rate of science students is below 75%, while most of their learning scores are higher than 75 points. In addition, the average accuracy of the GMM algorithm is 90% for face recognition and 87% for gesture recognition. The average frequency of students using HCI is about 320 times a day, and the average score for students' experience of HCI is about 80 points. It may be stated that the challenging HCI-based learning strategy proposed in this study worked well and achieved satisfactory learning results when applied in vocational undergraduate colleges.
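A Gaussian mixture model of the kind mentioned above can be fit with a few lines of expectation-maximization. The sketch below recovers two synthetic one-dimensional clusters and is illustrative only; the essay does not specify its features or component count:

```python
import math
import random

# Minimal EM fit of a two-component 1D Gaussian mixture.
# In action recognition, each component would model one cluster of
# feature values; here we recover two synthetic clusters.

def pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def fit_gmm(data, iters=50):
    mu = [min(data), max(data)]   # crude initialisation at the extremes
    sigma = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] * pdf(x, mu[k], sigma[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(max(var, 1e-6))
    return w, mu, sigma

random.seed(1)
data = [random.gauss(-2, 0.5) for _ in range(300)] + \
       [random.gauss(3, 0.8) for _ in range(300)]
w, mu, sigma = fit_gmm(data)
print([round(m, 1) for m in sorted(mu)])  # means near -2 and 3
```

A recognition pipeline would fit one such mixture per action class and assign a new observation to the class with the highest likelihood.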
12
Jiang K, Xia Q, Ma L, Xu X, Shao Z. Insulator fault feature extraction system of substation equipment based on machine vision. IET Netw 2022. [DOI: 10.1049/ntw2.12058]
Affiliation(s)
- Keruo Jiang
- Ningbo Power Supply Company of State Grid Zhejiang Electric Power Co., Ltd, Ningbo, Zhejiang, China
- Qiaoqun Xia
- Ningbo Power Supply Company of State Grid Zhejiang Electric Power Co., Ltd, Ningbo, Zhejiang, China
- Lijun Ma
- Ningbo Power Supply Company of State Grid Zhejiang Electric Power Co., Ltd, Ningbo, Zhejiang, China
- Xin Xu
- Ningbo Power Supply Company of State Grid Zhejiang Electric Power Co., Ltd, Ningbo, Zhejiang, China
- Zhipeng Shao
- Ningbo Power Supply Company of State Grid Zhejiang Electric Power Co., Ltd, Ningbo, Zhejiang, China
13
Vagvolgyi BP, Jayakumar RP, Madhav MS, Knierim JJ, Cowan NJ. Wide-angle, monocular head tracking using passive markers. J Neurosci Methods 2022; 368:109453. [PMID: 34968626] [PMCID: PMC8857048] [DOI: 10.1016/j.jneumeth.2021.109453]
Abstract
BACKGROUND: Camera images can encode large amounts of visual information about an animal and its environment, enabling high-fidelity 3D reconstruction of both using computer vision methods. Most systems, whether markerless (e.g. deep learning based) or marker-based, require multiple cameras to track features across multiple points of view to enable such 3D reconstruction. However, such systems can be expensive and are challenging to set up in small animal research apparatuses.
NEW METHODS: We present an open-source, marker-based system for tracking the head of a rodent for behavioral research that requires only a single camera with a potentially wide field of view. The system features a lightweight visual target and computer vision algorithms that together enable high-accuracy tracking of the six-degree-of-freedom position and orientation of the animal's head. The system, which only requires a single camera positioned above the behavioral arena, robustly reconstructs the pose over a wide range of head angles (360° in yaw, and approximately ±120° in roll and pitch).
RESULTS: Experiments with live animals demonstrate that the system can reliably identify rat head position and orientation. Evaluations using a commercial optical tracker device show that the system achieves accuracy that rivals commercial multi-camera systems.
COMPARISON WITH EXISTING METHODS: Our solution significantly improves upon existing monocular marker-based tracking methods, both in accuracy and in allowable range of motion.
CONCLUSIONS: The proposed system enables the study of complex behaviors by providing robust, fine-scale measurements of rodent head motions over a wide range of orientations.
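Once 3D correspondences between the known marker geometry and its reconstructed points are available, recovering a six-degree-of-freedom pose reduces to a rigid alignment problem that the Kabsch algorithm solves in closed form. The sketch below is a generic NumPy illustration of that building block, not the paper's monocular pipeline (which must first recover the marker points from a single wide-angle view):

```python
import numpy as np

# Estimate a rigid 6-DoF pose (rotation R, translation t) from matched
# 3D marker points via the Kabsch algorithm.

def kabsch(A, B):
    """Find R, t minimising ||R @ A + t - B|| for 3xN point sets."""
    ca, cb = A.mean(axis=1, keepdims=True), B.mean(axis=1, keepdims=True)
    H = (A - ca) @ (B - cb).T              # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Synthetic check: rotate a 6-point marker target 30° about z and shift it.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([[0.5], [-1.0], [2.0]])
A = np.random.default_rng(0).normal(size=(3, 6))  # marker points (head frame)
B = R_true @ A + t_true                           # observed points (arena frame)

R, t = kabsch(A, B)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

With noisy reconstructions the same least-squares solution still applies; the residual then measures tracking error rather than vanishing exactly.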
Affiliation(s)
- Balazs P. Vagvolgyi (corresponding author)
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Ravikrishnan P. Jayakumar
- Mind/Brain Institute; Laboratory for Computational Sensing and Robotics; Mechanical Engineering Department, Johns Hopkins University, Baltimore, MD, USA
- Manu S. Madhav
- Mind/Brain Institute; Kavli Neuroscience Discovery Institute; Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA; School of Biomedical Engineering, Djawad Mowafaghian Centre for Brain Health, University of British Columbia, BC, Canada
- James J. Knierim
- Mind/Brain Institute; Kavli Neuroscience Discovery Institute, Johns Hopkins University, Baltimore, MD, USA
- Noah J. Cowan
- Laboratory for Computational Sensing and Robotics; Mechanical Engineering Department, Johns Hopkins University, Baltimore, MD, USA
14
Rodriguez IF, Chan J, Alvarez Rios M, Branson K, Agosto-Rivera JL, Giray T, Mégret R. Automated Video Monitoring of Unmarked and Marked Honey Bees at the Hive Entrance. Frontiers in Computer Science 2022. DOI: 10.3389/fcomp.2021.769338.
Abstract
We present a novel system for the automatic video monitoring of honey bee foraging activity at the hive entrance. This monitoring system is built upon convolutional neural networks that perform multiple-animal pose estimation without the need for marking. This precise detection of honey bee body parts is a key element of the system, enabling the detection of entrance and exit events at the hive, including accurate pollen detection. A detailed evaluation of the quality of the detection and a study of the effect of the parameters are presented. The complete system also integrates identification of barcode-marked bees, which enables monitoring at both aggregate and individual levels. The results obtained on multiple days of video recordings show the applicability of the approach for large-scale deployment. This is an important step forward for the understanding of complex behaviors exhibited by honey bees and the automatic assessment of colony health.
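The entrance/exit event logic built on top of such pose tracks can be illustrated with a simple threshold-crossing sketch (hypothetical coordinates and threshold; the published system derives events from CNN pose estimates, not this heuristic):

```python
def entrance_events(y_positions, threshold):
    """Classify line-crossing events from a track of y coordinates.
    A drop below `threshold` counts as an entry, a rise above it as an exit."""
    events = []
    for prev, cur in zip(y_positions, y_positions[1:]):
        if prev >= threshold > cur:
            events.append("entry")
        elif prev < threshold <= cur:
            events.append("exit")
    return events
```

Aggregating such per-track events over a day of video yields the colony-level foraging activity counts the paper reports.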
15
Research on computer vision enhancement in intelligent robot based on machine learning and deep learning. Neural Comput Appl 2022. DOI: 10.1007/s00521-021-05898-8.
16
Jezovit JA, Alwash N, Levine JD. Using Flies to Understand Social Networks. Front Neural Circuits 2021; 15:755093. PMID: 34924963; PMCID: PMC8683092; DOI: 10.3389/fncir.2021.755093.
Abstract
Many animals live in groups and interact with each other, creating an organized collective structure. Social network analysis (SNA) is a statistical tool that aids in revealing and understanding the organized patterns of shared social connections between individuals in groups. Surprisingly, the application of SNA revealed that Drosophila melanogaster, previously considered a solitary organism, displays group dynamics and that the structure of group life is inherited. Although the number of studies investigating Drosophila social networks is currently limited, they address a wide array of questions that have only begun to capture the details of group level behavior in this insect. Here, we aim to review these studies, comparing their respective scopes and the methods used, to draw parallels between them and the broader body of knowledge available. For example, we highlight how despite methodological differences, there are similarities across studies investigating the effects of social isolation on social network dynamics. Finally, this review aims to generate hypotheses and predictions that inspire future research in the emerging field of Drosophila social networks.
Affiliation(s)
- Jacob A Jezovit
- Department of Cell and Systems Biology, University of Toronto Mississauga, Mississauga, ON, Canada
- Nawar Alwash
- Department of Cell and Systems Biology, University of Toronto Mississauga, Mississauga, ON, Canada
- Joel D Levine
- Department of Cell and Systems Biology, University of Toronto Mississauga, Mississauga, ON, Canada; Department of Cell and Systems Biology, University of Toronto, Toronto, ON, Canada; International Research Centre for Neurointelligence, University of Tokyo, Tokyo, Japan
17
Auer TO, Shahandeh MP, Benton R. Drosophila sechellia: A Genetic Model for Behavioral Evolution and Neuroecology. Annu Rev Genet 2021; 55:527-554. PMID: 34530638; DOI: 10.1146/annurev-genet-071719-020719.
Abstract
Defining the mechanisms by which animals adapt to their ecological niche is an important problem bridging evolution, genetics, and neurobiology. We review the establishment of a powerful genetic model for comparative behavioral analysis and neuroecology, Drosophila sechellia. This island-endemic fly species is closely related to several cosmopolitan generalists, including Drosophila melanogaster, but has evolved extreme specialism, feeding and reproducing exclusively on the noni fruit of the tropical shrub Morinda citrifolia. We first describe the development and use of genetic approaches to facilitate genotype/phenotype associations in these drosophilids. Next, we survey the behavioral, physiological, and morphological adaptations of D. sechellia throughout its life cycle and outline our current understanding of the genetic and cellular basis of these traits. Finally, we discuss the principles this knowledge begins to establish in the context of host specialization, speciation, and the neurobiology of behavioral evolution and consider open questions and challenges in the field.
Affiliation(s)
- Thomas O Auer
- Center for Integrative Genomics, Faculty of Biology and Medicine, University of Lausanne, CH-1015 Lausanne, Switzerland
- Michael P Shahandeh
- Center for Integrative Genomics, Faculty of Biology and Medicine, University of Lausanne, CH-1015 Lausanne, Switzerland
- Richard Benton
- Center for Integrative Genomics, Faculty of Biology and Medicine, University of Lausanne, CH-1015 Lausanne, Switzerland
18
From human wellbeing to animal welfare. Neurosci Biobehav Rev 2021; 131:941-952. PMID: 34509514; DOI: 10.1016/j.neubiorev.2021.09.014.
Abstract
What does it mean to be "well" and how might such a state be cultivated? When we speak of wellbeing, it is of ourselves and fellow humans. When it comes to nonhuman animals, consideration turns to welfare. My aim herein is to suggest that theoretical approaches to human wellbeing might be beneficially applied to consideration of animal welfare, and in so doing, introduce new lines of inquiry and practice. I will review current approaches to human wellbeing, adopting a triarchic structure that delineates hedonic wellbeing, eudaimonic wellbeing, and social wellbeing. For each, I present a conceptual definition and a review of how researchers have endeavored to measure the construct. Drawing these three domains of research together, I highlight how these traditionally anthropocentric lines of inquiry might be extended to the question of animal welfare - namely by considering hedonic welfare, eudaimonic welfare, and social welfare as potentially distinguishable and complementary components of the broader construct of animal welfare.
19
A review of 28 free animal-tracking software applications: current features and limitations. Lab Anim (NY) 2021; 50:246-254. PMID: 34326537; DOI: 10.1038/s41684-021-00811-1.
Abstract
Well-quantified laboratory studies can provide a fundamental understanding of animal behavior in ecology, ethology and ecotoxicology research. These types of studies require observation and tracking of each animal in well-controlled and defined arenas, often for long timescales. Thus, these experiments produce long time series and a vast amount of data that require the use of software applications to automate the analysis and reduce manual annotation. In this review, we examine 28 free software applications for animal tracking to guide researchers in selecting the software that might best suit a particular experiment. We also review the algorithms in the tracking pipeline of the applications, explain how specific techniques can fit different experiments, and finally, expose each approach's weaknesses and strengths. Our in-depth review includes last update, type of platform, user-friendliness, off- or online video acquisition, calibration method, background subtraction and segmentation method, species, multiple arenas, multiple animals, identity preservation, manual identity correction, data analysis and extra features. We found, for example, that out of 28 programs, only 3 include a calibration algorithm to reduce image distortion and perspective problems that affect accuracy and can result in substantial errors when analyzing trajectories and extracting mobility or explored distance. In addition, only 4 programs can directly export in-depth tracking and analysis metrics, only 5 are suited for tracking multiple unmarked animals for more than a few seconds and only 11 have been updated in the period 2019-2021.
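The calibration step the review singles out reduces, at its simplest, to a pixel-to-world scale factor estimated from a reference object of known size, which is then applied to tracked trajectories when computing explored distance. A minimal sketch (hypothetical helper names; this ignores the lens-distortion and perspective corrections that the reviewed calibration algorithms also perform):

```python
import math

def calibrate(reference_px, reference_cm):
    """Scale factor (cm per pixel) from a reference object of known size."""
    return reference_cm / reference_px

def explored_distance_cm(track_px, cm_per_px):
    """Total path length of a pixel-coordinate trajectory, in cm."""
    return sum(
        math.hypot(x2 - x1, y2 - y1) * cm_per_px
        for (x1, y1), (x2, y2) in zip(track_px, track_px[1:])
    )
```

Without such a calibration, distances are reported in pixels, and distortion near the arena edges silently biases mobility metrics, which is exactly the error source the review warns about.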
20
Jiang Z, Zhou F, Zhao A, Li X, Li L, Tao D, Li X, Zhou H. Multi-View Mouse Social Behaviour Recognition With Deep Graphic Model. IEEE Trans Image Process 2021; 30:5490-5504. PMID: 34048344; DOI: 10.1109/tip.2021.3083079.
Abstract
Home-cage social behaviour analysis of mice is an invaluable tool to assess therapeutic efficacy in neurodegenerative diseases. Despite tremendous efforts made within the research community, single-camera video recordings are mainly used for such analysis. Because of their potential to create rich descriptions of mouse social behaviours, multi-view video recordings for rodent observations are receiving increasing attention. However, identifying social behaviours from various views is still challenging due to the lack of correspondence across data sources. To address this problem, we here propose a novel multi-view latent-attention and dynamic discriminative model that jointly learns view-specific and view-shared sub-structures, where the former captures unique dynamics of each view whilst the latter encodes the interaction between the views. Furthermore, a novel multi-view latent-attention variational autoencoder model is introduced to learn the acquired features, enabling us to learn discriminative features in each view. Experimental results on the standard CRIM13 dataset and our multi-view Parkinson's Disease Mouse Behaviour (PDMB) dataset demonstrate that our proposed model outperforms other state-of-the-art technologies, has lower computational cost than the other graphical models, and effectively deals with the imbalanced-data problem.
21
Huang K, Han Y, Chen K, Pan H, Zhao G, Yi W, Li X, Liu S, Wei P, Wang L. A hierarchical 3D-motion learning framework for animal spontaneous behavior mapping. Nat Commun 2021; 12:2784. PMID: 33986265; PMCID: PMC8119960; DOI: 10.1038/s41467-021-22970-y.
Abstract
Animal behavior usually has a hierarchical structure and dynamics. Therefore, to understand how the neural system coordinates with behaviors, neuroscientists need a quantitative description of the hierarchical dynamics of different behaviors. However, recent end-to-end machine-learning-based methods for behavior analysis mostly focus on recognizing behavioral identities on a static timescale or based on limited observations. These approaches usually lose rich dynamic information on cross-scale behaviors. Here, inspired by the natural structure of animal behaviors, we address this challenge by proposing a parallel and multi-layered framework to learn the hierarchical dynamics and generate an objective metric to map behavior into the feature space. In addition, we characterize animal 3D kinematics with our low-cost and efficient multi-view 3D animal motion-capture system. Finally, we demonstrate that this framework can monitor spontaneous behavior and automatically identify the behavioral phenotypes of a transgenic animal disease model. The extensive experimental results suggest that our framework has a wide range of applications, including phenotyping animal disease models and modeling the relationships between neural circuits and behavior.
Grants
- This work was supported in part by Key Area R&D Program of Guangdong Province (2018B030338001 P.W., 2018B030331001 L.W.), National Key R&D Program of China (2018YFA0701403 P.W.), National Natural Science Foundation of China (NSFC 31500861 P.W., NSFC 31630031 L.W., NSFC 91732304 L.W., NSFC 31930047 L.W.), Chang Jiang Scholars Program (L.W.), the International Big Science Program Cultivating Project of CAS (172644KYS820170004 L.W.), the Strategic Priority Research Program of Chinese Academy of Science (XDB32030100, L.W.), the Youth Innovation Promotion Association of the Chinese Academy of Sciences (2017413 P.W.), CAS Key Laboratory of Brain Connectome and Manipulation (2019DP173024), Shenzhen Government Basic Research Grants (JCYJ20170411140807570 P.W., JCYJ20170413164535041 L.W.), Science, Technology and Innovation Commission of Shenzhen Municipality (JCYJ20160429185235132 K.H.), Helmholtz-CAS joint research grant (GJHZ1508 L.W.), Guangdong Provincial Key Laboratory of Brain Connectome and Behavior (2017B030301017 L.W.), the Ten Thousand Talent Program (L.W.), the Guangdong Special Support Program (L.W.), Key Laboratory of SIAT (2019DP173024 L.W.), Shenzhen Key Science and Technology Infrastructure Planning Project (ZDKJ20190204002 L.W.).
Affiliation(s)
- Kang Huang, Yaning Han, Ke Chen, Gaoyang Zhao, Wenling Yi, Xiaoxi Li, Pengfei Wei, Liping Wang
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; University of Chinese Academy of Sciences, Beijing, China
- Hongli Pan
- Shenzhen Key Lab of Neuropsychiatric Modulation and Collaborative Innovation Center for Brain Science, Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, CAS Center for Excellence in Brain Science and Intelligence Technology, Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Siyuan Liu
- Pennsylvania State University, University Park, PA, USA
22
Plum F, Labonte D. scAnt: an open-source platform for the creation of 3D models of arthropods (and other small objects). PeerJ 2021; 9:e11155. PMID: 33954036; PMCID: PMC8048404; DOI: 10.7717/peerj.11155.
Abstract
We present scAnt, an open-source platform for the creation of digital 3D models of arthropods and small objects. scAnt consists of a scanner and a Graphical User Interface, and enables the automated generation of Extended Depth Of Field images from multiple perspectives. These images are then masked with a novel automatic routine which combines random forest-based edge-detection, adaptive thresholding and connected component labelling. The masked images can then be processed further with a photogrammetry software package of choice, including open-source options such as Meshroom, to create high-quality, textured 3D models. We demonstrate how these 3D models can be rigged to enable realistic digital specimen posing, and introduce a novel simple yet effective method to include semi-realistic representations of approximately planar and transparent structures such as wings. As a result of the exclusive reliance on generic hardware components, rapid prototyping and open-source software, scAnt costs only a fraction of available comparable systems. The resulting accessibility of scAnt will (i) drive the development of novel and powerful methods for machine learning-driven behavioural studies, leveraging synthetic data; (ii) increase accuracy in comparative morphometric studies as well as extend the available parameter space with area and volume measurements; (iii) inspire novel forms of outreach; and (iv) aid in the digitisation efforts currently underway in several major natural history collections.
Affiliation(s)
- Fabian Plum
- Department of Bioengineering, Imperial College London, London, UK
- David Labonte
- Department of Bioengineering, Imperial College London, London, UK
23
Laursen SF, Hansen LS, Bahrndorff S, Nielsen HM, Noer NK, Renault D, Sahana G, Sørensen JG, Kristensen TN. Contrasting Manual and Automated Assessment of Thermal Stress Responses and Larval Body Size in Black Soldier Flies and Houseflies. Insects 2021; 12:380. PMID: 33922364; PMCID: PMC8146041; DOI: 10.3390/insects12050380.
Abstract
Within ecophysiological and genetic studies on insects, morphological and physiological traits are commonly assessed and phenotypes are typically obtained from manual measurements on numerous individuals. Manual observations are, however, time consuming, can introduce observer bias and are prone to human error. Here, we contrast results obtained from manual assessment of larval size and thermal tolerance traits in black soldier flies (Hermetia illucens) and houseflies (Musca domestica) that have been acclimated under three different temperature regimes with those obtained automatically using an image analysis software (Noldus EthoVision XT). We found that (i) larval size estimates of both species, obtained by manual weighing or by using the software, were highly correlated, (ii) measures of heat and cold tolerance using manual and automated approaches provided qualitatively similar results, and (iii) by using the software we obtained quantifiable information on stress responses and acclimation effects of potentially higher ecological relevance than the endpoint traits that are typically assessed when manual assessments are used. Based on these findings, we argue that automated assessment of insect stress responses and largescale phenotyping of morphological traits such as size will provide new opportunities within many disciplines where accurate and largescale phenotyping of insects is required.
Affiliation(s)
- Stine Frey Laursen, Simon Bahrndorff, Natasja Krog Noer
- Section of Biology and Environmental Science, Department of Chemistry and Bioscience, Aalborg University, Fredrik Bajers Vej 7H, 9220 Aalborg, Denmark
- Laura Skrubbeltrang Hansen, Hanne Marie Nielsen, Goutam Sahana
- Center for Quantitative Genetics and Genomics, Faculty of Technical Sciences, Aarhus University, Blichers Allé 20, 8830 Tjele, Denmark
- David Renault
- University of Rennes, CNRS, ECOBIO (Ecosystèmes, Biodiversité, Evolution) UMR 6553, Rennes, France; Institut Universitaire de France, 1 Rue Descartes, CEDEX 05, 75231 Paris, France
- Jesper Givskov Sørensen
- Section for Genetics, Ecology and Evolution, Department of Biology, Aarhus University, Ny Munkegade 116, 8000 Aarhus C, Denmark
- Torsten Nygaard Kristensen
- Section of Biology and Environmental Science, Department of Chemistry and Bioscience, Aalborg University, Fredrik Bajers Vej 7H, 9220 Aalborg, Denmark; Department of Agroecology, Aarhus University, Blichers Allé 20, 8830 Tjele, Denmark
24
Brattoli B, Büchler U, Dorkenwald M, Reiser P, Filli L, Helmchen F, Wahl AS, Ommer B. Unsupervised behaviour analysis and magnification (uBAM) using deep learning. Nat Mach Intell 2021. DOI: 10.1038/s42256-021-00326-x.
25
Walter T, Couzin ID. TRex, a fast multi-animal tracking system with markerless identification, and 2D estimation of posture and visual fields. eLife 2021; 10:e64000. PMID: 33634789; PMCID: PMC8096434; DOI: 10.7554/elife.64000.
Abstract
Automated visual tracking of animals is rapidly becoming an indispensable tool for the study of behavior. It offers a quantitative methodology by which organisms' sensing and decision-making can be studied in a wide range of ecological contexts. Despite this, existing solutions tend to be challenging to deploy in practice, especially when considering long and/or high-resolution video-streams. Here, we present TRex, a fast and easy-to-use solution for tracking large numbers of individuals simultaneously using background subtraction, with real-time (60 Hz) tracking performance for up to approximately 256 individuals. TRex also estimates 2D visual fields, outlines, and head/rear positions of bilateral animals, in both open- and closed-loop contexts. Additionally, TRex offers highly accurate, deep-learning-based visual identification of up to approximately 100 unmarked individuals, where it is between 2.5 and 46.7 times faster, and requires 2–10 times less memory, than comparable software (with relative performance increasing for more organisms/longer videos), and provides interactive data exploration within an intuitive, platform-independent graphical user interface.
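The background-subtraction stage such trackers build on can be sketched in a few lines: estimate a static background as the per-pixel median over frames, then locate an animal as the centroid of pixels that deviate from it. This is a toy illustration on greyscale frames stored as nested lists, not TRex's optimized implementation:

```python
def median_background(frames):
    """Per-pixel median over a stack of greyscale frames (lists of lists)."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[sorted(f[r][c] for f in frames)[len(frames) // 2]
             for c in range(w)] for r in range(h)]

def detect_centroid(frame, bg, thresh):
    """Centroid (row, col) of pixels differing from background by > thresh,
    or None if no foreground pixel is found."""
    pts = [(r, c)
           for r, row in enumerate(frame)
           for c, v in enumerate(row)
           if abs(v - bg[r][c]) > thresh]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
```

Per-individual identity assignment across frames, which the paper's deep-learning module handles, is the hard part that this sketch omits.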
Affiliation(s)
- Tristan Walter
- Max Planck Institute of Animal Behavior, Radolfzell, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Konstanz, Germany; Department of Biology, University of Konstanz, Konstanz, Germany
- Iain D Couzin
- Max Planck Institute of Animal Behavior, Radolfzell, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Konstanz, Germany; Department of Biology, University of Konstanz, Konstanz, Germany
26
Improved 3D tracking and automated classification of rodents' behavioral activity using depth-sensing cameras. Behav Res Methods 2021; 52:2156-2167. PMID: 32232737; DOI: 10.3758/s13428-020-01381-9.
Abstract
Analysis of rodents' behavior/activity is of fundamental importance in many research fields. However, many behavioral experiments still rely on manual scoring, with obvious problems in reproducibility. Despite important advances in video-analysis systems and computational ethology, automated behavior quantification is still a challenge. The need for large training datasets, background stability requirements, and reduction to two-dimensional analysis (impairing full posture characterization), limit their use. Here we present a novel integrated solution for behavioral analysis of individual rats, combining video segmentation, tracking of body parts, and automated classification of behaviors, using machine learning and computer vision methods. Low-cost depth cameras (RGB-D) are used to enable three-dimensional tracking and classification in dark conditions and absence of color contrast. Our solution automatically tracks five anatomical landmarks in dynamic environments and recognizes seven distinct behaviors, within the accuracy range of human annotations. The developed free software was validated in experiments where behavioral differences between Wistar Kyoto and Wistar rats were automatically quantified. The results reveal the capability for effective automated phenotyping. An extended annotated RGB-D dataset is also made publicly available. The proposed solution is an easy-to-use tool, with low-cost setup and powerful 3D segmentation methods (in static/dynamic environments). The ability to work in dark conditions means that natural animal behavior is not affected by recording lights. Furthermore, automated classification is possible with only ~30 minutes of annotated videos. By creating conditions for high-throughput analysis and reproducible quantitative measurements of animal behavior experiments, we believe this contribution can greatly improve behavioral analysis research.
27
Leng X, Wohl M, Ishii K, Nayak P, Asahina K. Quantifying influence of human choice on the automated detection of Drosophila behavior by a supervised machine learning algorithm. PLoS One 2020; 15:e0241696. PMID: 33326445; PMCID: PMC7743940; DOI: 10.1371/journal.pone.0241696.
Abstract
Automated quantification of behavior is increasingly prevalent in neuroscience research. Human judgments can influence machine-learning-based behavior classification at multiple steps in the process, for both supervised and unsupervised approaches. Such steps include the design of the algorithm for machine learning, the methods used for animal tracking, the choice of training images, and the benchmarking of classification outcomes. However, how these design choices contribute to the interpretation of automated behavioral classifications has not been extensively characterized. Here, we quantify the effects of experimenter choices on the outputs of automated classifiers of Drosophila social behaviors. Drosophila behaviors contain a considerable degree of variability, which was reflected in the confidence levels associated with both human and computer classifications. We found that a diversity of sex combinations and tracking features was important for robust performance of the automated classifiers. In particular, features concerning the relative position of flies contained useful information for training a machine-learning algorithm. These observations shed light on the importance of human influence on tracking algorithms, the selection of training images, and the quality of annotated sample images used to benchmark the performance of a classifier (the ‘ground truth’). Evaluation of these factors is necessary for researchers to accurately interpret behavioral data quantified by a machine-learning algorithm and to further improve automated classifications.
Affiliation(s)
- Xubo Leng
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, California, United States of America
- Margot Wohl
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Neuroscience Graduate Program, University of California, San Diego, La Jolla, California, United States of America
- Kenichi Ishii
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Pavan Nayak
- Salk Institute for Biological Studies, La Jolla, California, United States of America
- Kenta Asahina
- Salk Institute for Biological Studies, La Jolla, California, United States of America
28
Pereira TD, Shaevitz JW, Murthy M. Quantifying behavior to understand the brain. Nat Neurosci 2020; 23:1537-1549. PMID: 33169033; PMCID: PMC7780298; DOI: 10.1038/s41593-020-00734-z.
Abstract
Over the past few years, numerous methods have emerged to automate the quantification of animal behavior at a resolution not previously imaginable. This has opened up a new field of computational ethology and will, in the near future, make it possible to quantify in near completeness what an animal is doing as it navigates its environment. The importance of improving the techniques with which we characterize behavior is reflected in the emerging recognition that understanding behavior is an essential (or even prerequisite) step to pursuing neuroscience questions. The use of these methods, however, is not limited to studying behavior in the wild or in strictly ethological settings. Modern tools for behavioral quantification can be applied to the full gamut of approaches that have historically been used to link brain to behavior, from psychophysics to cognitive tasks, augmenting those measurements with rich descriptions of how animals navigate those tasks. Here we review recent technical advances in quantifying behavior, particularly in methods for tracking animal motion and characterizing the structure of those dynamics. We discuss open challenges that remain for behavioral quantification and highlight promising future directions, with a strong emphasis on emerging approaches in deep learning, the core technology that has enabled the markedly rapid pace of progress of this field. We then discuss how quantitative descriptions of behavior can be leveraged to connect brain activity with animal movements, with the ultimate goal of resolving the relationship between neural circuits, cognitive processes and behavior.
Affiliation(s)
- Talmo D Pereira
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Joshua W Shaevitz
- Department of Physics, Princeton University, Princeton, NJ, USA
- Lewis-Sigler Institute, Princeton University, Princeton, NJ, USA
- Mala Murthy
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
29
Johnson ZV, Arrojwala MTS, Aljapur V, Lee T, Lancaster TJ, Lowder MC, Gu K, Stockert JI, Lecesne RL, Moorman JM, Streelman JT, McGrath PT. Automated measurement of long-term bower behaviors in Lake Malawi cichlids using depth sensing and action recognition. Sci Rep 2020; 10:20573. PMID: 33239639; PMCID: PMC7688978; DOI: 10.1038/s41598-020-77549-2.
Abstract
In the wild, behaviors are often expressed over long time periods in complex and dynamic environments, and many behaviors include direct interaction with the environment itself. However, measuring behavior in naturalistic settings is difficult, and this has limited progress in understanding the mechanisms underlying many naturally evolved behaviors that are critical for survival and reproduction. Here we describe an automated system for measuring long-term bower construction behaviors in Lake Malawi cichlid fishes, in which males use their mouths to sculpt sand into large species-specific structures for courtship and mating. We integrate two orthogonal methods, depth sensing and action recognition, to simultaneously track the developing bower structure and the thousands of individual sand manipulation behaviors performed throughout construction. By registering these two data streams, we show that behaviors can be topographically mapped onto a dynamic 3D sand surface through time. The system runs reliably in multiple species, across many aquariums simultaneously, and for up to weeks at a time. Using this system, we show strong differences in construction behavior and bower form that reflect species differences in nature, and we gain new insights into the spatial, temporal, and social dimensions of bower construction, feeding, and quivering behaviors. Taken together, our work highlights how low-cost tools can automatically quantify behavior in naturalistic and social environments over long timescales in the lab.
Affiliation(s)
- Zachary V Johnson
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Vineeth Aljapur
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Tyrone Lee
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Tucker J Lancaster
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Interdisciplinary Graduate Program in Quantitative Biosciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Mark C Lowder
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Karen Gu
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Joseph I Stockert
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Rachel L Lecesne
- Parker H. Petit Institute of Bioengineering and Bioscience, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Jean M Moorman
- Parker H. Petit Institute of Bioengineering and Bioscience, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Jeffrey T Streelman
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Interdisciplinary Graduate Program in Quantitative Biosciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Patrick T McGrath
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Interdisciplinary Graduate Program in Quantitative Biosciences, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- Department of Computer Science, Georgia Institute of Technology, Atlanta, GA, 30332, USA
- School of Physics, Georgia Institute of Technology, Atlanta, GA, 30332, USA
30
Moulin TC, Covill LE, Itskov PM, Williams MJ, Schiöth HB. Rodent and fly models in behavioral neuroscience: An evaluation of methodological advances, comparative research, and future perspectives. Neurosci Biobehav Rev 2020; 120:1-12. PMID: 33242563; DOI: 10.1016/j.neubiorev.2020.11.014.
Abstract
The assessment of behavioral outcomes is a central component of neuroscientific research, which has required continuous technological innovations to produce more detailed and reliable findings. In this article, we provide an in-depth review on the progress and future implications for three model organisms (mouse, rat, and Drosophila) essential to our current understanding of behavior. By compiling a comprehensive catalog of popular assays, we are able to compare the diversity of tasks and usage of these animal models in behavioral research. This compilation also allows for the evaluation of existing state-of-the-art methods and experimental applications, including optogenetics, machine learning, and high-throughput behavioral assays. We go on to discuss novel apparatuses and inter-species analyses for centrophobism, feeding behavior, aggression and mating paradigms, with the goal of providing a unique view on comparative behavioral research. The challenges and recent advances are evaluated in terms of their translational value, ethical procedures, and trustworthiness for behavioral research.
Affiliation(s)
- Thiago C Moulin
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Laura E Covill
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala, Sweden; Center for Hematology and Regenerative Medicine, Karolinska Institutet, Stockholm, Sweden
- Pavel M Itskov
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala, Sweden; Department of Pharmacology, Institute of Pharmacy, Sechenov First Moscow State Medical University, Moscow, Russia; Champalimaud Centre for the Unknown, Lisbon, Portugal
- Michael J Williams
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Helgi B Schiöth
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala, Sweden; Institute for Translational Medicine and Biotechnology, Sechenov First Moscow State Medical University, Moscow, Russia
31
Gal A, Saragosti J, Kronauer DJC. anTraX, a software package for high-throughput video tracking of color-tagged insects. eLife 2020; 9:e58145. PMID: 33211008; PMCID: PMC7676868; DOI: 10.7554/eLife.58145.
Abstract
Recent years have seen a surge in methods to track and analyze animal behavior. Nevertheless, tracking individuals in closely interacting, group-living organisms remains a challenge. Here, we present anTraX, an algorithm and software package for high-throughput video tracking of color-tagged insects. anTraX combines neural network classification of animals with a novel approach for representing tracking data as a graph, enabling individual tracking even in cases where it is difficult to segment animals from one another, or where tags are obscured. The use of color tags, a well-established and robust method for marking individual insects in groups, relaxes requirements for image size and quality, and makes the software broadly applicable. anTraX is readily integrated into existing tools and methods for automated image analysis of behavior to further augment its output. anTraX can handle large-scale experiments with minimal human involvement, allowing researchers to simultaneously monitor many social groups over long time periods.
Affiliation(s)
- Asaf Gal
- Laboratory of Social Evolution and Behavior, The Rockefeller University, New York, United States
- Jonathan Saragosti
- Laboratory of Social Evolution and Behavior, The Rockefeller University, New York, United States
- Daniel JC Kronauer
- Laboratory of Social Evolution and Behavior, The Rockefeller University, New York, United States
32
Bentzur A, Ben-Shaanan S, Benichou JIC, Costi E, Levi M, Ilany A, Shohat-Ophir G. Early Life Experience Shapes Male Behavior and Social Networks in Drosophila. Curr Biol 2020; 31:486-501.e3. PMID: 33186552; DOI: 10.1016/j.cub.2020.10.060.
Abstract
Living in a group creates a complex and dynamic environment in which behavior of individuals is influenced by and affects the behavior of others. Although social interaction and group living are fundamental adaptations exhibited by many organisms, little is known about how prior social experience, internal states, and group composition shape behavior in groups. Here, we present an analytical framework for studying the interplay between social experience and group interaction in Drosophila melanogaster. We simplified the complexity of interactions in a group using a series of experiments in which we controlled the social experience and motivational states of individuals to compare behavioral patterns and social networks of groups under different conditions. We show that social enrichment promotes the formation of distinct group structure that is characterized by high network modularity, high inter-individual and inter-group variance, high inter-individual coordination, and stable social clusters. Using environmental and genetic manipulations, we show that visual cues and cVA-sensing neurons are necessary for the expression of social interaction and network structure in groups. Finally, we explored the formation of group behavior and structure in heterogenous groups composed of flies with distinct internal states and documented emergent structures that are more than the sum of the individuals that constitute them. Our results demonstrate that fruit flies exhibit complex and dynamic social structures that are modulated by the experience and composition of different individuals within the group. This paves the way for using simple model organisms to dissect the neurobiology of behavior in complex social environments.
Affiliation(s)
- Assa Bentzur
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel
- Shir Ben-Shaanan
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel
- Jennifer I C Benichou
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel
- Eliezer Costi
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel
- Mali Levi
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel
- Amiyaal Ilany
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel
- Galit Shohat-Ophir
- The Mina & Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan 5290002, Israel; The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan 5290002, Israel
33
Smith JE, Pinter-Wollman N. Observing the unwatchable: Integrating automated sensing, naturalistic observations and animal social network analysis in the age of big data. J Anim Ecol 2020; 90:62-75. PMID: 33020914; DOI: 10.1111/1365-2656.13362.
Abstract
In the 4.5 decades since Altmann (1974) published her seminal paper on the methods for the observational study of behaviour, automated detection and analysis of social interaction networks have fundamentally transformed the ways that ecologists study social behaviour. Methodological developments for collecting data remotely on social behaviour involve indirect inference of associations, direct recordings of interactions and machine vision. These recent technological advances are improving the scale and resolution with which we can dissect interactions among animals. They are also revealing new intricacies of animal social interactions at spatial and temporal resolutions as well as in ecological contexts that have been hidden from humans, making the unwatchable seeable. We first outline how these technological applications are permitting researchers to collect exquisitely detailed information with little observer bias. We further recognize new emerging challenges from these new reality-mining approaches. While technological advances in automating data collection and its analysis are moving at an unprecedented rate, we urge ecologists to thoughtfully combine these new tools with classic behavioural and ecological monitoring methods to place our understanding of animal social networks within fundamental biological contexts.
Affiliation(s)
- Noa Pinter-Wollman
- Department of Ecology and Evolutionary Biology, University of California Los Angeles, Los Angeles, CA, USA
34
Klibaite U, Shaevitz JW. Paired fruit flies synchronize behavior: Uncovering social interactions in Drosophila melanogaster. PLoS Comput Biol 2020; 16:e1008230. PMID: 33021989; PMCID: PMC7567355; DOI: 10.1371/journal.pcbi.1008230.
Abstract
Social behaviors are ubiquitous and crucial to an animal's survival and success. The behaviors an animal performs in a social setting are affected by internal factors, inputs from the environment, and interactions with others. To quantify social behaviors, we need to measure both the stochastic nature of the behavior of isolated individuals and how this behavioral repertoire changes as a function of the environment and interactions between individuals. We probed the behavior of male and female fruit flies in a circular arena as individuals and within all possible pairings. By combining measurements of the animals' position in the arena with an unsupervised analysis of their behaviors, we define the effects of position in the environment and the presence of a partner on locomotion, grooming, singing, and other behaviors that make up an animal's repertoire. We find that geometric context tunes behavioral preference, pairs of animals synchronize their behavioral preferences across shared trials, and paired individuals display signatures of behavioral mimicry.
Affiliation(s)
- Ugne Klibaite
- Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, USA
- Joshua W Shaevitz
- Department of Physics, Princeton University, Princeton, New Jersey, USA
35
Salem G, Krynitsky J, Cubert N, Pu A, Anfinrud S, Pedersen J, Lehman J, Kanuri A, Pohida T. Digital video recorder for Raspberry Pi cameras with multi-camera synchronous acquisition. HardwareX 2020; 8:e00160. PMID: 35498233; PMCID: PMC9041262; DOI: 10.1016/j.ohx.2020.e00160.
Abstract
Video acquisition and analysis have become integral parts of scientific research. Two major components of a video acquisition system are the choice of camera and the acquisition software. A vast variety of cameras are available on the market. Turnkey multi-camera synchronous acquisition software, however, is not as widely available. For prototyping applications, the Raspberry Pi (RPi) has been widely utilized due to many factors, including cost. There are implementations for video acquisition and preview from a single RPi camera, including one implementation released by the RPi organization itself. However, there are no multi-camera acquisition solutions for the RPi. This paper presents an open-source digital video recorder (DVR) system for the popular RPi camera. The DVR is simple to setup and use for acquisition with a single camera or multiple cameras. In the case of multiple cameras, the acquisition is synchronized between cameras. The DVR comes with a graphical user interface (GUI) to allow previewing the camera streams, setting recording parameters, and associating "names" to cameras. The acquisition code as well as the DVR GUI are written in Python. The open-source software also includes a GUI for playback of recorded video. The versatility of the DVR is demonstrated with a life science research application involving high-throughput monitoring of fruit-flies.
Affiliation(s)
- Ghadi Salem
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Jonathan Krynitsky
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Noah Cubert
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Alex Pu
- Division of Veterinary Services, Center for Biologics Evaluation and Research, U. S. Food and Drug Administration, USA
- Simeon Anfinrud
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Jonathan Pedersen
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Joshua Lehman
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Ajith Kanuri
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
- Thomas Pohida
- Signal Processing and Instrumentation Section, Office of Intramural Research, Center for Information Technology, National Institutes of Health, USA
36
Long L, Johnson ZV, Li J, Lancaster TJ, Aljapur V, Streelman JT, McGrath PT. Automatic Classification of Cichlid Behaviors Using 3D Convolutional Residual Networks. iScience 2020; 23:101591. PMID: 33083750; PMCID: PMC7553349; DOI: 10.1016/j.isci.2020.101591.
Abstract
Many behaviors that are critical for survival and reproduction are expressed over extended time periods. The ability to inexpensively record and store large volumes of video data creates new opportunities to understand the biological basis of these behaviors and simultaneously creates a need for tools that can automatically quantify behaviors from large video datasets. Here, we demonstrate that 3D Residual Networks can be used to classify an array of complex behaviors in Lake Malawi cichlid fishes. We first apply pixel-based hidden Markov modeling combined with density-based spatiotemporal clustering to identify sand disturbance events. After this, a 3D ResNet, trained on 11,000 manually annotated video clips, accurately (>76%) classifies the sand disturbance events into 10 fish behavior categories, distinguishing between spitting, scooping, fin swipes, and spawning. Furthermore, animal intent can be determined from these clips, as spits and scoops performed during bower construction are classified independently from those during feeding.
Affiliation(s)
- Lijiang Long
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA; Interdisciplinary Graduate Program in Quantitative Biosciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Zachary V Johnson
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Junyu Li
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Tucker J Lancaster
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA; Interdisciplinary Graduate Program in Quantitative Biosciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Vineeth Aljapur
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Jeffrey T Streelman
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA; Parker H. Petit Institute of Bioengineering and Bioscience, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Patrick T McGrath
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA; Parker H. Petit Institute of Bioengineering and Bioscience, Georgia Institute of Technology, Atlanta, GA 30332, USA; School of Physics, Georgia Institute of Technology, Atlanta, GA 30332, USA
37
Osborne B, Bakula D, Ben Ezra M, Dresen C, Hartmann E, Kristensen SM, Mkrtchyan GV, Nielsen MH, Petr MA, Scheibye-Knudsen M. New methodologies in ageing research. Ageing Res Rev 2020; 62:101094. PMID: 32512174; DOI: 10.1016/j.arr.2020.101094.
Abstract
Ageing is arguably the most complex phenotype that occurs in humans. To understand and treat ageing as well as associated diseases, highly specialised technologies are emerging that reveal critical insight into the underlying mechanisms and provide new hope for previously untreated diseases. Herein, we describe the latest developments in cutting edge technologies applied across the field of ageing research. We cover emerging model organisms, high-throughput methodologies and machine-driven approaches. In all, this review will give you a glimpse of what will be pushing the field onwards and upwards.
Affiliation(s)
- Brenna Osborne
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Daniela Bakula
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Michael Ben Ezra
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Charlotte Dresen
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Esben Hartmann
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Stella M Kristensen
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Garik V Mkrtchyan
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Malte H Nielsen
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Michael A Petr
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
- Morten Scheibye-Knudsen
- Center for Healthy Aging, Department of Cellular and Molecular Medicine, University of Copenhagen, Copenhagen, Denmark
38
Goodwin NL, Nilsson SRO, Golden SA. Rage Against the Machine: Advancing the study of aggression ethology via machine learning. Psychopharmacology (Berl) 2020; 237:2569-2588. PMID: 32647898; PMCID: PMC7502501; DOI: 10.1007/s00213-020-05577-x.
Abstract
RATIONALE: Aggression, comorbid with neuropsychiatric disorders, manifests with diverse clinical presentations and places a significant burden on patients, caregivers, and society. This diversity is observed because aggression is a complex behavior that can be ethologically demarcated as either appetitive (rewarding) or reactive (defensive), each with its own behavioral characteristics, functionality, and neural basis that may transition from adaptive to maladaptive depending on genetic and environmental factors. There has been a recent surge in the development of preclinical animal models for studying appetitive aggression-related behaviors and identifying the neural mechanisms guiding their progression and expression. However, adoption of these procedures is often impeded by the arduous task of manually scoring complex social interactions. Manual observations are generally susceptible to observer drift, long analysis times, and poor inter-rater reliability, and are further incompatible with the sampling frequencies required of modern neuroscience methods.
OBJECTIVES: In this review, we discuss recent advances in the preclinical study of appetitive aggression in mice, paired with our perspective on the potential for machine learning techniques in producing automated, robust scoring of aggressive social behavior. We discuss critical considerations for implementing valid computer classifications within behavioral pharmacological studies.
KEY RESULTS: Open-source automated classification platforms can match or exceed the performance of human observers while removing the confounds of observer drift, bias, and inter-rater reliability. Furthermore, unsupervised approaches can identify previously uncharacterized aggression-related behavioral repertoires in model species.
DISCUSSION AND CONCLUSIONS: Advances in open-source computational approaches hold promise for overcoming current manual annotation caveats while also introducing and generalizing computational neuroethology to the greater behavioral neuroscience community. We propose that currently available open-source approaches are sufficient for overcoming the main limitations preventing wide adoption of machine learning within the context of preclinical aggression behavioral research.
Affiliation(s)
- Nastacia L Goodwin
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Graduate Program in Neuroscience, University of Washington, Seattle, WA, USA
- Simon R O Nilsson
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Sam A Golden
- Department of Biological Structure, University of Washington, Seattle, WA, USA
- Graduate Program in Neuroscience, University of Washington, Seattle, WA, USA
- Center of Excellence in Neurobiology of Addiction, Pain, and Emotion (NAPE), University of Washington, Seattle, WA, USA

39
Jensen GW, van der Smagt P, Heiss E, Straka H, Kohl T. SnakeStrike: A Low-Cost Open-Source High-Speed Multi-Camera Motion Capture System. Front Behav Neurosci 2020; 14:116. [PMID: 32848652 PMCID: PMC7416652 DOI: 10.3389/fnbeh.2020.00116] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2020] [Accepted: 06/10/2020] [Indexed: 11/13/2022] Open
Abstract
Current neuroethological experiments require sophisticated technologies to precisely quantify the behavior of animals. In many studies, solutions for video recording and subsequent tracking of animal behavior form a major bottleneck. Three-dimensional (3D) tracking systems have been available for a few years but are usually very expensive and rarely include very high-speed cameras; access to these systems for research is limited. Additionally, establishing custom-built software is often time consuming – especially for researchers without high-performance programming and computer vision expertise. Here, we present an open-source software framework that allows researchers to utilize low-cost high-speed cameras in their research for a fraction of the cost of commercial systems. This software handles the recording of synchronized high-speed video from multiple cameras, the offline 3D reconstruction of that video, and a viewer for the triangulated data, all functions previously available only as separate applications. It supports researchers with a performance-optimized suite of functions that encompass the entirety of data collection and decreases processing time for high-speed 3D position tracking on a variety of animals, including snakes. Motion capture in snakes can be particularly demanding since a strike can be as short as 50 ms, literally twice as fast as the blink of an eye. This is too fast for faithful recording by most commercial tracking systems and therefore represents a challenging test for our software's quantification of animal behavior. Therefore, we conducted a case study investigating snake strike speed to showcase the use and integration of the software in an existing experimental setup.
Affiliation(s)
- Grady W Jensen
- Graduate School of Systemic Neurosciences (GSN-LMU), Ludwig-Maximilians-University Munich, Munich, Germany
- argmax.ai, Volkswagen Group Machine Learning Research Lab, Munich, Germany
- Patrick van der Smagt
- Graduate School of Systemic Neurosciences (GSN-LMU), Ludwig-Maximilians-University Munich, Munich, Germany
- argmax.ai, Volkswagen Group Machine Learning Research Lab, Munich, Germany
- Department of Artificial Intelligence, Faculty of Informatics, Eötvös Lórand University, Budapest, Hungary
- Egon Heiss
- Institute of Zoology and Evolutionary Research, Friedrich-Schiller-University of Jena, Jena, Germany
- Hans Straka
- Graduate School of Systemic Neurosciences (GSN-LMU), Ludwig-Maximilians-University Munich, Munich, Germany
- Department Biology II, Ludwig-Maximilians-University Munich, Munich, Germany
- Tobias Kohl
- Chair of Zoology, Technical University of Munich, Freising, Germany

40
Alameer A, Kyriazakis I, Bacardit J. Automated recognition of postures and drinking behaviour for the detection of compromised health in pigs. Sci Rep 2020; 10:13665. [PMID: 32788633 PMCID: PMC7423952 DOI: 10.1038/s41598-020-70688-6] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2020] [Accepted: 07/30/2020] [Indexed: 11/12/2022] Open
Abstract
Changes in pig behaviours are a useful aid in detecting early signs of compromised health and welfare. In commercial settings, automatic detection of pig behaviours through visual imaging remains a challenge due to demanding farm conditions, e.g., occlusion of one pig by another. Here, two deep learning-based detector methods were developed to identify pig postures and drinking behaviours of group-housed pigs. We first tested the system's ability to detect changes in these measures at the group level during routine management. We then demonstrated the ability of our automated methods to identify behaviours of individual animals with a mean average precision of 0.989 ± 0.009, under a variety of settings. When the pig feeding regime was disrupted, we automatically detected the expected deviations from the daily feeding routine in standing, lateral lying and drinking behaviours. These experiments demonstrate that the method is capable of robustly and accurately monitoring individual pig behaviours under commercial conditions, without the need for additional sensors or individual pig identification, hence providing a scalable technology to improve the health and well-being of farm animals. The method has the potential to transform how livestock are monitored and address issues in livestock farming, such as targeted treatment of individuals with medication.
Affiliation(s)
- Ali Alameer
- School of Natural and Environmental Sciences, Newcastle University, Newcastle Upon Tyne, NE1 7RU, UK
- School of Computing, Newcastle University, Newcastle Upon Tyne, NE4 5TG, UK
- Ilias Kyriazakis
- Institute for Global Food Security, Queen's University, Belfast, BT9 5DL, UK
- Jaume Bacardit
- School of Computing, Newcastle University, Newcastle Upon Tyne, NE4 5TG, UK

41
Escobedo R, Lecheval V, Papaspyros V, Bonnet F, Mondada F, Sire C, Theraulaz G. A data-driven method for reconstructing and modelling social interactions in moving animal groups. Philos Trans R Soc Lond B Biol Sci 2020; 375:20190380. [PMID: 32713309 DOI: 10.1098/rstb.2019.0380] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Group-living organisms that collectively migrate range from cells and bacteria to human crowds, and include swarms of insects, schools of fish, and flocks of birds or ungulates. Unveiling the behavioural and cognitive mechanisms by which these groups coordinate their movements is a challenging task. These mechanisms take place at the individual scale and can be described as a combination of interactions between individuals and interactions between these individuals and the physical obstacles in the environment. Thanks to the development of novel tracking techniques that provide large and accurate datasets, the main characteristics of individual and collective behavioural patterns can be quantified with an unprecedented level of precision. However, in many studies, social interactions are described by force-map methods that offer only limited explanatory and predictive power and are rarely suitable for direct implementation in a concise and explicit mathematical model. Here, we present a general method to extract the interactions between individuals that are involved in the coordination of collective movements in groups of organisms. We then apply this method to characterize social interactions in two species of shoaling fish, the rummy-nose tetra (Hemigrammus rhodostomus) and the zebrafish (Danio rerio), which both present a burst-and-coast motion. From the detailed quantitative description of individual-level interactions, it is thus possible to develop a quantitative model of the emergent dynamics observed at the group level, whose predictions can be checked against experimental results. This method can be applied to a wide range of biological and social systems. This article is part of the theme issue 'Multi-scale analysis and modelling of collective migration in biological systems'.
Affiliation(s)
- R Escobedo
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), Centre National de la Recherche Scientifique (CNRS) & Université de Toulouse - Paul Sabatier, 31062 Toulouse, France
- V Lecheval
- Department of Biology, University of York, York YO10 5DD, UK
- V Papaspyros
- MOBOTS group, Biorobotics laboratory, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
- F Bonnet
- MOBOTS group, Biorobotics laboratory, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
- F Mondada
- MOBOTS group, Biorobotics laboratory, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
- C Sire
- Laboratoire de Physique Théorique, Centre National de la Recherche Scientifique (CNRS) & Université de Toulouse - Paul Sabatier, 31062 Toulouse, France
- G Theraulaz
- Centre de Recherches sur la Cognition Animale, Centre de Biologie Intégrative (CBI), Centre National de la Recherche Scientifique (CNRS) & Université de Toulouse - Paul Sabatier, 31062 Toulouse, France
- Centre for Ecological Sciences, Indian Institute of Science, Bengaluru, India

42
Francisco FA, Nührenberg P, Jordan A. High-resolution, non-invasive animal tracking and reconstruction of local environment in aquatic ecosystems. MOVEMENT ECOLOGY 2020; 8:27. [PMID: 32582448 PMCID: PMC7310323 DOI: 10.1186/s40462-020-00214-w] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/26/2020] [Accepted: 05/26/2020] [Indexed: 05/19/2023]
Abstract
BACKGROUND Acquiring high resolution quantitative behavioural data underwater often involves installation of costly infrastructure, or capture and manipulation of animals. Aquatic movement ecology can therefore be limited in taxonomic range and ecological coverage. METHODS Here we present a novel deep-learning based, multi-individual tracking approach, which incorporates Structure-from-Motion in order to determine the 3D location, body position and the visual environment of every recorded individual. The application is based on low-cost cameras and does not require the animals to be confined, manipulated, or handled in any way. RESULTS Using this approach, single individuals, small heterospecific groups and schools of fish were tracked in freshwater and marine environments of varying complexity. Positional tracking errors as low as 1.09 ± 0.47 cm (RMSE) in underwater areas up to 500 m² were recorded. CONCLUSIONS This cost-effective and open-source framework allows the analysis of animal behaviour in aquatic systems at an unprecedented resolution. Implementing this versatile approach, quantitative behavioural analysis can be employed in a wide range of natural contexts, vastly expanding our potential for examining non-model systems and species.
Affiliation(s)
- Fritz A Francisco
- Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Universitätsstraße 10, Konstanz, 78457 Germany
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, Universitätsstraße 10, Konstanz, 78457 Germany
- Department of Biology, University of Konstanz, Universitätsstraße 10, Konstanz, 78457 Germany
- Paul Nührenberg
- Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Universitätsstraße 10, Konstanz, 78457 Germany
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, Universitätsstraße 10, Konstanz, 78457 Germany
- Department of Biology, University of Konstanz, Universitätsstraße 10, Konstanz, 78457 Germany
- Alex Jordan
- Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Universitätsstraße 10, Konstanz, 78457 Germany
- Department of Collective Behavior, Max Planck Institute of Animal Behavior, Universitätsstraße 10, Konstanz, 78457 Germany
- Department of Biology, University of Konstanz, Universitätsstraße 10, Konstanz, 78457 Germany

43
Pilkiewicz KR, Lemasson BH, Rowland MA, Hein A, Sun J, Berdahl A, Mayo ML, Moehlis J, Porfiri M, Fernández-Juricic E, Garnier S, Bollt EM, Carlson JM, Tarampi MR, Macuga KL, Rossi L, Shen CC. Decoding collective communications using information theory tools. J R Soc Interface 2020; 17:20190563. [PMID: 32183638 PMCID: PMC7115225 DOI: 10.1098/rsif.2019.0563] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2019] [Accepted: 02/28/2020] [Indexed: 02/03/2023] Open
Abstract
Organisms have evolved sensory mechanisms to extract pertinent information from their environment, enabling them to assess their situation and act accordingly. For social organisms travelling in groups, like the fish in a school or the birds in a flock, sharing information can further improve their situational awareness and reaction times. Data on the benefits and costs of social coordination, however, have largely allowed our understanding of why collective behaviours have evolved to outpace our mechanistic knowledge of how they arise. Recent studies have begun to correct this imbalance through fine-scale analyses of group movement data. One approach that has received renewed attention is the use of information theoretic (IT) tools like mutual information, transfer entropy and causation entropy, which can help identify causal interactions in the type of complex, dynamical patterns often on display when organisms act collectively. Yet, there is a communications gap between studies focused on the ecological constraints and solutions of collective action with those demonstrating the promise of IT tools in this arena. We attempt to bridge this divide through a series of ecologically motivated examples designed to illustrate the benefits and challenges of using IT tools to extract deeper insights into the interaction patterns governing group-level dynamics. We summarize some of the approaches taken thus far to circumvent existing challenges in this area and we conclude with an optimistic, yet cautionary perspective.
Affiliation(s)
- K. R. Pilkiewicz
- Environmental Laboratory, U.S. Army Engineer Research and Development Center (EL-ERDC), Vicksburg, MS, USA
- M. A. Rowland
- Environmental Laboratory, U.S. Army Engineer Research and Development Center (EL-ERDC), Vicksburg, MS, USA
- A. Hein
- National Oceanic and Atmospheric Administration, Santa Cruz, CA, USA
- University of California, Santa Cruz, CA, USA
- J. Sun
- Department of Mathematics, Clarkson University, Potsdam, NY, USA
- A. Berdahl
- School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA, USA
- M. L. Mayo
- Environmental Laboratory, U.S. Army Engineer Research and Development Center (EL-ERDC), Vicksburg, MS, USA
- J. Moehlis
- Department of Mechanical Engineering, University of California, Santa Barbara, CA, USA
- M. Porfiri
- Department of Mechanical and Aerospace Engineering and Department of Biomedical Engineering, New York University Tandon School of Engineering, Brooklyn, NY, USA
- S. Garnier
- Department of Biological Sciences, New Jersey Institute of Technology, Newark, NJ, USA
- E. M. Bollt
- Department of Mathematics, Clarkson University, Potsdam, NY, USA
- J. M. Carlson
- Department of Physics, University of California, Santa Barbara, CA, USA
- M. R. Tarampi
- Department of Psychology, University of Hartford, West Hartford, CT, USA
- K. L. Macuga
- School of Psychological Science, Oregon State University, Corvallis, OR, USA
- L. Rossi
- Department of Mathematical Sciences, University of Delaware, Newark, DE, USA
- C.-C. Shen
- Department of Computer and Information Sciences, University of Delaware, Newark, DE, USA

44
Ravbar P, Branson K, Simpson JH. An automatic behavior recognition system classifies animal behaviors using movements and their temporal context. J Neurosci Methods 2019; 326:108352. [PMID: 31415845 PMCID: PMC6779137 DOI: 10.1016/j.jneumeth.2019.108352] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Revised: 07/03/2019] [Accepted: 07/07/2019] [Indexed: 12/23/2022]
Abstract
Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong and, as in our experiments, fly grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify organizational principles and temporal structure of such behavior. To cope with large amounts of data, and minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and so perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as eyes or legs and thus present challenges to existing behavior classification software. Human observers use speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted for temporal dynamics and invariant to the animal's position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect two time-scales at which the behavior is structured. As a proof of principle, we show results from quantification and analysis of a large data set of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller dataset of human-annotated ethograms.
While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.
Affiliation(s)
- Primoz Ravbar
- Department of Molecular, Cellular, and Developmental Biology, UC Santa Barbara, Santa Barbara, CA, USA
- Kristin Branson
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, USA
- Julie H Simpson
- Department of Molecular, Cellular, and Developmental Biology, UC Santa Barbara, Santa Barbara, CA, USA

45
Salem G, Krynitsky J, Hayes M, Pohida T, Burgos-Artizzu X. Three-Dimensional Pose Estimation for Laboratory Mouse From Monocular Images. IEEE Trans Image Process 2019; 28:4273-4287. [PMID: 30946667 PMCID: PMC6677238 DOI: 10.1109/tip.2019.2908796] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Video-based activity and behavior analysis of mice has garnered wide attention in biomedical research. Animal facilities hold large numbers of mice housed in "home-cages" densely stored within ventilated racks. Automated analysis of mice activity in their home-cages can provide a new set of sensitive measures for detecting abnormalities and time-resolved deviation from the baseline behavior. Large-scale monitoring in animal facilities requires minimal footprint hardware that integrates seamlessly with the ventilated racks. The compactness of hardware imposes the use of fisheye lenses positioned in close proximity to the cage. In this paper, we propose a systematic approach to accurately estimate the 3D pose of the mouse from single-monocular fisheye-distorted images. Our approach employs a novel adaptation of a structured forest algorithm. We benchmark our algorithm against existing methods. We demonstrate the utility of the pose estimates in predicting mouse behavior in a continuous video.
46
Robinson HA, Pozzo-Miller L. The role of MeCP2 in learning and memory. Learn Mem 2019; 26:343-350. [PMID: 31416907 PMCID: PMC6699413 DOI: 10.1101/lm.048876.118] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2019] [Accepted: 05/21/2019] [Indexed: 01/31/2023]
Abstract
Gene transcription is a crucial step in the sequence of molecular, synaptic, cellular, and systems mechanisms underlying learning and memory. Here, we review the experimental evidence demonstrating that alterations in the levels and functionality of the methylated DNA-binding transcriptional regulator MeCP2 are implicated in the learning and memory deficits present in mouse models of Rett syndrome and MECP2 duplication syndrome. The significant impact that MeCP2 has on gene transcription through a variety of mechanisms, combined with well-defined models of learning and memory, make MeCP2 an excellent candidate to exemplify the role of gene transcription in learning and memory. Together, these studies have strengthened the concept that precise control of activity-dependent gene transcription is a fundamental mechanism that ensures long-term adaptive behaviors necessary for the survival of individuals interacting with their congeners in an ever-changing environment.
Affiliation(s)
- Holly A Robinson
- Department of Neurobiology, The University of Alabama at Birmingham, Birmingham, Alabama 35294, USA
- Lucas Pozzo-Miller
- Department of Neurobiology, The University of Alabama at Birmingham, Birmingham, Alabama 35294, USA

47
Drew PJ, Winder AT, Zhang Q. Twitches, Blinks, and Fidgets: Important Generators of Ongoing Neural Activity. Neuroscientist 2019; 25:298-313. [PMID: 30311838 PMCID: PMC6800083 DOI: 10.1177/1073858418805427] [Citation(s) in RCA: 32] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/13/2022]
Abstract
Animals and humans continuously engage in small, spontaneous motor actions, such as blinking, whisking, and postural adjustments ("fidgeting"). These movements are accompanied by changes in neural activity in sensory and motor regions of the brain. The frequency of these motions varies over time and is affected by sensory stimuli, arousal levels, and pathology. These fidgeting behaviors can be entrained by sensory stimuli. Fidgeting behaviors will cause distributed, bilateral functional activation in the 0.01 to 0.1 Hz frequency range that will show up in functional magnetic resonance imaging and wide-field calcium neuroimaging studies, and will contribute to the observed functional connectivity among brain regions. However, despite the large potential of these behaviors to drive brain-wide activity, these fidget-like behaviors are rarely monitored. We argue that studies of spontaneous and evoked brain dynamics in awake animals and humans should closely monitor these fidgeting behaviors. Differences in these fidgeting behaviors due to arousal or pathology will "contaminate" ongoing neural activity, and lead to apparent differences in functional connectivity. Monitoring and accounting for the brain-wide activations by these behaviors is essential during experiments to differentiate fidget-driven activity from internally driven neural dynamics.
Affiliation(s)
- Patrick J Drew
- Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, PA, USA
- Department of Neurosurgery and Department of Biomedical Engineering, Pennsylvania State University, University Park, PA, USA
- Aaron T Winder
- Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, PA, USA
- Qingguang Zhang
- Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, PA, USA

48
Wu S, Tan KJ, Govindarajan LN, Stewart JC, Gu L, Ho JWH, Katarya M, Wong BH, Tan EK, Li D, Claridge-Chang A, Libedinsky C, Cheng L, Aw SS. Fully automated leg tracking of Drosophila neurodegeneration models reveals distinct conserved movement signatures. PLoS Biol 2019; 17:e3000346. [PMID: 31246996 PMCID: PMC6619818 DOI: 10.1371/journal.pbio.3000346] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2018] [Revised: 07/10/2019] [Accepted: 06/14/2019] [Indexed: 11/19/2022] Open
Abstract
Some neurodegenerative diseases, like Parkinson's disease (PD) and Spinocerebellar ataxia 3 (SCA3), are associated with distinct, altered gait and tremor movements that are reflective of the underlying disease etiology. Drosophila melanogaster models of neurodegeneration have illuminated our understanding of the molecular mechanisms of disease. However, it is unknown whether specific gait and tremor dysfunctions also occur in fly disease mutants. To answer this question, we developed a machine-learning image-analysis program, Feature Learning-based LImb segmentation and Tracking (FLLIT), that automatically tracks leg claw positions of freely moving flies recorded on high-speed video, producing a series of gait measurements. Notably, unlike other machine-learning methods, FLLIT generates its own training sets and does not require user-annotated images for learning. Using FLLIT, we carried out high-throughput and high-resolution analysis of gait and tremor features in Drosophila neurodegeneration mutants for the first time. We found that fly models of PD and SCA3 exhibited markedly different walking gait and tremor signatures, which recapitulated characteristics of the respective human diseases. Selective expression of mutant SCA3 in dopaminergic neurons led to a gait signature that more closely resembled those of PD flies. This suggests that the behavioral phenotype depends on the neurons affected rather than the specific nature of the mutation. Different mutations produced tremors in distinct leg pairs, indicating that different motor circuits were affected. Using this approach, fly models can be used to dissect the neurogenetic mechanisms that underlie movement disorders. This study uses automated leg tracking to characterise gait and tremor features in fruit fly models of Parkinson's disease and spinocerebellar ataxia 3, finding movement features that resemble characteristics of the respective human diseases.
Affiliation(s)
- Shuang Wu
- Bioinformatics Institute, Agency for Science, Technology and Research, Singapore
- Kah Junn Tan
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- James Charles Stewart
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- Duke-NUS Graduate Medical School, Neuroscience and Behavioural Disorders, Singapore
- Lin Gu
- Bioinformatics Institute, Agency for Science, Technology and Research, Singapore
- Joses Wei Hao Ho
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- Duke-NUS Graduate Medical School, Neuroscience and Behavioural Disorders, Singapore
- Malvika Katarya
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- Boon Hui Wong
- National University of Singapore, Department of Biological Sciences, Singapore
- Eng-King Tan
- National Neuroscience Institute, Singapore General Hospital, Singapore
- Daiqin Li
- National University of Singapore, Department of Biological Sciences, Singapore
- Adam Claridge-Chang
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- Duke-NUS Graduate Medical School, Neuroscience and Behavioural Disorders, Singapore
- Camilo Libedinsky
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- Singapore Institute for Neurotechnology (SiNAPSE), Singapore
- National University of Singapore, Department of Psychology, Singapore
- Li Cheng
- Bioinformatics Institute, Agency for Science, Technology and Research, Singapore
- Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Alberta, Canada
- * E-mail: (SA); (CL)
- Sherry Shiying Aw
- Institute of Molecular and Cell Biology, Agency for Science, Technology and Research, Singapore
- * E-mail: (SA); (CL)

49
Soto AP, Po T, McHenry MJ. Multichannel stroboscopic videography (MSV): a technique for visualizing multiple channels for behavioral measurements. J Exp Biol 2019; 222:jeb201749. [PMID: 31085596 DOI: 10.1242/jeb.201749] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2019] [Accepted: 05/06/2019] [Indexed: 11/20/2022]
Abstract
Biologists commonly visualize different features of an organism using distinct sources of illumination. Such multichannel imaging has largely not been applied to behavioral studies because of the challenges posed by a moving subject. We address this challenge with the technique of multichannel stroboscopic videography (MSV), which synchronizes multiple strobe lights with video exposures of a single camera. We illustrate the utility of this approach with kinematic measurements of a walking cockroach (Gromphadorhina portentosa) and calculations of the pressure field around a swimming fish (Danio rerio). In both, transmitted illumination generated high-contrast images of the animal's body in one channel. Other sources of illumination were used to visualize the points of contact for the feet of the cockroach and the water flow around the fish in separate channels. MSV provides an enhanced potential for high-throughput experimentation and the capacity to integrate changes in physiological or environmental conditions in freely behaving animals.
Affiliation(s)
- Alberto P Soto
- Department of Ecology and Evolutionary Biology, University of California, Irvine, 321 Steinhaus Hall, Irvine, CA 92697, USA
- Theodora Po
- Department of Ecology and Evolutionary Biology, University of California, Irvine, 321 Steinhaus Hall, Irvine, CA 92697, USA
- Matthew J McHenry
- Department of Ecology and Evolutionary Biology, University of California, Irvine, 321 Steinhaus Hall, Irvine, CA 92697, USA
|
50
|
Quantifying the social symptoms of autism using motion capture. Sci Rep 2019; 9:7712. [PMID: 31118483 PMCID: PMC6531432 DOI: 10.1038/s41598-019-44180-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2019] [Accepted: 05/07/2019] [Indexed: 11/24/2022] Open
Abstract
Autism Spectrum Disorder (ASD) is a remarkably heterogeneous condition in which individuals exhibit a variety of symptoms at different levels of severity. Quantifying the severity of specific symptoms is difficult, because it requires either long assessments or observations of the ASD individual or reliance on caregiver questionnaires, which can be subjective. Here we present a new technique for objectively quantifying the severity of several core social ASD symptoms using a motion capture system installed in a clinical exam room. We present several measures of child-clinician interaction, including the distance between them, the proportion of time that the child approached or avoided the clinician, and the direction that the child faced relative to the clinician. Together, these measures explained ~30% of the variance in ADOS scores when using only ~5-minute segments of “free play” from the recorded ADOS assessments. These results demonstrate the utility of motion capture for aiding researchers and clinicians in the assessment of ASD social symptoms. Further development of this technology, and of appropriate motion capture measures for use in kindergartens and at home, is likely to yield valuable information that will aid in quantifying the initial severity of core ASD symptoms and their change over time.
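The interaction measures listed above (inter-person distance, approach/avoid proportion, facing direction) can be computed directly from per-frame tracked positions. The sketch below is illustrative only and assumes 2D coordinates and a known child heading angle; it is not the authors' implementation, and all names are hypothetical.

```python
import math

# Hypothetical sketch of the child-clinician interaction measures described
# above, computed from per-frame 2D positions and a heading angle (radians).

def interaction_measures(child_xy, clinician_xy, child_heading):
    """child_xy, clinician_xy: lists of (x, y); child_heading: radians per frame."""
    dists = [math.dist(c, k) for c, k in zip(child_xy, clinician_xy)]
    # Approach = fraction of frame-to-frame steps where distance decreased.
    deltas = [b - a for a, b in zip(dists, dists[1:])]
    approach = sum(d < 0 for d in deltas) / max(len(deltas), 1)
    # Facing: absolute angular offset between the child's heading and the
    # bearing from child to clinician, wrapped into [0, pi].
    offsets = []
    for (cx, cy), (kx, ky), h in zip(child_xy, clinician_xy, child_heading):
        bearing = math.atan2(ky - cy, kx - cx)
        offsets.append(abs((bearing - h + math.pi) % (2 * math.pi) - math.pi))
    return {
        "mean_distance": sum(dists) / len(dists),
        "approach_fraction": approach,
        "mean_facing_offset": sum(offsets) / len(offsets),
    }
```

Summary statistics of this kind are what would then be regressed against ADOS scores.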
|