1
Ebbesen CL, Froemke RC. Body language signals for rodent social communication. Curr Opin Neurobiol 2021; 68:91-106. PMID: 33582455; PMCID: PMC8243782; DOI: 10.1016/j.conb.2021.01.008.
Abstract
Integration of social cues to initiate adaptive emotional and behavioral responses is a fundamental aspect of animal and human behavior. In humans, social communication includes prominent nonverbal components, such as social touch, gestures and facial expressions. Comparative studies investigating the neural basis of social communication in rodents have historically centered on olfactory signals and vocalizations, with relatively little focus on nonverbal social cues. Here, we outline two exciting research directions: first, we review recent observations pointing to a role of social facial expressions in rodents; second, we review observations that point to a role of 'non-canonical' rodent body language: body posture signals beyond stereotyped displays in aggressive and sexual behavior. In both sections, we outline how social neuroscience can build on recent advances in machine learning, robotics and micro-engineering to push these research directions forward, towards a holistic systems neurobiology of rodent body language.
Affiliation(s)
- Christian L Ebbesen
- Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA; Center for Neural Science, New York University, New York, NY, 10003, USA.
- Robert C Froemke
- Skirball Institute of Biomolecular Medicine, Neuroscience Institute, Departments of Otolaryngology, Neuroscience and Physiology, New York University School of Medicine, New York, NY, 10016, USA; Center for Neural Science, New York University, New York, NY, 10003, USA; Howard Hughes Medical Institute Faculty Scholar, USA.
2
Sehara K, Bahr V, Mitchinson B, Pearson MJ, Larkum ME, Sachdev RNS. Fast, Flexible Closed-Loop Feedback: Tracking Movement in "Real-Millisecond-Time". eNeuro 2019; 6:ENEURO.0147-19.2019. PMID: 31611334; PMCID: PMC6825957; DOI: 10.1523/eneuro.0147-19.2019.
Abstract
One of the principal functions of the brain is to control movement and rapidly adapt behavior to a changing external environment. Over the last decades, our ability to monitor and manipulate activity in the brain, while also manipulating the environment the animal moves through, has grown increasingly sophisticated. However, our ability to track the movement of the animal in real time has not kept pace. Here, we use a dynamic vision sensor (DVS)-based, event-driven neuromorphic camera system to implement real-time, low-latency tracking of a single whisker that mice can move at ∼25 Hz. The customized DVS system described here converts whisker motion into a series of events that can be used to estimate the position of the whisker and to trigger a position-based output interactively within 2 ms. This neuromorphic chip-based closed-loop system provides feedback rapidly and flexibly. With this system, it becomes possible to use the movement of whiskers, or in principle the movement of any part of the body, to deliver reward or punishment in a rapidly reconfigurable way. These methods can be used to manipulate behavior and the neural circuits that help animals adapt to changing values of a sequence of motor actions.
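The closed-loop logic summarized above (integrate a short window of camera events, estimate position, trigger on a threshold crossing) can be sketched as follows. The class name, window size, and threshold here are illustrative assumptions for exposition, not the authors' implementation:

```python
from collections import deque

class EventTracker:
    """Minimal sketch of event-driven position tracking with a
    position-based trigger. Each DVS event carries a timestamp
    (microseconds) and a pixel coordinate; the estimate is the
    median coordinate over a short sliding window."""

    def __init__(self, window_us=2000, threshold=100.0):
        self.window_us = window_us   # ~2 ms integration window
        self.threshold = threshold   # trigger position (pixels)
        self.events = deque()        # (t_us, x) pairs, oldest first

    def feed(self, t_us, x):
        """Ingest one event; return True if the current position
        estimate exceeds the trigger threshold."""
        self.events.append((t_us, x))
        # Drop events older than the window to keep latency bounded.
        while self.events and t_us - self.events[0][0] > self.window_us:
            self.events.popleft()
        xs = sorted(px for _, px in self.events)
        estimate = xs[len(xs) // 2]  # median is robust to noise events
        return estimate > self.threshold
```

Because each event updates the estimate incrementally, the trigger decision is available per event rather than per frame, which is what makes millisecond-scale feedback feasible in principle.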
Affiliation(s)
- Keisuke Sehara
- Institute of Biology, Humboldt University of Berlin, D-10117 Berlin, Germany
- Ben Mitchinson
- Department of Computer Science, University of Sheffield, Sheffield, S10 2TP United Kingdom
- Martin J Pearson
- Bristol Robotics Laboratory, University of Bristol and University of the West of England, Bristol, BS16 1QY United Kingdom
- Matthew E Larkum
- Institute of Biology, Humboldt University of Berlin, D-10117 Berlin, Germany
- Robert N S Sachdev
- Institute of Biology, Humboldt University of Berlin, D-10117 Berlin, Germany
3
More than Just a "Motor": Recent Surprises from the Frontal Cortex. J Neurosci 2019; 38:9402-9413. PMID: 30381432; DOI: 10.1523/jneurosci.1671-18.2018.
Abstract
Motor and premotor cortices are crucial for the control of movements. However, we still know little about how these areas contribute to higher-order motor control, such as deciding which movements to make and when to make them. Here we focus on rodent studies and review recent findings, which suggest that, in addition to motor control, neurons in motor cortices play a role in sensory integration, behavioral strategizing, working memory, and decision-making. We suggest that these seemingly disparate functions may subserve an evolutionarily conserved role in sensorimotor cognition and that further study of rodent motor cortices could make a major contribution to our understanding of the evolution and function of the mammalian frontal cortex.
4
Vanzella W, Grion N, Bertolini D, Perissinotto A, Gigante M, Zoccolan D. A passive, camera-based head-tracking system for real-time, three-dimensional estimation of head position and orientation in rodents. J Neurophysiol 2019; 122:2220-2242. PMID: 31553687; DOI: 10.1152/jn.00301.2019.
Abstract
Tracking head position and orientation in small mammals is crucial for many applications in the field of behavioral neurophysiology, from the study of spatial navigation to the investigation of active sensing and perceptual representations. Many approaches to head tracking exist, but most of them only estimate the 2D coordinates of the head over the plane where the animal navigates. Full reconstruction of the pose of the head in 3D is much more challenging and has been achieved only in a handful of studies, which employed headsets made of multiple LEDs or inertial units. However, these assemblies are rather bulky and need to be powered to operate, which prevents their application in wireless experiments and in the small enclosures often used in perceptual studies. Here we propose an alternative approach, based on passively imaging a lightweight, compact, 3D structure painted with a pattern of black dots over a white background. By applying a cascade of feature extraction algorithms that progressively refine the detection of the dots and reconstruct their geometry, we developed a tracking method that is highly precise and accurate, as assessed through a battery of validation measurements. We show that this method can be used to study how a rat samples sensory stimuli during a perceptual discrimination task and how a hippocampal place cell represents head position over extremely small spatial scales. Given its minimal encumbrance and wireless nature, our method could be ideal for high-throughput applications, where tens of animals need to be simultaneously and continuously tracked.
NEW & NOTEWORTHY Head tracking is crucial in many behavioral neurophysiology studies. Yet reconstruction of the head's pose in 3D is challenging and typically requires implanting bulky, electrically powered headsets that prevent wireless experiments and are hard to employ in operant boxes. Here we propose an alternative approach, based on passively imaging a compact, 3D dot pattern that, once implanted over the head of a rodent, allows estimating the pose of its head with high precision and accuracy.
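The final geometric step in any dot-pattern approach, recovering head pose once the dots have been detected and their 3D positions reconstructed, amounts to finding the rigid transform that maps the known pattern onto the observed points. A minimal sketch using the standard Kabsch algorithm (this assumes triangulated 3D dot coordinates are already available; it is a generic illustration, not the authors' processing cascade):

```python
import numpy as np

def head_pose_from_dots(model_pts, observed_pts):
    """Recover the rotation R and translation t that map the known dot
    pattern (model_pts, N x 3) onto the reconstructed dot positions
    (observed_pts, N x 3), via the Kabsch algorithm: center both point
    sets, take the SVD of their cross-covariance, and compose the
    optimal rotation from the singular vectors."""
    P = model_pts - model_pts.mean(axis=0)
    Q = observed_pts - observed_pts.mean(axis=0)
    H = P.T @ Q                          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against reflections
    R = Vt.T @ D @ U.T
    t = observed_pts.mean(axis=0) - R @ model_pts.mean(axis=0)
    return R, t
```

The rotation matrix directly yields head orientation (e.g., as Euler angles), while the translation gives head position; using more than the minimal three dots makes the least-squares fit robust to detection noise on individual dots.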
Affiliation(s)
- Walter Vanzella
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy; Glance Vision Technologies, Trieste, Italy
- Natalia Grion
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy
- Daniele Bertolini
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy
- Andrea Perissinotto
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy; Glance Vision Technologies, Trieste, Italy
- Marco Gigante
- Mechatronics Lab, International School for Advanced Studies (SISSA), Trieste, Italy
- Davide Zoccolan
- Visual Neuroscience Laboratory, International School for Advanced Studies (SISSA), Trieste, Italy
5
Adibi M. Whisker-Mediated Touch System in Rodents: From Neuron to Behavior. Front Syst Neurosci 2019; 13:40. PMID: 31496942; PMCID: PMC6712080; DOI: 10.3389/fnsys.2019.00040.
Abstract
A key question in systems neuroscience is to identify how sensory stimuli are represented in neuronal activity, and how the activity of sensory neurons is in turn “read out” by downstream neurons to give rise to behavior. The choice of a proper model system to address these questions is therefore a crucial step. Over the past decade, the increasingly powerful array of experimental approaches that has become available in non-primate models (e.g., optogenetics and two-photon imaging) has spurred a renewed interest in the use of rodent models in systems neuroscience research. Here, I introduce the rodent whisker-mediated touch system as a structurally well-established and well-organized model system which, despite its simplicity, gives rise to complex behaviors. This system is behaviorally efficient: as nocturnal animals, rodents rely on their whisker-mediated touch system, along with olfaction, to collect information about their surrounding environment. Moreover, this system represents a well-studied circuitry with a somatotopic organization. At every stage of processing, one can identify anatomical and functional topographic maps of the whiskers: “barrelettes” in the brainstem nuclei, “barreloids” in the sensory thalamus, and “barrels” in the cortex. This article provides a brief review of the basic anatomy and function of the whisker system in rodents.
Affiliation(s)
- Mehdi Adibi
- School of Psychology, University of New South Wales, Sydney, NSW, Australia; Tactile Perception and Learning Lab, International School for Advanced Studies (SISSA), Trieste, Italy; Padua Neuroscience Center, University of Padua, Padua, Italy
6
Rigosa J, Lucantonio A, Noselli G, Fassihi A, Zorzin E, Manzino F, Pulecchi F, Diamond ME. A Fluorescent Dye Method Suitable for Visualization of One or More Rat Whiskers. Bio Protoc 2018; 8:e2749. PMID: 34179276; DOI: 10.21769/bioprotoc.2749.
Abstract
Visualization and tracking of the facial whiskers is critical to many studies of rodent behavior. High-speed videography is the most robust methodology for characterizing whisker kinematics, but whisker visualization is challenging due to the low contrast of the whisker against its background. Recently, we showed that fluorescent dye(s) can be applied to enhance visualization and tracking of whisker(s) (Rigosa et al., 2017), and this protocol provides additional details on the technique.
Affiliation(s)
- Jacopo Rigosa
- International School for Advanced Studies, Trieste, Italy
- Arash Fassihi
- International School for Advanced Studies, Trieste, Italy
- Erik Zorzin
- International School for Advanced Studies, Trieste, Italy
7
Abstract
A fundamental question in the investigation of any sensory system is what physical signals drive its sensory neurons during natural behavior. Surprisingly, in the whisker system, it is only recently that answers to this question have emerged. Here, we review the key developments, focussing mainly on the first stage of the ascending pathway: the primary whisker afferents (PWAs). We first consider a biomechanical framework, which describes the fundamental mechanical forces acting on the whiskers during active sensation. We then discuss technical progress that has allowed such mechanical variables to be estimated in awake, behaving animals. We discuss past electrophysiological evidence concerning how PWAs function and reinterpret it within the biomechanical framework. Finally, we consider recent studies of PWAs in awake, behaving animals and compare the results to related studies of the cortex. We argue that understanding 'what the whiskers tell the brain' sheds valuable light on the computational functions of downstream neural circuits, in particular, the barrel cortex.
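A core quantity in the quasi-static biomechanical framework referenced above is the bending moment at the whisker base, which Euler-Bernoulli beam theory relates to the contact-induced change in curvature there: M = E·I·Δκ, with second moment of area I = πr⁴/4 for a circular cross-section. A minimal sketch (the numbers in the test are illustrative values, not measurements from this review):

```python
import math

def base_moment(youngs_modulus, base_radius, delta_kappa):
    """Quasi-static bending moment at the whisker base.

    Euler-Bernoulli beam theory gives M = E * I * delta_kappa, where
    I = pi * r**4 / 4 is the second moment of area of the (roughly)
    circular whisker cross-section, E is Young's modulus, and
    delta_kappa is the contact-induced change in curvature (1/m)
    measured at the base.
    """
    second_moment = math.pi * base_radius ** 4 / 4.0
    return youngs_modulus * second_moment * delta_kappa
```

Because I scales with the fourth power of the radius, the moment signal is dominated by the thick proximal shaft, which is one reason base curvature change is the practical variable to extract from whisker video.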
8
Rigosa J, Lucantonio A, Noselli G, Fassihi A, Zorzin E, Manzino F, Pulecchi F, Diamond ME. Dye-enhanced visualization of rat whiskers for behavioral studies. eLife 2017; 6:e25290. PMID: 28613155; PMCID: PMC5511012; DOI: 10.7554/elife.25290.
Abstract
Visualization and tracking of the facial whiskers is required in an increasing number of rodent studies. Although many approaches have been employed, only high-speed videography has proven adequate for measuring whisker motion and deformation during interaction with an object. However, whisker visualization and tracking is challenging for multiple reasons, primary among them the low contrast of the whisker against its background. Here, we demonstrate a fluorescent dye method suitable for visualization of one or more rat whiskers. The process makes the dyed whisker(s) easily visible against a dark background. The coloring does not influence the behavioral performance of rats trained on a vibrissal vibrotactile discrimination task, nor does it affect the whiskers' mechanical properties.
Affiliation(s)
- Jacopo Rigosa
- International School for Advanced Studies, Trieste, Italy
- Arash Fassihi
- International School for Advanced Studies, Trieste, Italy
- Erik Zorzin
- International School for Advanced Studies, Trieste, Italy