1
Liu XH, Gan L, Zhang ZT, Yu PK, Dai J. Probing the processing of facial expressions in monkeys via time perception and eye tracking. Zool Res 2023; 44:882-893. PMID: 37545418; PMCID: PMC10559096; DOI: 10.24272/j.issn.2095-8137.2023.003
Abstract
Accurately recognizing facial expressions is essential for effective social interactions. Non-human primates (NHPs) are widely used in the study of the neural mechanisms underpinning facial expression processing, yet it remains unclear how well monkeys can recognize the facial expressions of other species such as humans. In this study, we systematically investigated how monkeys process the facial expressions of conspecifics and humans using eye-tracking technology and two behavioral tasks, the temporal discrimination task (TDT) and the face scan task (FST). We found that monkeys showed prolonged subjective time perception in response to negative facial expressions on monkey faces, but longer reaction times in response to negative facial expressions on human faces. Monkey faces also reliably induced divergent pupil contraction across expressions, whereas human faces and scrambled monkey faces did not. Furthermore, viewing patterns in the FST indicated that monkeys showed a bias toward emotional expressions only when observing monkey faces. Finally, masking the eye region marginally decreased the viewing duration for monkey faces but not for human faces. By probing facial expression processing in monkeys, our study demonstrates that monkeys are more sensitive to the facial expressions of conspecifics than to those of humans, shedding new light on inter-species communication through facial expressions between NHPs and humans.
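(A note on method, as a rough sketch: in temporal-discrimination designs of this kind, subjective time dilation is typically quantified by fitting a psychometric function to the proportion of "long" responses and locating the bisection point, or point of subjective equality. The snippet below is my generic illustration of that analysis, not the authors' code; the probe durations and response proportions are placeholder values.)

```python
# Minimal sketch (illustrative, not from the paper): fit a logistic
# psychometric function to temporal-discrimination responses and extract
# the bisection point (PSE). Data below are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, pse, slope):
    """P('long' response) as a function of probe duration t (ms)."""
    return 1.0 / (1.0 + np.exp(-(t - pse) / slope))

durations = np.array([200, 300, 400, 500, 600, 700, 800])       # probe durations, ms
p_long = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])   # observed P('long')

(pse, slope), _ = curve_fit(logistic, durations, p_long, p0=[500.0, 100.0])
print(f"Bisection point (PSE): {pse:.0f} ms, slope: {slope:.0f} ms")
```

A leftward shift of the PSE in an emotional-face condition relative to a neutral condition would correspond to the prolonged subjective duration reported above.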
Affiliation(s)
- Xin-He Liu
- Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- CAS Key Laboratory of Brain Connectome and Manipulation, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Lu Gan
- Research Center for Medical Artificial Intelligence, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Zhi-Ting Zhang
- Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- CAS Key Laboratory of Brain Connectome and Manipulation, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Pan-Ke Yu
- Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Ji Dai
- Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- CAS Key Laboratory of Brain Connectome and Manipulation, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- University of Chinese Academy of Sciences, Beijing 100049, China
2
Noritake A, Ninomiya T, Kobayashi K, Isoda M. Chemogenetic dissection of a prefrontal-hypothalamic circuit for socially subjective reward valuation in macaques. Nat Commun 2023; 14:4372. PMID: 37474519; PMCID: PMC10359292; DOI: 10.1038/s41467-023-40143-x
Abstract
The value of one's own reward is affected by the rewards of others, serving as a source of envy. However, it is not known which neural circuits mediate such socially subjective value modulation. Here, we chemogenetically dissected the circuit from the medial prefrontal cortex (MPFC) to the lateral hypothalamus (LH) while male macaques were presented with visual stimuli that concurrently signaled the prospects of their own and others' rewards. We found that functional disconnection of the MPFC and LH rendered animals significantly less susceptible to others', but not their own, reward prospects. In parallel with this behavioral change, inter-areal coordination, as indexed by coherence and Granger causality, decreased primarily in the delta and theta bands. These findings demonstrate that the MPFC-to-LH circuit plays a crucial role in carrying information about others' upcoming rewards for subjective reward valuation in social contexts.
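(As a rough sketch of the inter-areal coordination measure mentioned above: magnitude-squared coherence between two simultaneously recorded field-potential channels, averaged within the delta and theta bands. This is my generic illustration, not the authors' analysis pipeline; the signals below are synthetic placeholders standing in for MPFC and LH recordings.)

```python
# Minimal sketch (illustrative, not from the paper): band-averaged coherence
# between two field-potential channels using Welch's method.
import numpy as np
from scipy.signal import coherence

fs = 1000                              # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)           # 10 s of data
shared = np.sin(2 * np.pi * 5 * t)     # shared 5 Hz (theta-band) component
mpfc = shared + 0.5 * np.random.randn(t.size)  # placeholder "MPFC" channel
lh = shared + 0.5 * np.random.randn(t.size)    # placeholder "LH" channel

f, cxy = coherence(mpfc, lh, fs=fs, nperseg=2048)

# Average coherence within conventional delta (1-4 Hz) and theta (4-8 Hz) bands.
for name, lo, hi in [("delta", 1, 4), ("theta", 4, 8)]:
    band = (f >= lo) & (f < hi)
    print(f"{name} coherence: {cxy[band].mean():.2f}")
```

A drop in these band-averaged values after disconnection would parallel the decrease in delta/theta coordination the abstract reports.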
Affiliation(s)
- Atsushi Noritake
- Division of Behavioral Development, Department of System Neuroscience, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
- Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
- Taihei Ninomiya
- Division of Behavioral Development, Department of System Neuroscience, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
- Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
- Kenta Kobayashi
- Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
- Section of Viral Vector Development, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
- Masaki Isoda
- Division of Behavioral Development, Department of System Neuroscience, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
- Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
3
Going Deeper than Tracking: A Survey of Computer-Vision Based Recognition of Animal Pain and Emotions. Int J Comput Vis 2022. DOI: 10.1007/s11263-022-01716-3
Abstract
Advances in animal motion tracking and pose recognition have been a game changer in the study of animal behavior. Recently, an increasing number of works have gone 'deeper' than tracking, addressing the automated recognition of animals' internal states, such as emotions and pain, with the aim of improving animal welfare, making this a timely moment for a systematization of the field. This paper provides a comprehensive survey of computer-vision-based research on the recognition of pain and emotional states in animals, addressing both facial and bodily behavior analysis. We summarize the efforts presented so far within this topic, classifying them across different dimensions, highlight challenges and research gaps, provide best-practice recommendations for advancing the field, and outline future directions for research.
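(To make the "deeper than tracking" idea concrete, here is a minimal, hypothetical sketch of the kind of frame-level affective-state classifier such works build: a pretrained vision backbone with a small classification head. It is my illustration, not code from any surveyed paper; the class labels and the dummy batch are placeholders.)

```python
# Minimal sketch (illustrative only): fine-tuning a pretrained backbone to
# classify animal facial-expression frames into affective states.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # hypothetical labels, e.g. "neutral", "pain", "distress"

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CLASSES)  # new head

optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One training step on a dummy batch; in practice, frames would come from a
# labeled video dataset of the target species.
frames = torch.randn(8, 3, 224, 224)           # batch of cropped face frames
labels = torch.randint(0, NUM_CLASSES, (8,))   # per-frame expression labels

optimizer.zero_grad()
logits = backbone(frames)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```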
4
Correia-Caeiro C, Burrows A, Wilson DA, Abdelrahman A, Miyabe-Nishiwaki T. CalliFACS: The common marmoset Facial Action Coding System. PLoS One 2022; 17:e0266442. PMID: 35580128; PMCID: PMC9113598; DOI: 10.1371/journal.pone.0266442
Abstract
Facial expressions are subtle cues, central to communication and the conveyance of emotions in mammals. Traditionally, facial expressions have been classified as a whole (e.g., happy, angry, bared-teeth) because of automatic face processing in the human brain: humans categorise emotions globally and are typically unaware of subtle or isolated cues such as an eyebrow raise. Moreover, the same facial configuration (e.g., lip corners pulled backwards, exposing teeth) can convey widely different information depending on the species (e.g., humans: happiness; chimpanzees: fear). The Facial Action Coding System (FACS) is considered the gold standard for investigating human facial behaviour; it avoids subjective interpretations of meaning by objectively measuring independent movements linked to facial muscles, called Action Units (AUs). Following a similar methodology, we developed CalliFACS for the common marmoset. First, we determined the facial muscular plan of the common marmoset by examining dissections from the literature. Second, we recorded common marmosets in a variety of contexts (e.g., grooming, feeding, play, human interaction, veterinary procedures) and selected clips from online databases (e.g., YouTube) to identify their facial movements. Individual facial movements were classified according to the appearance changes produced by the corresponding underlying musculature. A diverse repertoire of 33 facial movements was identified in the common marmoset (15 Action Units, 15 Action Descriptors, and 3 Ear Action Descriptors). Although we observed a reduced range of facial movement compared to HumanFACS, the common marmoset's range of facial movements was larger than predicted from its socio-ecology and facial morphology, indicating the importance of these movements for social interactions. CalliFACS is a scientific tool for measuring facial movements and thus allows us to better understand the common marmoset's expressions and communication. As common marmosets have become increasingly popular laboratory animal models, from neuroscience to cognition, CalliFACS can serve as an important tool for evaluating their welfare, particularly in captivity.
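(As a small illustration of how FACS-style codings are handled downstream, the hypothetical sketch below represents per-clip codings and tallies how often each Action Unit occurs across clips. The AU codes and clip names are placeholders of my own, not CalliFACS data.)

```python
# Minimal sketch (illustrative only): tally Action Unit frequencies across
# a set of FACS-coded video clips.
from collections import Counter

# Each clip is coded as the list of facial movements observed in it
# (hypothetical AU / Ear Action Descriptor codes).
codings = {
    "clip_grooming_01": ["AU1", "AU12", "EAD1"],
    "clip_feeding_02":  ["AU12", "AU25"],
    "clip_play_03":     ["AU1", "AU12", "AU25", "EAD1"],
}

au_counts = Counter(au for aus in codings.values() for au in aus)
for au, n in au_counts.most_common():
    print(f"{au}: observed in {n} of {len(codings)} clips")
```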
Affiliation(s)
- Anne Burrows
- Department of Physical Therapy, Duquesne University, Pittsburgh, Pennsylvania, United States of America
- Department of Anthropology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Duncan Andrew Wilson
- Primate Research Institute, Kyoto University, Inuyama, Japan
- Graduate School of Letters, Kyoto University, Kyoto, Japan
- Abdelhady Abdelrahman
- School of Health and Life Sciences, Glasgow Caledonian University, Glasgow, United Kingdom