1
Carugati F, Gorio DC, De Gregorio C, Valente D, Ferrario V, Lefaux B, Friard O, Gamba M. Quantifying Facial Gestures Using Deep Learning in a New World Monkey. Am J Primatol 2025; 87:e70013. PMID: 40019116; PMCID: PMC11869534; DOI: 10.1002/ajp.70013.
Abstract
Facial gestures are a crucial component of primate multimodal communication. However, current methodologies for extracting facial data from video recordings are labor-intensive and prone to human subjectivity. Although automatic tools for this task are still in their infancy, deep learning techniques are revolutionizing animal behavior research. This study explores the distinctiveness of facial gestures in cotton-top tamarins, quantified using markerless pose estimation algorithms. From footage of captive individuals, we extracted and manually labeled frames to develop a model that can recognize a custom set of landmarks positioned on the face of the target species. The trained model predicted landmark positions and subsequently transformed them into distance matrices representing the landmarks' spatial distributions within each frame. We employed three competitive machine learning classifiers to assess the ability to automatically discriminate facial configurations that co-occur with vocal emissions and are associated with different behavioral contexts. Initial analysis showed correct classification rates exceeding 80%, suggesting that voiced facial configurations are highly distinct from unvoiced ones. Our findings also demonstrated varying context specificity of facial gestures, with the highest classification accuracy observed during yawning, social activity, and resting. This study highlights the potential of markerless pose estimation for advancing the study of primate multimodal communication, even in challenging species such as cotton-top tamarins. The ability to automatically distinguish facial gestures in different behavioral contexts represents a critical step in developing automated tools for extracting behavioral cues from raw video data.
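The pipeline this abstract describes (landmark coordinates per frame, converted into pairwise-distance features, fed to a supervised classifier) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the landmark count, the nearest-centroid classifier standing in for their three classifiers, and the synthetic "voiced vs. unvoiced" templates are all assumptions.

```python
import numpy as np

def pairwise_distances(landmarks):
    """Upper-triangle pairwise distances between 2-D facial landmarks.

    landmarks: (n_points, 2) array of (x, y) coordinates for one frame.
    Returns a feature vector of length n_points * (n_points - 1) / 2.
    """
    diff = landmarks[:, None, :] - landmarks[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(landmarks), k=1)
    return dist[iu]

def nearest_centroid_fit(X, y):
    """Per-class mean feature vectors (a simple stand-in for the paper's classifiers)."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def nearest_centroid_predict(X, classes, centroids):
    # Assign each frame to the class whose centroid is closest in feature space.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

# Toy data: two hypothetical facial configurations (e.g. voiced vs. unvoiced),
# each a jittered copy of a 10-landmark template.
rng = np.random.default_rng(0)
base = rng.uniform(0, 1, size=(10, 2))
open_mouth = base.copy()
open_mouth[5:, 1] += 0.5  # displace the "lower-face" landmarks

X = np.stack([pairwise_distances(t + rng.normal(0, 0.01, t.shape))
              for t in [base] * 30 + [open_mouth] * 30])
y = np.array([0] * 30 + [1] * 30)

classes, centroids = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(X, classes, centroids)
accuracy = (pred == y).mean()
```

One advantage of distance-matrix features, as opposed to raw coordinates, is that they are invariant to translation and rotation of the face within the frame.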
Affiliation(s)
- Filippo Carugati
  - Department of Life Sciences and Systems Biology, Università di Torino, Torino, Italy
- Chiara De Gregorio
  - Department of Life Sciences and Systems Biology, Università di Torino, Torino, Italy
  - Department of Psychology, University of Warwick, Coventry, UK
- Daria Valente
  - Department of Life Sciences and Systems Biology, Università di Torino, Torino, Italy
  - Parco Natura Viva Garda Zoological Park, Bussolengo, Italy
- Valeria Ferrario
  - Department of Life Sciences and Systems Biology, Università di Torino, Torino, Italy
  - Chester Zoo, Chester, UK
- Olivier Friard
  - Department of Life Sciences and Systems Biology, Università di Torino, Torino, Italy
- Marco Gamba
  - Department of Life Sciences and Systems Biology, Università di Torino, Torino, Italy
2
Correia-Caeiro C, Costa R, Hayashi M, Burrows A, Pater J, Miyabe-Nishiwaki T, Richardson JL, Robbins MM, Waller B, Liebal K. GorillaFACS: The Facial Action Coding System for the Gorilla spp. PLoS One 2025; 20:e0308790. PMID: 39874277; PMCID: PMC11774405; DOI: 10.1371/journal.pone.0308790.
Abstract
The Facial Action Coding System (FACS) is an objective observation tool for measuring human facial behaviour. It avoids subjective attributions of meaning by objectively measuring independent movements linked to facial muscles, called Action Units (AUs). FACS has been adapted to 11 other taxa, including most apes, macaques and domestic animals, but not yet gorillas. To carry out cross-species studies of facial expressions within and beyond the apes, gorillas need to be included. Hence, we developed GorillaFACS for the Gorilla spp. We followed a methodology similar to previous FACS adaptations: first, we examined the facial muscular plan of the gorilla; second, we analysed gorilla videos in a wide variety of contexts to identify their spontaneous facial movements; third, we classified the individual facial movements according to the appearance changes produced by the corresponding underlying musculature. A diverse repertoire of 42 facial movements was identified in the gorilla, including 28 AUs and 14 Action Descriptors, with several new movements not identified in the HumanFACS. Although some movements in gorillas differ from those of humans, the total number of AUs is comparable to the HumanFACS (32 AUs). Importantly, the gorilla's range of facial movements was larger than expected, suggesting a more relevant role in social interactions than previously assumed. GorillaFACS is a scientific tool to measure facial movements and will thus allow us to better understand the gorilla's expressions and communication. Furthermore, GorillaFACS has the potential to be used as an important tool to evaluate this species' welfare, particularly in settings of close proximity to humans.
Affiliation(s)
- Catia Correia-Caeiro
  - Human Biology & Primate Cognition Department, Institute of Biology, Leipzig University, Leipzig, Germany
  - Comparative Cultural Psychology, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Raquel Costa
  - Research Department, Japan Monkey Center, Inuyama, Japan
  - Primate Cognition Research Group, Lisbon, Portugal
  - Mulheres pela Primatologia, Florianópolis, Brazil
- Misato Hayashi
  - Research Department, Japan Monkey Center, Inuyama, Japan
- Anne Burrows
  - Department of Physical Therapy, Duquesne University, Pittsburgh, PA, United States of America
  - Department of Anthropology, University of Pittsburgh, Pittsburgh, PA, United States of America
- Jordan Pater
  - Department of Physical Therapy, Duquesne University, Pittsburgh, PA, United States of America
- Takako Miyabe-Nishiwaki
  - Center for the Evolutionary Origins of Human Behavior (EHuB), Kyoto University, Inuyama, Japan
- Jack L. Richardson
  - Center for the Advanced Study of Human Paleobiology, Department of Anthropology, The George Washington University, Washington, DC, United States of America
- Martha M. Robbins
  - Department of Primate Behavior and Evolution, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Bridget Waller
  - Department of Psychology, Nottingham Trent University, Nottingham, United Kingdom
- Katja Liebal
  - Human Biology & Primate Cognition Department, Institute of Biology, Leipzig University, Leipzig, Germany
  - Comparative Cultural Psychology, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
3
Liu XH, Gan L, Zhang ZT, Yu PK, Dai J. Probing the processing of facial expressions in monkeys via time perception and eye tracking. Zool Res 2023; 44:882-893. PMID: 37545418; PMCID: PMC10559096; DOI: 10.24272/j.issn.2095-8137.2023.003.
Abstract
Accurately recognizing facial expressions is essential for effective social interactions. Non-human primates (NHPs) are widely used in the study of the neural mechanisms underpinning facial expression processing, yet it remains unclear how well monkeys can recognize the facial expressions of other species such as humans. In this study, we systematically investigated how monkeys process the facial expressions of conspecifics and humans using eye-tracking technology and two behavioral tasks: the temporal discrimination task (TDT) and the face scan task (FST). We found that monkeys showed prolonged subjective time perception in response to negative facial expressions of conspecifics, while showing longer reaction times to negative facial expressions of humans. Monkey faces also reliably induced divergent pupil contraction in response to different expressions, whereas human faces and scrambled monkey faces did not. Furthermore, viewing patterns in the FST indicated that monkeys showed a bias toward emotional expressions only when observing monkey faces. Finally, masking the eye region marginally decreased the viewing duration for monkey faces but not for human faces. By probing facial expression processing in monkeys, our study demonstrates that monkeys are more sensitive to the facial expressions of conspecifics than to those of humans, shedding new light on inter-species communication between NHPs and humans through facial expressions.
Affiliation(s)
- Xin-He Liu
  - Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - CAS Key Laboratory of Brain Connectome and Manipulation, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Lu Gan
  - Research Center for Medical Artificial Intelligence, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Zhi-Ting Zhang
  - Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - CAS Key Laboratory of Brain Connectome and Manipulation, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
- Pan-Ke Yu
  - Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - University of Chinese Academy of Sciences, Beijing 100049, China
- Ji Dai
  - Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen-Hong Kong Institute of Brain Science, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - CAS Key Laboratory of Brain Connectome and Manipulation, Brain Cognition and Brain Disease Institute, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - Guangdong Provincial Key Laboratory of Brain Connectome and Behavior, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong 518055, China
  - University of Chinese Academy of Sciences, Beijing 100049, China
4
Noritake A, Ninomiya T, Kobayashi K, Isoda M. Chemogenetic dissection of a prefrontal-hypothalamic circuit for socially subjective reward valuation in macaques. Nat Commun 2023; 14:4372. PMID: 37474519; PMCID: PMC10359292; DOI: 10.1038/s41467-023-40143-x.
Abstract
The value of one's own reward is affected by the reward of others, serving as a source for envy. However, it is not known which neural circuits mediate such socially subjective value modulation. Here, we chemogenetically dissected the circuit from the medial prefrontal cortex (MPFC) to the lateral hypothalamus (LH) while male macaques were presented with visual stimuli that concurrently signaled the prospects of one's own and others' rewards. We found that functional disconnection between the MPFC and LH rendered animals significantly less susceptible to others' but not one's own reward prospects. In parallel with this behavioral change, inter-areal coordination, as indexed by coherence and Granger causality, decreased primarily in the delta and theta bands. These findings demonstrate that the MPFC-to-LH circuit plays a crucial role in carrying information about upcoming other-rewards for subjective reward valuation in social contexts.
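Magnitude-squared coherence, one of the two inter-areal coupling measures named in this abstract, can be sketched with Welch-style segment averaging in plain numpy. This is a generic illustration of the measure, not the authors' analysis pipeline; the sampling rate, segment length, and toy "LFP-like" signals are assumptions.

```python
import numpy as np

def coherence(x, y, fs, nperseg):
    """Magnitude-squared coherence of x and y, averaged over Hann-windowed segments."""
    n_seg = len(x) // nperseg
    win = np.hanning(nperseg)
    Fx = np.stack([np.fft.rfft(win * x[i * nperseg:(i + 1) * nperseg]) for i in range(n_seg)])
    Fy = np.stack([np.fft.rfft(win * y[i * nperseg:(i + 1) * nperseg]) for i in range(n_seg)])
    Sxy = (Fx * np.conj(Fy)).mean(axis=0)   # averaged cross-spectrum
    Sxx = (np.abs(Fx) ** 2).mean(axis=0)    # averaged auto-spectra
    Syy = (np.abs(Fy) ** 2).mean(axis=0)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.abs(Sxy) ** 2 / (Sxx * Syy)

# Toy signals: a shared 5 Hz (theta-range) rhythm plus independent noise in each "area".
rng = np.random.default_rng(1)
fs, dur = 200.0, 20.0
t = np.arange(int(fs * dur)) / fs
shared = np.sin(2 * np.pi * 5 * t)
x = shared + 0.5 * rng.standard_normal(t.size)
y = shared + 0.5 * rng.standard_normal(t.size)

freqs, coh = coherence(x, y, fs, nperseg=400)
coh_5hz = coh[np.argmin(np.abs(freqs - 5.0))]    # high: shared rhythm
coh_50hz = coh[np.argmin(np.abs(freqs - 50.0))]  # near chance: only independent noise
```

Note that the segment averaging is essential: with a single segment, the estimated coherence is identically 1 at every frequency, which is why coupling measures like this are always reported per frequency band over many trials or windows.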
Affiliation(s)
- Atsushi Noritake
  - Division of Behavioral Development, Department of System Neuroscience, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
  - Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
- Taihei Ninomiya
  - Division of Behavioral Development, Department of System Neuroscience, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
  - Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
- Kenta Kobayashi
  - Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
  - Section of Viral Vector Development, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
- Masaki Isoda
  - Division of Behavioral Development, Department of System Neuroscience, National Institute for Physiological Sciences, National Institutes of Natural Sciences, Okazaki, Japan
  - Department of Physiological Sciences, School of Life Science, The Graduate University for Advanced Studies (SOKENDAI), Hayama, Japan
5
Going Deeper than Tracking: A Survey of Computer-Vision Based Recognition of Animal Pain and Emotions. Int J Comput Vis 2022. DOI: 10.1007/s11263-022-01716-3.
Abstract
Advances in animal motion tracking and pose recognition have been a game changer in the study of animal behavior. Recently, an increasing number of works go 'deeper' than tracking and address automated recognition of animals' internal states such as emotions and pain, with the aim of improving animal welfare, making this a timely moment for a systematization of the field. This paper provides a comprehensive survey of computer vision-based research on the recognition of pain and emotional states in animals, addressing both facial and bodily behavior analysis. We summarize the efforts presented so far within this topic, classifying them across different dimensions; highlight challenges and research gaps; provide best-practice recommendations for advancing the field; and outline future directions for research.
6
Correia-Caeiro C, Burrows A, Wilson DA, Abdelrahman A, Miyabe-Nishiwaki T. CalliFACS: The common marmoset Facial Action Coding System. PLoS One 2022; 17:e0266442. PMID: 35580128; PMCID: PMC9113598; DOI: 10.1371/journal.pone.0266442.
Abstract
Facial expressions are subtle cues, central for communication and conveying emotions in mammals. Traditionally, facial expressions have been classified as a whole (e.g. happy, angry, bared-teeth) due to automatic face processing in the human brain: humans categorise emotions globally and are not aware of subtle or isolated cues such as an eyebrow raise. Moreover, the same facial configuration (e.g. lip corners pulled backwards, exposing teeth) can convey widely different information depending on the species (e.g. humans: happiness; chimpanzees: fear). The Facial Action Coding System (FACS) is considered the gold standard for investigating human facial behaviour and avoids subjective interpretations of meaning by objectively measuring independent movements linked to facial muscles, called Action Units (AUs). Following a similar methodology, we developed CalliFACS for the common marmoset. First, we determined the facial muscular plan of the common marmoset by examining dissections from the literature. Second, we recorded common marmosets in a variety of contexts (e.g. grooming, feeding, play, human interaction, veterinary procedures) and selected clips from online databases (e.g. YouTube) to identify their facial movements. Individual facial movements were classified according to the appearance changes produced by the corresponding underlying musculature. A diverse repertoire of 33 facial movements was identified in the common marmoset (15 Action Units, 15 Action Descriptors and 3 Ear Action Descriptors). Although we observed a reduced range of facial movement compared to the HumanFACS, the common marmoset's range of facial movements was larger than predicted from its socio-ecology and facial morphology, which indicates the importance of these movements for social interactions. CalliFACS is a scientific tool to measure facial movements and thus allows us to better understand the common marmoset's expressions and communication. As common marmosets have become increasingly popular laboratory animal models, from neuroscience to cognition, CalliFACS can also be used as an important tool to evaluate their welfare, particularly in captivity.
Affiliation(s)
- Anne Burrows
  - Department of Physical Therapy, Duquesne University, Pittsburgh, Pennsylvania, United States of America
  - Department of Anthropology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Duncan Andrew Wilson
  - Primate Research Institute, Kyoto University, Inuyama, Japan
  - Graduate School of Letters, Kyoto University, Kyoto, Japan
- Abdelhady Abdelrahman
  - School of Health and Life Sciences, Glasgow Caledonian University, Glasgow, United Kingdom