1
Whitehouse J, Clark PR, Robinson RL, Rees K, O’Callaghan O, Kimock CM, Witham CL, Waller BM. Facial expressivity in dominant macaques is linked to group cohesion. Proc Biol Sci 2024; 291:20240984. PMID: 39013427; PMCID: PMC11251757; DOI: 10.1098/rspb.2024.0984.
Abstract
Social living affords primates (including humans) many benefits. Communication has been proposed to be the key mechanism used to bond social connections, which could explain why primates have evolved such expressive faces. We assessed whether the facial expressivity of the dominant male (quantified from the coding of anatomically based facial movement) was related to social network properties (based on social proximity and grooming) in nine groups of captive rhesus macaques (Macaca mulatta) housed in uniform physical and social environments. More facially expressive dominant male macaques were more socially connected and had more cohesive social groups. These findings show that inter-individual differences in facial expressivity are related to differential social outcomes at both an individual and group level. More expressive individuals occupy more beneficial social positions, which could help explain the selection for complex facial communication in primates.
Affiliation(s)
- J. Whitehouse, Department of Psychology, Nottingham Trent University, Nottingham NG1 4FQ, UK
- P. R. Clark, School of Psychology, University of Lincoln, Lincoln LN6 7TS, UK
- R. L. Robinson, Department of Psychology, Nottingham Trent University, Nottingham NG1 4FQ, UK
- K. Rees, Department of Psychology, Nottingham Trent University, Nottingham NG1 4FQ, UK
- O. O’Callaghan, Department of Psychology, Nottingham Trent University, Nottingham NG1 4FQ, UK
- C. M. Kimock, Department of Psychology, Nottingham Trent University, Nottingham NG1 4FQ, UK
- C. L. Witham, Centre for Macaques, Medical Research Council, Salisbury SP4 0JQ, UK
- B. M. Waller, Department of Psychology, Nottingham Trent University, Nottingham NG1 4FQ, UK
2
Kareklas K, Oliveira RF. Emotional contagion and prosocial behaviour in fish: An evolutionary and mechanistic approach. Neurosci Biobehav Rev 2024; 163:105780. PMID: 38955311; DOI: 10.1016/j.neubiorev.2024.105780.
Abstract
In this review, we consider the definitions and experimental approaches to emotional contagion and prosocial behaviour in mammals and explore their evolutionary conceptualisation for studying their occurrence in the evolutionarily divergent vertebrate group of ray-finned fish. We present evidence for a diverse set of fish phenotypes that meet definitional criteria for prosocial behaviour and emotional contagion and discuss conserved mechanisms that may account for some preserved social capacities in fish. Finally, we provide some considerations on how to address the question of interdependency between emotional contagion and prosocial response, highlighting the importance of recognition processes, decision-making systems, and ecological context for providing evolutionary explanations.
Affiliation(s)
- Kyriacos Kareklas, Instituto Gulbenkian de Ciência, R. Q.ta Grande 6, Oeiras 2780-156, Portugal
- Rui F Oliveira, Instituto Gulbenkian de Ciência, R. Q.ta Grande 6, Oeiras 2780-156, Portugal; ISPA - Instituto Universitário, Rua Jardim do Tabaco 34, Lisboa 1149-041, Portugal
3
Meier TA, Refahi MS, Hearne G, Restifo DS, Munoz-Acuna R, Rosen GL, Woloszynek S. The Role and Applications of Artificial Intelligence in the Treatment of Chronic Pain. Curr Pain Headache Rep 2024; 28:769-784. PMID: 38822995; DOI: 10.1007/s11916-024-01264-0.
Abstract
PURPOSE OF REVIEW: This review aims to explore the interface between artificial intelligence (AI) and chronic pain, seeking to identify areas of focus for enhancing current treatments and yielding novel therapies. RECENT FINDINGS: In the United States, the prevalence of chronic pain is estimated to be upwards of 40%. Its impact extends to increased healthcare costs, reduced economic productivity, and strain on healthcare resources. Addressing this condition is particularly challenging due to its complexity and the significant variability in how patients respond to treatment. Current options often struggle to provide long-term relief, with their benefits rarely outweighing the risks, such as dependency or other side effects. Currently, AI has impacted four key areas of chronic pain treatment and research: (1) predicting outcomes based on clinical information; (2) extracting features from text, specifically clinical notes; (3) modeling 'omic data to identify meaningful patient subgroups with potential for personalized treatments and improved understanding of disease processes; and (4) disentangling complex neuronal signals responsible for pain, which current therapies attempt to modulate. As AI advances, leveraging state-of-the-art architectures will be essential for improving chronic pain treatment. Current efforts aim to extract meaningful representations from complex data, paving the way for personalized medicine. The identification of unique patient subgroups should reveal targets for tailored chronic pain treatments. Moreover, enhancing current treatment approaches is achievable by gaining a more profound understanding of patient physiology and responses. This can be realized by leveraging AI on the increasing volume of data linked to chronic pain.
Affiliation(s)
- Mohammad S Refahi, Ecological and Evolutionary Signal-Processing and Informatics (EESI) Laboratory, Department of Electrical and Computer Engineering, Drexel University, Philadelphia, PA, USA
- Gavin Hearne, Ecological and Evolutionary Signal-Processing and Informatics (EESI) Laboratory, Department of Electrical and Computer Engineering, Drexel University, Philadelphia, PA, USA
- Ricardo Munoz-Acuna, Anesthesia, Critical Care, and Pain Medicine, Beth Israel Deaconess Medical Center, Boston, MA, USA
- Gail L Rosen, Ecological and Evolutionary Signal-Processing and Informatics (EESI) Laboratory, Department of Electrical and Computer Engineering, Drexel University, Philadelphia, PA, USA
- Stephen Woloszynek, Anesthesia, Critical Care, and Pain Medicine, Beth Israel Deaconess Medical Center, Boston, MA, USA
4
Arnould C, Love SA, Piégu B, Lefort G, Blache MC, Parias C, Soulet D, Lévy F, Nowak R, Lansade L, Bertin A. Facial blushing and feather fluffing are indicators of emotions in domestic fowl (Gallus gallus domesticus). PLoS One 2024; 19:e0306601. PMID: 39046983; PMCID: PMC11268617; DOI: 10.1371/journal.pone.0306601.
Abstract
The study of facial expressions in mammals has provided great advances in the identification of their emotions and, in turn, in the comprehension of their sentience. So far, this area of research has excluded birds. Using a naturalistic approach, we analysed facial blushing and feather displays in domestic fowl. Hens were filmed in situations contrasting in emotional valence and arousal level: situations known to indicate calm states (positive valence / low arousal), to have rewarding effects (positive valence / high arousal) or to induce fear-related behaviour (negative valence / high arousal). Head feather position as well as skin redness of the comb, wattles, ear lobes and cheeks varied across these situations. The skin of all four areas was less red in situations with low arousal compared with situations with higher arousal. Furthermore, skin redness of the cheeks and ear lobes also varied depending on the valence of the situation: redness was higher in situations with negative valence compared with situations with positive valence. Feather position also varied with the situation. Feather fluffing was mostly observed in positively valenced situations, except when hens were eating. We conclude that hens have facial displays that reveal their emotions and that blushing is not exclusive to humans. This opens a promising way to explore the emotional lives of birds, which is a critical step when trying to improve poultry welfare.
Affiliation(s)
- Cécile Arnould, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Scott A. Love, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Benoît Piégu, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Gaëlle Lefort, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Céline Parias, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Delphine Soulet, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Frédéric Lévy, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Raymond Nowak, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Léa Lansade, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
- Aline Bertin, CNRS, IFCE, INRAE, Université de Tours, PRC, Nouzilly, France
5
Cordoni G, Ciantia A, Guéry JP, Mulot B, Norscia I. Rapid facial mimicry in Platyrrhini: Play face replication in spider monkeys (Ateles fusciceps, Ateles hybridus, and Ateles paniscus). Am J Primatol 2024; 86:e23607. PMID: 38369692; DOI: 10.1002/ajp.23607.
Abstract
Rapid facial mimicry (RFM), the rapid and automatic replication of a perceived facial expression, is considered a basic form of empathy and has been investigated mainly during play. RFM occurs in Catarrhini (Old World primates) but has not yet been demonstrated in Platyrrhini (New World primates). For this reason, we collected video data on playful interactions (Nplay_interactions = 149) in three species of spider monkeys (Ateles fusciceps, N = 11; Ateles hybridus, N = 14; Ateles paniscus, N = 6) housed at La Vallée des Singes and the ZooParc de Beauval (France). For the first time, we demonstrated the occurrence of RFM in Platyrrhini (analyzing 175 events). Players' sex, age, species, relationship quality, and kinship did not modulate RFM, probably owing to the species' complex fission-fusion dynamics and flexible interindividual social relationships. Compared with the absence of any playful expression, or the presence of only a non-replicated play face, RFM prolonged the session duration and was sequentially associated with a greater variety of more intense offensive playful patterns (patterns aimed at attacking/pursuing the playmate). We propose that RFM may favor synchronization and context sharing between players, thus decreasing the risk of behavior misinterpretation while simultaneously fostering a more competitive nature of play. In conclusion, this study stimulates additional research on the evolutionary origins of motor mimicry in primates, possibly dating back to before the divergence of New and Old World monkeys. Furthermore, it also points toward the possibility that RFM may not always lead to cooperation but also to competition, depending on the context and the species' social and cognitive features.
Affiliation(s)
- Giada Cordoni, Department of Life Sciences and Systems Biology, University of Torino, Turin, Italy
- Annalisa Ciantia, Department of Life Sciences and Systems Biology, University of Torino, Turin, Italy
- Baptiste Mulot, ZooParc de Beauval & Beauval Nature, Saint Aignan sur Cher, France
- Ivan Norscia, Department of Life Sciences and Systems Biology, University of Torino, Turin, Italy
6
Richard JT, Pellegrini I, Levine R. Belugas (Delphinapterus leucas) create facial displays during social interactions by changing the shape of their melons. Anim Cogn 2024; 27:7. PMID: 38429515; PMCID: PMC10907495; DOI: 10.1007/s10071-024-01843-z.
Abstract
Beluga whales are considered unique among odontocetes in their ability to visibly alter the appearance of their head by changing the shape of the melon, but only anecdotal observations are available to evaluate the use or potential function of these melon shapes. This study of belugas in professionally managed care aimed to establish an ethogram for the repertoire of categorizable melon shapes and then evaluate their potential function as intentional communication signals by determining if they were produced and elaborated during social interactions of varying behavioral contexts while in the line of sight of a recipient. Five different melon shapes were reliably identified in video observations of the primary study population (n = 4) and externally validated in a second aquarium population (n = 51). Among the 2570 melon shapes observed from the primary study subjects, melon shapes occurred 34 × more frequently during social interactions (1.72 per minute) than outside of social interactions (0.05 per minute). Melon shapes occurring during social interactions were performed within the line of sight of a recipient 93.6% of the time. The frequency of occurrence of the different melon shapes varied across behavioral contexts. Elaboration of melon shapes through extended duration and the occurrence of concurrent open mouth displays varied by shape type and across behavioral contexts. Melon shapes seem to function as visual displays, with some characteristics of intentional communication. This ability could yield adaptive benefits to belugas, given their complex social structure and hypothesized mating system that emphasizes pre-copulatory female mate choice.
Affiliation(s)
- Justin T Richard, Department of Fisheries, Animal and Veterinary Science, University of Rhode Island, Kingston, RI, 02881, USA
- Isabelle Pellegrini, Department of Fisheries, Animal and Veterinary Science, University of Rhode Island, Kingston, RI, 02881, USA
- Rachael Levine, Department of Fisheries, Animal and Veterinary Science, University of Rhode Island, Kingston, RI, 02881, USA
7
Talwar S, Barbero FM, Calce RP, Collignon O. Automatic Brain Categorization of Discrete Auditory Emotion Expressions. Brain Topogr 2023; 36:854-869. PMID: 37639111; PMCID: PMC10522533; DOI: 10.1007/s10548-023-00983-8.
Abstract
Seamlessly extracting emotional information from voices is crucial for efficient interpersonal communication. However, it remains unclear how the brain categorizes vocal expressions of emotion beyond the processing of their acoustic features. In our study, we developed a new approach combining electroencephalographic recordings (EEG) in humans with a frequency-tagging paradigm to 'tag' automatic neural responses to specific categories of emotion expressions. Participants were presented with a periodic stream of heterogeneous non-verbal emotional vocalizations belonging to five emotion categories (anger, disgust, fear, happiness and sadness) at 2.5 Hz (stimulus length of 350 ms with a 50 ms silent gap between stimuli). Importantly, unknown to the participant, a specific emotion category appeared at a target presentation rate of 0.83 Hz that would elicit an additional response in the EEG spectrum only if the brain discriminates the target emotion category from other emotion categories and generalizes across heterogeneous exemplars of the target emotion category. Stimuli were matched across emotion categories for harmonicity-to-noise ratio, spectral center of gravity and pitch. Additionally, participants were presented with a scrambled version of the stimuli with identical spectral content and periodicity but disrupted intelligibility. Both types of sequences had comparable envelopes and early auditory peripheral processing, computed via the simulation of the cochlear response. We observed that, in addition to the responses at the general presentation frequency (2.5 Hz) in both intact and scrambled sequences, a greater peak in the EEG spectrum at the target emotion presentation rate (0.83 Hz) and its harmonics emerged in the intact sequence in comparison to the scrambled sequence. The greater response at the target frequency in the intact sequence, together with our stimulus-matching procedure, suggests that the categorical brain response elicited by a specific emotion is at least partially independent of the low-level acoustic features of the sounds. Moreover, responses at the fearful and happy vocalization presentation rates elicited different topographies and different temporal dynamics, suggesting that different discrete emotions are represented differently in the brain. Our paradigm revealed the brain's ability to automatically categorize non-verbal vocal emotion expressions objectively (at a predefined frequency of interest), behavior-free, rapidly (in a few minutes of recording time) and robustly (with a high signal-to-noise ratio), making it a useful tool to study vocal emotion processing and auditory categorization in general and in populations where behavioral assessments are more challenging.
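As an illustrative note on the frequency-tagging analysis described in this abstract, the sketch below shows how a response at the 0.83 Hz target rate and its harmonics could be quantified from an EEG amplitude spectrum. This is not the authors' pipeline: the sampling rate, recording duration, signal-to-noise window and the synthetic signal are assumptions chosen only to make the example self-contained.

```python
# Minimal frequency-tagging sketch (assumed parameters, synthetic data).
import numpy as np

fs = 256.0                 # sampling rate (Hz), assumed
base_freq = 2.5            # general stimulus presentation rate
target_freq = 2.5 / 3      # target-category rate (~0.83 Hz)

def spectrum(eeg):
    """Amplitude spectrum of a 1-D EEG segment."""
    n = len(eeg)
    amps = np.abs(np.fft.rfft(eeg - eeg.mean())) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amps

def snr_at(freqs, amps, f, n_neighbours=20, exclude=2):
    """Amplitude at frequency f divided by the mean of neighbouring noise bins,
    a common frequency-tagging signal-to-noise measure."""
    idx = np.argmin(np.abs(freqs - f))
    lo, hi = idx - n_neighbours, idx + n_neighbours + 1
    noise = np.r_[amps[lo:idx - exclude], amps[idx + exclude + 1:hi]]
    return amps[idx] / noise.mean()

# Synthetic 120 s recording with responses at both rates plus noise.
t = np.arange(0, 120, 1 / fs)
eeg = (0.5 * np.sin(2 * np.pi * base_freq * t)
       + 0.2 * np.sin(2 * np.pi * target_freq * t)
       + np.random.randn(t.size))
freqs, amps = spectrum(eeg)
for f in (base_freq, target_freq, 2 * target_freq):   # fundamental + harmonic
    print(f"SNR at {f:.2f} Hz: {snr_at(freqs, amps, f):.2f}")
```

In the actual paradigm, a peak at the target frequency above the noise floor in the intact (but not scrambled) sequences is what indicates categorical discrimination of the tagged emotion.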
Affiliation(s)
- Siddharth Talwar, Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium
- Francesca M Barbero, Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium
- Roberta P Calce, Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium
- Olivier Collignon, Institute for Research in Psychology (IPSY) & Neuroscience (IoNS), Louvain Bionics, University of Louvain (UCLouvain), Louvain, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne and Sion, Switzerland
8
Scott L, Florkiewicz BN. Feline faces: Unraveling the social function of domestic cat facial signals. Behav Processes 2023; 213:104959. PMID: 37858844; DOI: 10.1016/j.beproc.2023.104959.
Abstract
Lately, there has been a growing interest in studying domestic cat facial signals, but most of this research has centered on signals produced during human-cat interactions or pain. The available research on intraspecific facial signaling with domesticated cats has largely focused on non-affiliative social interactions. However, the transition to intraspecific sociality through domestication could have resulted in a greater reliance on affiliative facial signals that aid with social bonding. Our study aimed to document the various facial signals that cats produce during affiliative and non-affiliative intraspecific interactions. Given the close relationship between the physical form and social function of mammalian facial signals, we predicted that affiliative and non-affiliative facial signals would have noticeable differences in their physical morphology. We observed the behavior of 53 adult domestic shorthair cats at CatCafé Lounge in Los Angeles, CA. Using Facial Action Coding Systems designed for cats, we compared the complexity and compositionality of facial signals produced in affiliative and non-affiliative contexts. To measure complexity and compositionality, we examined the number and types of facial muscle movements (AUs) observed in each signal. We found that compositionality, rather than complexity, was significantly associated with the social function of intraspecific facial signals. Our findings indicate that domestication likely had a significant impact on the development of intraspecific facial signaling repertoires in cats.
Affiliation(s)
- Lauren Scott, School of Medicine, University of Kansas Medical Center, KS, USA
9
Rincon AV, Waller BM, Duboscq J, Mielke A, Pérez C, Clark PR, Micheletta J. Higher social tolerance is associated with more complex facial behavior in macaques. eLife 2023; 12:RP87008. PMID: 37787008; PMCID: PMC10547472; DOI: 10.7554/eLife.87008.
Abstract
The social complexity hypothesis for communicative complexity posits that animal societies with more complex social systems require more complex communication systems. We tested the social complexity hypothesis on three macaque species that vary in their degree of social tolerance and complexity. We coded facial behavior in >3000 social interactions across three social contexts (aggressive, submissive, affiliative) in 389 animals, using the Facial Action Coding System for macaques (MaqFACS). We quantified communicative complexity using three measures of uncertainty: entropy, specificity, and prediction error. We found that the relative entropy of facial behavior was higher for the more tolerant crested macaques as compared to the less tolerant Barbary and rhesus macaques across all social contexts, indicating that crested macaques more frequently use a higher diversity of facial behavior. The context specificity of facial behavior was higher in rhesus as compared to Barbary and crested macaques, demonstrating that Barbary and crested macaques used facial behavior more flexibly across different social contexts. Finally, a random forest classifier predicted social context from facial behavior with highest accuracy for rhesus and lowest for crested, indicating there is higher uncertainty and complexity in the facial behavior of crested macaques. Overall, our results support the social complexity hypothesis.
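As a rough illustration of the three uncertainty measures named above (relative entropy, context specificity and prediction error from a random forest), the following sketch computes them on a toy table of facial action unit (AU) codings. The column names, the AU set and the simulated data are hypothetical; the actual study used MaqFACS codings of more than 3000 interactions.

```python
# Toy illustration of the three uncertainty measures (not the authors' code).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "context": rng.choice(["aggressive", "submissive", "affiliative"], n),
    "AU10": rng.integers(0, 2, n),   # 0/1 presence of an action unit per interaction
    "AU12": rng.integers(0, 2, n),
    "AU26": rng.integers(0, 2, n),
})
au_cols = ["AU10", "AU12", "AU26"]

def relative_entropy(counts):
    """Shannon entropy of AU usage divided by its maximum, so 1 = all AUs used equally."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(len(counts)))

print("Relative entropy of AU usage:",
      round(relative_entropy(df[au_cols].sum().to_numpy()), 3))

# Context specificity: P(context | AU present), for each AU.
for au in au_cols:
    spec = df.loc[df[au] == 1, "context"].value_counts(normalize=True)
    print(f"Specificity of {au}: {spec.round(2).to_dict()}")

# Prediction error: 1 - cross-validated accuracy of a random forest predicting context.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(clf, df[au_cols], df["context"], cv=5).mean()
print("Prediction error:", round(1 - acc, 2))
```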
Affiliation(s)
- Alan V Rincon, Department of Psychology, Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth, United Kingdom
- Bridget M Waller, Centre for Interdisciplinary Research on Social Interaction, Department of Psychology, Nottingham Trent University, Nottingham, United Kingdom
- Alexander Mielke, School of Biological and Behavioural Sciences, Queen Mary University of London, London, United Kingdom
- Claire Pérez, Department of Psychology, Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth, United Kingdom
- Peter R Clark, Department of Psychology, Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth, United Kingdom; School of Psychology, University of Lincoln, Lincoln, United Kingdom
- Jérôme Micheletta, Department of Psychology, Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth, United Kingdom
10
Nelson XJ, Taylor AH, Cartmill EA, Lyn H, Robinson LM, Janik V, Allen C. Joyful by nature: approaches to investigate the evolution and function of joy in non-human animals. Biol Rev Camb Philos Soc 2023; 98:1548-1563. PMID: 37127535; DOI: 10.1111/brv.12965.
Abstract
The nature and evolution of positive emotion is a major question remaining unanswered in science and philosophy. The study of feelings and emotions in humans and animals is dominated by discussion of affective states that have negative valence. Given the clinical and social significance of negative affect, such as depression, it is unsurprising that these emotions have received more attention from scientists. Compared to negative emotions, such as fear that leads to fleeing or avoidance, positive emotions are less likely to result in specific, identifiable, behaviours being expressed by an animal. This makes it particularly challenging to quantify and study positive affect. However, bursts of intense positive emotion (joy) are more likely to be accompanied by externally visible markers, like vocalisations or movement patterns, which make it more amenable to scientific study and more resilient to concerns about anthropomorphism. We define joy as intense, brief, and event-driven (i.e. a response to something), which permits investigation into how animals react to a variety of situations that would provoke joy in humans. This means that behavioural correlates of joy are measurable, either through newly discovered 'laughter' vocalisations, increases in play behaviour, or reactions to cognitive bias tests that can be used across species. There are a range of potential situations that cause joy in humans that have not been studied in other animals, such as whether animals feel joy on sunny days, when they accomplish a difficult feat, or when they are reunited with a familiar companion after a prolonged absence. Observations of species-specific calls and play behaviour can be combined with biometric markers and reactions to ambiguous stimuli in order to enable comparisons of affect between phylogenetically distant taxonomic groups. Identifying positive affect is also important for animal welfare because knowledge of positive emotional states would allow us to monitor animal well-being better. Additionally, measuring if phylogenetically and ecologically distant animals play more, laugh more, or act more optimistically after certain kinds of experiences will also provide insight into the mechanisms underlying the evolution of joy and other positive emotions, and potentially even into the evolution of consciousness.
Affiliation(s)
- Ximena J Nelson, School of Biological Sciences, University of Canterbury, Private Bag 4800, Christchurch, New Zealand
- Alex H Taylor, Institut de Neurociències, Universitat Autònoma de Barcelona, Bellaterra, Barcelona, 08193, Spain; ICREA, Pg. Lluís Companys, 23, Barcelona, Spain; School of Psychology, The University of Auckland, Private Bag 92019, Auckland, 1142, New Zealand
- Erica A Cartmill, Departments of Anthropology and Psychology, UCLA, 375 Portola Plaza, Los Angeles, CA, 90095, USA
- Heidi Lyn, Department of Psychology, University of South Alabama, 75 S. University Blvd., Mobile, AL, 36688, USA
- Lauren M Robinson, Domestication Lab, Konrad Lorenz Institute of Ethology, University of Veterinary Medicine, Vienna, Savoyenstraße 1a, Vienna, A-1160, Austria
- Vincent Janik, Scottish Oceans Institute, School of Biology, University of St Andrews, St Andrews, KY16 8LB, UK
- Colin Allen, Department of History & Philosophy of Science, University of Pittsburgh, 1101 Cathedral of Learning, 4200 Fifth Ave, Pittsburgh, PA, 15260, USA
11
Korcsok B, Korondi P. How do you do the things that you do? Ethological approach to the description of robot behaviour. Biol Futur 2023; 74:253-279. PMID: 37812380; DOI: 10.1007/s42977-023-00178-z.
Abstract
The detailed description of the behaviour of the interacting parties is becoming more and more important in human-robot interaction (HRI), especially in social robotics (SR). With the rise in the number of publications, there is a substantial need for objective and comprehensive descriptions of implemented robot behaviours to ensure the comparability and reproducibility of studies. Ethograms and the meticulous analysis of behaviour were introduced long ago in animal behaviour research (cf. ethology). The adoption of this method in SR and HRI can ensure the desired clarity over robot behaviours, while also providing added benefits during robot development, behaviour modelling and the analysis of HRI experiments. We provide an overview of the possible uses and advantages of ethograms in HRI, and propose a general framework for describing behaviour which can be adapted to the requirements of specific studies.
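To make the idea of an ethogram-style description of robot behaviour concrete, here is a hedged sketch of how such a catalogue and a behaviour log could be represented in code. The paper proposes a descriptive framework, not this data structure; all behaviour names, fields and the example log are invented for illustration.

```python
# Hypothetical ethogram catalogue and behaviour log for an HRI study.
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class EventType(Enum):
    POINT = "point"   # momentary event (e.g., a beep)
    STATE = "state"   # behaviour with a duration (e.g., approaching)

@dataclass(frozen=True)
class EthogramEntry:
    name: str
    definition: str                  # operational, observer-independent description
    event_type: EventType
    modifiers: Tuple[str, ...] = ()  # optional qualifiers (target, intensity, ...)

ETHOGRAM = {
    "approach_user": EthogramEntry(
        "approach_user",
        "Robot reduces its distance to the user to below 1 m while facing them.",
        EventType.STATE, ("speed",)),
    "greeting_sound": EthogramEntry(
        "greeting_sound",
        "Robot emits its greeting tone once.",
        EventType.POINT),
}

@dataclass
class BehaviourRecord:
    behaviour: str
    t_start: float                   # seconds from session start
    t_end: Optional[float] = None    # None for point events

log = [BehaviourRecord("approach_user", 2.0, 6.5),
       BehaviourRecord("greeting_sound", 6.6)]
for rec in log:
    entry = ETHOGRAM[rec.behaviour]
    dur = "" if entry.event_type is EventType.POINT else f", duration {rec.t_end - rec.t_start:.1f} s"
    print(f"{rec.behaviour} ({entry.event_type.value}) at t={rec.t_start}{dur}")
```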
Affiliation(s)
- Beáta Korcsok, ELKH-ELTE Comparative Ethology Research Group, Budapest, Hungary; Department of Mechatronics, Optics and Mechanical Engineering Informatics, Faculty of Mechanical Engineering, Budapest University of Technology and Economics, Budapest, Hungary
- Péter Korondi, Department of Mechatronics, Faculty of Engineering, University of Debrecen, Debrecen, Hungary
12
Lim C, Inagaki M, Shinozaki T, Fujita I. Analysis of convolutional neural networks reveals the computational properties essential for subcortical processing of facial expression. Sci Rep 2023; 13:10908. PMID: 37407668; DOI: 10.1038/s41598-023-37995-0.
Abstract
Perception of facial expression is crucial for primate social interactions. This visual information is processed through the ventral cortical pathway and the subcortical pathway. However, the subcortical pathway exhibits inaccurate processing, and the responsible architectural and physiological properties remain unclear. To investigate this, we constructed and examined convolutional neural networks with three key properties of the subcortical pathway: a shallow layer architecture, concentric receptive fields at the initial processing stage, and a greater degree of spatial pooling. These neural networks achieved modest accuracy in classifying facial expressions. By replacing these properties, individually or in combination, with corresponding cortical features, performance gradually improved. Similar to amygdala neurons, some units in the final processing layer exhibited sensitivity to retina-based spatial frequencies (SFs), while others were sensitive to object-based SFs. Replacement of any of these properties affected the coordinates of the SF encoding. Therefore, all three properties limit the accuracy of facial expression information and are essential for determining the SF representation coordinate. These findings characterize the role of the subcortical computational processes in facial expression recognition.
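A minimal sketch of the kind of network the abstract describes is given below: a shallow CNN with a fixed concentric (difference-of-Gaussians) first stage and aggressive spatial pooling. The layer sizes, the number of expression classes and the fixed-filter implementation are assumptions for illustration, not the authors' architecture.

```python
# Hedged sketch of a shallow "subcortical-like" CNN (illustrative sizes only).
import torch
import torch.nn as nn

def dog_kernel(size=9, sigma_c=1.0, sigma_s=2.0):
    """Centre-surround (difference-of-Gaussians) kernel, roughly mimicking
    concentric receptive fields at the initial processing stage."""
    ax = torch.arange(size) - size // 2
    xx, yy = torch.meshgrid(ax, ax, indexing="ij")
    r2 = (xx ** 2 + yy ** 2).float()
    centre = torch.exp(-r2 / (2 * sigma_c ** 2))
    surround = torch.exp(-r2 / (2 * sigma_s ** 2))
    return centre / centre.sum() - surround / surround.sum()

class ShallowSubcorticalNet(nn.Module):
    def __init__(self, n_expressions=7):                # class count is illustrative
        super().__init__()
        self.dog = nn.Conv2d(1, 1, kernel_size=9, padding=4, bias=False)
        self.dog.weight.data = dog_kernel().view(1, 1, 9, 9)
        self.dog.weight.requires_grad = False           # fixed concentric filter
        self.conv = nn.Conv2d(1, 16, kernel_size=5, padding=2)
        self.pool = nn.MaxPool2d(kernel_size=8)          # large degree of spatial pooling
        self.fc = nn.Linear(16 * 8 * 8, n_expressions)   # single shallow readout

    def forward(self, x):                # x: (batch, 1, 64, 64) grayscale face images
        x = self.dog(x)
        x = torch.relu(self.conv(x))
        x = self.pool(x)
        return self.fc(x.flatten(1))

logits = ShallowSubcorticalNet()(torch.randn(2, 1, 64, 64))
print(logits.shape)   # torch.Size([2, 7])
```

In the study's logic, replacing any of these properties with a more cortex-like counterpart (deeper layers, oriented filters, less pooling) would be expected to improve expression classification.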
Affiliation(s)
- Chanseok Lim, Laboratory for Cognitive Neuroscience, Graduate School of Frontier Biosciences, Osaka University, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan; Perceptual and Cognitive Neuroscience Laboratory, Graduate School of Frontier Biosciences, Osaka University, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan
- Mikio Inagaki, Laboratory for Cognitive Neuroscience, Graduate School of Frontier Biosciences, Osaka University, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan; Center for Information and Neural Networks, National Institute of Information and Communications Technology, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan
- Takashi Shinozaki, Center for Information and Neural Networks, National Institute of Information and Communications Technology, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan; Computational Neuroscience Laboratory, Faculty of Informatics, Kindai University, 3-4-1 Kowakae, Higashiosaka, Osaka, 577-8502, Japan
- Ichiro Fujita, Laboratory for Cognitive Neuroscience, Graduate School of Frontier Biosciences, Osaka University, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan; Center for Information and Neural Networks, National Institute of Information and Communications Technology, 1-4 Yamadaoka, Suita, Osaka, 565-0871, Japan; Research Organization of Science and Technology, Ritsumeikan University, 1-1-1 Noji-Higashi, Kusatsu, Shiga, 525-8577, Japan
13
Cascella M, Schiavo D, Cuomo A, Ottaiano A, Perri F, Patrone R, Migliarelli S, Bignami EG, Vittori A, Cutugno F. Artificial Intelligence for Automatic Pain Assessment: Research Methods and Perspectives. Pain Res Manag 2023; 2023:6018736. PMID: 37416623; PMCID: PMC10322534; DOI: 10.1155/2023/6018736.
Abstract
Although proper pain evaluation is mandatory for establishing the appropriate therapy, self-reported pain level assessment has several limitations. Data-driven artificial intelligence (AI) methods can be employed for research on automatic pain assessment (APA). The goal is the development of objective, standardized, and generalizable instruments useful for pain assessment in different clinical contexts. The purpose of this article is to discuss the state of the art of research and perspectives on APA applications in both research and clinical scenarios. Principles of AI functioning will be addressed. For narrative purposes, AI-based methods are grouped into behavioral-based approaches and neurophysiology-based pain detection methods. Since pain is generally accompanied by spontaneous facial behaviors, several approaches for APA are based on image classification and feature extraction. Language features through natural language strategies, body postures, and respiratory-derived elements are other investigated behavioral-based approaches. Neurophysiology-based pain detection is obtained through electroencephalography, electromyography, electrodermal activity, and other biosignals. Recent approaches involve multimode strategies by combining behaviors with neurophysiological findings. Concerning methods, early studies were conducted by machine learning algorithms such as support vector machine, decision tree, and random forest classifiers. More recently, artificial neural networks such as convolutional and recurrent neural network algorithms are implemented, even in combination. Collaboration programs involving clinicians and computer scientists must be aimed at structuring and processing robust datasets that can be used in various settings, from acute to different chronic pain conditions. Finally, it is crucial to apply the concepts of explainability and ethics when examining AI applications for pain research and management.
Affiliation(s)
- Marco Cascella, Division of Anesthesia and Pain Medicine, Istituto Nazionale Tumori IRCCS Fondazione G. Pascale, Naples 80131, Italy
- Daniela Schiavo, Division of Anesthesia and Pain Medicine, Istituto Nazionale Tumori IRCCS Fondazione G. Pascale, Naples 80131, Italy
- Arturo Cuomo, Division of Anesthesia and Pain Medicine, Istituto Nazionale Tumori IRCCS Fondazione G. Pascale, Naples 80131, Italy
- Alessandro Ottaiano, SSD-Innovative Therapies for Abdominal Metastases, Istituto Nazionale Tumori di Napoli IRCCS “G. Pascale”, Via M. Semmola, Naples 80131, Italy
- Francesco Perri, Head and Neck Oncology Unit, Istituto Nazionale Tumori IRCCS-Fondazione “G. Pascale”, Naples 80131, Italy
- Renato Patrone, Dieti Department, University of Naples, Naples, Italy; Division of Hepatobiliary Surgical Oncology, Istituto Nazionale Tumori IRCCS, Fondazione Pascale-IRCCS di Napoli, Naples, Italy
- Sara Migliarelli, Department of Pharmacology, Faculty of Medicine and Psychology, University Sapienza of Rome, Rome, Italy
- Elena Giovanna Bignami, Anesthesiology, Critical Care and Pain Medicine Division, Department of Medicine and Surgery, University of Parma, Parma, Italy
- Alessandro Vittori, Department of Anesthesia and Critical Care, ARCO ROMA, Ospedale Pediatrico Bambino Gesù IRCCS, Rome 00165, Italy
- Francesco Cutugno, Department of Electrical Engineering and Information Technologies, University of Naples “Federico II”, Naples 80100, Italy
14
Long H, Peluso N, Baker CI, Japee S, Taubert J. A database of heterogeneous faces for studying naturalistic expressions. Sci Rep 2023; 13:5383. PMID: 37012369; PMCID: PMC10070342; DOI: 10.1038/s41598-023-32659-5.
Abstract
Facial expressions are thought to be complex visual signals, critical for communication between social agents. Most prior work aimed at understanding how facial expressions are recognized has relied on stimulus databases featuring posed facial expressions, designed to represent putative emotional categories (such as 'happy' and 'angry'). Here we use an alternative selection strategy to develop the Wild Faces Database (WFD): a set of one thousand images capturing a diverse range of ambient facial behaviors from outside of the laboratory. We characterized the perceived emotional content in these images using a standard categorization task in which participants were asked to classify the apparent facial expression in each image. In addition, participants were asked to indicate the intensity and genuineness of each expression. While modal scores indicate that the WFD captures a range of different emotional expressions, in comparing the WFD to images taken from other, more conventional databases, we found that participants responded more variably and less specifically to the wild-type faces, perhaps indicating that natural expressions are more multiplexed than a categorical model would predict. We argue that this variability can be employed to explore latent dimensions in our mental representation of facial expressions. Further, images in the WFD were rated as less intense and more genuine than images taken from other databases, suggesting a greater degree of authenticity among WFD images. The strong positive correlation between intensity and genuineness scores demonstrates that even the high-arousal states captured in the WFD were perceived as authentic. Collectively, these findings highlight the potential utility of the WFD as a new resource for bridging the gap between the laboratory and the real world in studies of expression recognition.
Affiliation(s)
- Houqiu Long, The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Natalie Peluso, The School of Psychology, The University of Queensland, St Lucia, QLD, Australia
- Chris I Baker, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Shruti Japee, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
- Jessica Taubert, The School of Psychology, The University of Queensland, St Lucia, QLD, Australia; Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
15
Explainable automated recognition of emotional states from canine facial expressions: the case of positive anticipation and frustration. Sci Rep 2022; 12:22611. PMID: 36585439; PMCID: PMC9803655; DOI: 10.1038/s41598-022-27079-w.
Abstract
In animal research, automation of affective state recognition has so far mainly addressed pain in a few species. Emotional states remain uncharted territory, especially in dogs, due to the complexity of their facial morphology and expressions. This study contributes to filling this gap in two respects. First, it is the first to address dog emotional states using a dataset obtained in a controlled experimental setting, including videos from (n = 29) Labrador Retrievers assumed to be in two experimentally induced emotional states: negative (frustration) and positive (anticipation). The dogs' facial expressions were measured using the Dogs Facial Action Coding System (DogFACS). Two different approaches are compared in relation to our aim: (1) a DogFACS-based approach with a two-step pipeline consisting of (i) a DogFACS variable detector and (ii) a positive/negative state decision tree classifier; (2) an approach using deep learning techniques with no intermediate representation. The approaches reach accuracies above 71% and 89%, respectively, with the deep learning approach performing better. Second, this study is also the first to examine the explainability of AI models in the context of emotion in animals. The DogFACS-based approach provides decision trees, a mathematical representation that reflects previous findings by human experts relating certain facial expressions (DogFACS variables) to specific emotional states. The deep learning approach offers a different, visual form of explainability in the form of heatmaps reflecting the regions of focus of the network's attention, which in some cases show focus clearly related to the nature of particular DogFACS variables. These heatmaps may hold the key to novel insights on the sensitivity of the network to nuanced pixel patterns reflecting information invisible to the human eye.
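As an illustration of the second step of the DogFACS-based pipeline described above, the sketch below trains a small decision tree that maps detected DogFACS variables to a positive/negative state. The variable names and the simulated detector output are hypothetical and are not taken from the paper.

```python
# Hedged sketch of step (ii): a decision tree over detected DogFACS variables.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 400
# Step (i) would emit one row per clip with 0/1 DogFACS variables; simulated here.
X = pd.DataFrame({
    "ears_flattener": rng.integers(0, 2, n),
    "inner_brow_raiser": rng.integers(0, 2, n),
    "nose_lick": rng.integers(0, 2, n),
})
# Toy label: "negative" loosely tied to flattened ears plus raised inner brows.
y = np.where(X["ears_flattener"] & X["inner_brow_raiser"], "negative", "positive")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("Held-out accuracy:", round(tree.score(X_te, y_te), 2))
print(export_text(tree, feature_names=list(X.columns)))  # human-readable rules
```

The appeal of this design, as the abstract notes, is that the resulting tree is directly inspectable and can be compared against expert knowledge of which facial movements accompany which states.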
16
Davila-Ross M, Palagi E. Laughter, play faces and mimicry in animals: evolution and social functions. Philos Trans R Soc Lond B Biol Sci 2022; 377:20210177. PMID: 36126662; PMCID: PMC9489294; DOI: 10.1098/rstb.2021.0177.
Abstract
Human laughter and laugh faces show similarities in morphology and function with animal playful expressions. To better understand primordial uses and effects of human laughter and laugh faces, it is important to examine these positive expressions in animals from both homologous and analogous systems. Phylogenetic research on hominids provided empirical evidence on shared ancestry across these emotional expressions, including human laughter and laugh faces. In addition, playful expressions of animals, in general, arguably have a key role in the development of social cognitive skills, a role that may help explain their polyphyletic history. The present work examines the evolution and function of playful expressions in primates and other animals. As part of this effort, we also coded for muscle activations of six carnivore taxa with regard to their open-mouth faces of play; our findings provide evidence that these carnivore expressions are homologues of primate open-mouth faces of play. Furthermore, our work discusses how the expressions of animal play may communicate positive emotions to conspecifics and how the motor resonance of these expressions increases affiliation and bonding between the subjects, resembling in a number of ways the important social-emotional effects that laughter and laugh faces have in humans. This article is part of the theme issue 'Cracking the laugh code: laughter through the lens of biology, psychology and neuroscience'.
Affiliation(s)
- Marina Davila-Ross, Psychology Department, King Henry Building, University of Portsmouth, Portsmouth PO1 2DY, UK
- Elisabetta Palagi, Department of Biology, Ethology Unit, University of Pisa, Via A. Volta 6, 56126, Pisa, Italy
17
The Association Between the Bared-Teeth Display and Social Dominance in Captive Chimpanzees (Pan troglodytes). Affect Sci 2022; 3:749-760. PMID: 36217408; PMCID: PMC9535227; DOI: 10.1007/s42761-022-00138-1.
Abstract
Humans use smiles, widely observed emotional expressions, in a variety of social situations, the meaning of which varies depending on the social relationship and the context in which they are displayed. The homologue of the human smile in non-human primates, in terms of both morphological and functional similarities, is the bared-teeth display (BT). According to the power asymmetry hypothesis (PAH), species with strict linear dominance hierarchies are predicted to produce distinct communicative signals to avoid escalations of social conflicts. Hence, while the BT in a despotic species is predicted to be expressed from low- to high-ranking individuals, signaling submission, the BT in a tolerant species is predicted to be expressed in multiple contexts, regardless of rank. We tested this hypothesis in a group of 8 captive chimpanzees (Pan troglodytes), a species commonly characterized as rather despotic. An investigation of 11,774 dyadic social interactions revealed this chimpanzee group to have a linear dominance hierarchy, with moderate steepness. A Bayesian GLMM, used to test the effects of social context and the rank relationships of dyads on the use of the BT display, indicated multi-contextual use of the BT that is contingent on the rank relationship. We also found that slight morphological and/or acoustic variants (i.e., silent bared-teeth and vocalized bared-teeth) of the BT display may have different communicative meanings. Our findings are in line with the prediction derived from the PAH for a moderately despotic species, and the view that the human smile originated from the primate BT display.
18
Clark PR, Waller BM, Agil M, Micheletta J. Crested macaque facial movements are more intense and stereotyped in potentially risky social interactions. Philos Trans R Soc Lond B Biol Sci 2022; 377:20210307. PMID: 35934960; PMCID: PMC9358315; DOI: 10.1098/rstb.2021.0307.
Abstract
Ambiguity in communicative signals may lead to misunderstandings and thus reduce the effectiveness of communication, especially in unpredictable interactions such as between closely matched rivals or those with a weak social bond. Therefore, signals used in these circumstances should be less ambiguous, more stereotyped and more intense. To test this prediction, we measured facial movements of crested macaques (Macaca nigra) during spontaneous social interaction, using the Facial Action Coding System for macaques (MaqFACS). We used linear mixed models to assess whether facial movement intensity and variability varied according to the interaction outcome, the individuals' dominance relationship and their social bond. Movements were least intense and most variable in affiliative contexts, and more intense in interactions between individuals who were closely matched in terms of dominance rating. We found no effect of social bond strength. Our findings provide evidence for a reduction in ambiguity of facial behaviour in risky social situations but do not demonstrate any mitigating effect of social relationship quality. The results indicate that the ability to modify communicative signals may play an important role in navigating complex primate social interactions. This article is part of the theme issue ‘Cognition, communication and social bonds in primates’.
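The linear mixed model described in this abstract can be illustrated with a short sketch: facial movement intensity modelled as a function of interaction context, dominance-rating difference and bond strength, with a random intercept per signaller. The column names and simulated data are assumptions, not the authors' dataset.

```python
# Illustrative linear mixed model sketch (assumed columns, simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
data = pd.DataFrame({
    "intensity": rng.normal(3, 1, n),        # facial movement intensity (assumed 1-5 scale)
    "context": rng.choice(["affiliative", "aggressive", "submissive"], n),
    "dominance_diff": rng.uniform(0, 1, n),   # how closely matched the dyad is in dominance
    "bond_strength": rng.uniform(0, 1, n),
    "signaller": rng.choice([f"id{i}" for i in range(20)], n),
})

model = smf.mixedlm(
    "intensity ~ C(context) + dominance_diff + bond_strength",
    data,
    groups=data["signaller"],                # random intercept for each signaller
).fit()
print(model.summary())
```

A parallel model with a variability measure as the response would correspond to the stereotypy analysis mentioned in the abstract.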
Affiliation(s)
- Peter R Clark, Evolution and Social Interaction Research Group, Nottingham Trent University, Nottingham NG1 4FQ, UK; Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth PO1 2UP, UK; Macaca Nigra Project, Tangkoko-Batuangus Nature Reserve, North Sulawesi, Indonesia
- Bridget M Waller, Evolution and Social Interaction Research Group, Nottingham Trent University, Nottingham NG1 4FQ, UK
- Muhammad Agil, Macaca Nigra Project, Tangkoko-Batuangus Nature Reserve, North Sulawesi, Indonesia; Faculty of Veterinary Medicine, Agricultural University of Bogor, Bogor, Jawa Barat 16680, Indonesia
- Jerome Micheletta, Centre for Comparative and Evolutionary Psychology, University of Portsmouth, Portsmouth PO1 2UP, UK; Macaca Nigra Project, Tangkoko-Batuangus Nature Reserve, North Sulawesi, Indonesia
19
Burza LB, Bloom T, Trindade PHE, Friedman H, Otta E. Reading emotions in Dogs' eyes and Dogs' faces. Behav Processes 2022; 202:104752. PMID: 36162604; DOI: 10.1016/j.beproc.2022.104752.
Abstract
Our primary goal was to investigate the human ability to recognize basic emotions from only the eyes of dogs, in comparison to the whole face. Simultaneously, we replicated and extended previous research (Bloom et al., 2021), while validating American canine emotional facial expression photographs cross-culturally in Brazil. Participants (N = 120) viewed behaviorally anchored photographs of three breeds. Half the participants in each condition (faces or eyes-only) viewed two-word forced-choice items while the other half viewed four-word forced-choice items. Participants identified target emotions from images of both dogs' faces and eyes-only at a higher rate than chance. Fear was recognized more accurately than the other emotions. When dogs are afraid, they open their eyes and expose the sclera, a conspicuous signal. Emotion identification accuracy was highest for the Rhodesian Ridgeback, which is similar in morphology to common Brazilian stray dogs (Vira-Latas Carmelo). We conjectured that Brazilians were more accustomed to seeing dogs with the Rhodesian Ridgeback morphology than the erect-eared breeds, thus increasing accuracy for this breed. Further studies with additional dog morphologies are desirable. In addition to its research interest, our Canine Eyes task has the potential to become a test of individual differences in Theory of Mind with clinical applications.
Affiliation(s)
- Laura Brochini Burza, Departamento de Psicologia Experimental, Instituto de Psicologia, Universidade de São Paulo, Brazil
- Pedro Henrique Esteves Trindade, Departamento de Cirurgia Veterinária e Reprodução Animal, Faculdade de Medicina Veterinária e Zootecnia, Universidade Estadual Paulista (UNESP), Botucatu, Brazil
- Emma Otta, Departamento de Psicologia Experimental, Instituto de Psicologia, Universidade de São Paulo, Brazil
20
Holler J. Visual bodily signals as core devices for coordinating minds in interaction. Philos Trans R Soc Lond B Biol Sci 2022; 377:20210094. PMID: 35876208; PMCID: PMC9310176; DOI: 10.1098/rstb.2021.0094.
Abstract
The view put forward here is that visual bodily signals play a core role in human communication and the coordination of minds. Critically, this role goes far beyond referential and propositional meaning. The human communication system that we consider to be the explanandum in the evolution of language thus is not spoken language. It is, instead, a deeply multimodal, multilayered, multifunctional system that developed, and survived, owing to the extraordinary flexibility and adaptability it endows us with. Beyond their undisputed iconic power, visual bodily signals (manual and head gestures, facial expressions, gaze, torso movements) fundamentally contribute to key pragmatic processes in modern human communication. This contribution becomes particularly evident with a focus that includes non-iconic manual signals, non-manual signals and signal combinations. Such a focus also needs to consider meaning encoded not just via iconic mappings, since kinematic modulations and interaction-bound meaning are additional properties equipping the body with striking pragmatic capacities. Some of these capacities, or their precursors, may have already been present in the last common ancestor we share with the great apes and may qualify as early versions of the components constituting the hypothesized interaction engine. This article is part of the theme issue 'Revisiting the human 'interaction engine': comparative approaches to social action coordination'.
Affiliation(s)
- Judith Holler, Max-Planck-Institut für Psycholinguistik, Nijmegen, The Netherlands; Donders Centre for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
21
Merkies K, Sudarenko Y, Hodder AJ. Can Ponies (Equus caballus) Distinguish Human Facial Expressions? Animals (Basel) 2022; 12:2331. PMID: 36139191; PMCID: PMC9495040; DOI: 10.3390/ani12182331.
Abstract
Communication within a species is essential for access to resources, alerting to dangers, group facilitation and social bonding; human facial expressions are considered to be an important factor in one’s ability to communicate with others. Evidence has shown that dogs and horses are able to distinguish positive and negative facial expressions by observing photographs of humans; however, there is currently no research on how facial expressions from a live human are perceived by horses. This study investigated how ponies distinguish facial expressions presented by live actors. Trained actors (n = 2), using the human Facial Action Coding System, displayed four facial expressions (anger, sadness, joy and neutral) individually to twenty ponies. Heart rate and behaviors of the ponies, including first monocular eye look, eye-look duration (right and left side bias) and latency to approach, were observed. A generalized linear mixed model (GLIMMIX) using Sidak’s multiple comparisons of least-squares means determined that, when exposed to anger expressions, ponies looked more often with their left eye first and, when exposed to joy, looked more often with their right eye first (p = 0.011). The ponies spent more time looking at angry expressions (p = 0.0003) in comparison to other expressions. There was no variation in heart rate across expressions (p > 0.89). Regardless of human facial expression, ponies looked longer (p = 0.0035), took longer to approach (p = 0.0297) and displayed more oral behaviours (p < 0.0001) with one actor than the other, indicating increased arousal or negative valence. Ponies with more experience as a lesson mount had lower heart rates (p < 0.0001), carried their heads lower (p < 0.0001), kept their left ear on the actor (p < 0.03) and exhibited more oral behaviours (p < 0.0001) than ponies with less experience. This study demonstrates that ponies are able to distinguish facial expressions presented by a live human, but other factors also contribute to their responses to humans.
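For readers unfamiliar with the Sidak adjustment used in the pairwise comparisons above, the short sketch below shows the generic calculation: with m comparisons, each test uses an adjusted alpha (or, equivalently, each p-value is inflated) so that the family-wise error rate stays at the nominal level. The p-values in the sketch are illustrative only and are not taken from the study.

```python
# Generic Sidak adjustment for m pairwise comparisons (a sketch, not the
# GLIMMIX output). The raw p-values below are invented for illustration.
from itertools import combinations

expressions = ["anger", "sadness", "joy", "neutral"]
m = len(list(combinations(expressions, 2)))      # 6 pairwise comparisons
alpha = 0.05
alpha_sidak = 1 - (1 - alpha) ** (1 / m)         # per-comparison threshold
print(f"{m} comparisons -> per-test alpha = {alpha_sidak:.4f}")

# Equivalent view: inflate each raw p-value as p_adj = 1 - (1 - p)**m.
raw_p = [0.012, 0.001, 0.20, 0.60, 0.04, 0.30]
print([round(1 - (1 - p) ** m, 4) for p in raw_p])
```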
Collapse
Affiliation(s)
- Katrina Merkies
- Department of Animal Bioscience, University of Guelph, Guelph, ON N1G 2W1, Canada
- Campbell Centre for the Study of Animal Reproduction, University of Guelph, Guelph, ON N1G 2W1, Canada
| | - Yuliia Sudarenko
- Department of Animal Bioscience, University of Guelph, Guelph, ON N1G 2W1, Canada
| | - Abigail J. Hodder
- Department of Animal Bioscience, University of Guelph, Guelph, ON N1G 2W1, Canada
- Campbell Centre for the Study of Animal Reproduction, University of Guelph, Guelph, ON N1G 2W1, Canada
| |
Collapse
|
22
|
Leconstant C, Spitz E. Integrative Model of Human-Animal Interactions: A One Health-One Welfare Systemic Approach to Studying HAI. Front Vet Sci 2022; 9:656833. [PMID: 35968006 PMCID: PMC9372562 DOI: 10.3389/fvets.2022.656833] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2021] [Accepted: 06/14/2022] [Indexed: 11/30/2022] Open
Abstract
The Integrative Model of Human-Animal Interactions (IMHAI) described here provides a conceptual framework for the study of interspecies interactions and aims to model the primary emotional processes involved in human-animal interactions. This model was developed from theoretical inputs from three fundamental disciplines for understanding interspecies interactions: neuroscience, psychology and ethology, with the objective of providing a transdisciplinary approach on which field professionals and researchers can build and collaborate. Seminal works in affective neuroscience offer a common basis between humans and animals and, as such, can be applied to the study of interspecies interactions from a One Health-One Welfare perspective. On the one hand, Jaak Panksepp's research revealed that primary/basic emotions originate in the deep subcortical regions of the brain and are shared by all mammals, including humans. On the other hand, several works in the field of neuroscience show that the basic physiological state is largely determined by the perception of safety. Thus, emotional expression reflects the state of an individual's permanent adaptation to ever-changing environmental demands. Based on this evidence and on over 5 years of action research using grounded theory, alternating between research and practice, the IMHAI proposes a systemic approach to the study of primary-process emotional affects during interspecies social interactions, through the processes of emotional transfer, embodied communication and interactive emotional regulation. The IMHAI aims to generate new hypotheses and predictions on affective behavior and interspecies communication. Application of such a model should promote risk prevention and the establishment of positive links between humans and animals, thereby contributing to their respective wellbeing.
Collapse
|
23
|
Dairy 4.0: Intelligent Communication Ecosystem for the Cattle Animal Welfare with Blockchain and IoT Enabled Technologies. APPLIED SCIENCES-BASEL 2022. [DOI: 10.3390/app12147316] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/10/2022]
Abstract
An intelligent ecosystem with real-time wireless technology is now playing a key role in meeting the sustainability requirements set by the United Nations. Dairy cattle are a major source of milk production all over the world. To meet the food demand of the growing population with maximum productivity, it is necessary for dairy farmers to adopt real-time monitoring technologies. In this study, we explore and assimilate the wide-ranging possibilities for technological interventions in dairy cattle that could drastically improve their ecosystem. Intelligent systems for sensing and monitoring, and methods of analysis for applications such as animal health monitoring, animal location tracking, milk quality and supply chain, and feed monitoring and safety, are discussed briefly. Furthermore, a generalized architecture is proposed that can be directly applied in the future for breakthroughs in research and development linked to data gathering and the processing of applications through edge devices, robots, drones, and blockchain for building intelligent ecosystems. In addition, the article discusses the possibilities and challenges of implementing previous techniques for different activities in dairy cattle. Wearable devices with high computing power, renewable energy harvesting, drone-based detection of attacks by aggressive animals, and blockchain with IoT-assisted systems for the milk supply chain are the key recommendations addressed in this study for the effective implementation of the intelligent ecosystem in dairy cattle.
Collapse
|
24
|
Kavanagh E, Kimock C, Whitehouse J, Micheletta J, Waller BM. Revisiting Darwin's comparisons between human and non-human primate facial signals. EVOLUTIONARY HUMAN SCIENCES 2022; 4:e27. [PMID: 35821665 PMCID: PMC7613043 DOI: 10.1017/ehs.2022.26] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022] Open
Abstract
Darwin and other pioneering scholars made comparisons between human facial signals and those of non-human primates, suggesting they share evolutionary history. We now have tools available (Facial Action Coding System: FACS) to make these comparisons anatomically based and standardised, as well as analytical methods to facilitate comparative studies. Here we review the evidence establishing a shared anatomical basis between the facial behaviour of human and non-human primate species, concluding which signals are likely related, and which are not. We then review the evidence for shared function and discuss the implications for understanding human communication. Where differences between humans and other species exist, we explore possible explanations and future directions for enquiry.
Collapse
Affiliation(s)
- Eithne Kavanagh
- Department of Psychology, Nottingham Trent University, Nottingham, UK
| | - Clare Kimock
- Department of Psychology, Nottingham Trent University, Nottingham, UK
| | - Jamie Whitehouse
- Department of Psychology, Nottingham Trent University, Nottingham, UK
| | | | - Bridget M. Waller
- Department of Psychology, Nottingham Trent University, Nottingham, UK
| |
Collapse
|
25
|
Feighelstein M, Shimshoni I, Finka LR, Luna SPL, Mills DS, Zamansky A. Automated recognition of pain in cats. Sci Rep 2022; 12:9575. [PMID: 35688852 PMCID: PMC9187730 DOI: 10.1038/s41598-022-13348-1] [Citation(s) in RCA: 11] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2022] [Accepted: 05/23/2022] [Indexed: 11/09/2022] Open
Abstract
Facial expressions in non-human animals are closely linked to their internal affective states, with the majority of empirical work focusing on facial shape changes associated with pain. However, existing tools for facial expression analysis are prone to human subjectivity and bias, and in many cases also require special expertise and training. This paper presents the first comparative study of two different paths towards automating pain recognition in facial images of domestic short-haired cats (n = 29), captured during ovariohysterectomy at different time points corresponding to varying intensities of pain. One approach is based on convolutional neural networks (ResNet50), while the other is based on machine learning models using geometric landmark analysis inspired by species-specific Facial Action Coding Systems (i.e. catFACS). Both types of approach reach comparable accuracies above 72%, indicating their potential usefulness as a basis for automating cat pain detection from images.
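To make the CNN route concrete, here is a minimal transfer-learning sketch of the kind of pipeline the abstract describes (this is not the authors' published code; the dataset folder layout and hyperparameters are placeholders):

```python
# Minimal sketch: fine-tune a ResNet50 backbone for binary pain/no-pain
# classification of cat face crops. Paths and hyperparameters are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("cat_faces/train", transform=transform)  # pain/ vs no_pain/
loader = DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # replace the ImageNet head with 2 classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The landmark-based route would instead feed catFACS-inspired geometric distances and angles into a classical classifier such as an SVM.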
Collapse
Affiliation(s)
| | - Ilan Shimshoni
- Information Systems Department, University of Haifa, Haifa, Israel
| | - Lauren R Finka
- School of Veterinary Medicine and Science, The University of Nottingham, Nottingham, UK
| | - Stelio P L Luna
- Department of Veterinary Surgery and Animal Reproduction, School of Veterinary Medicine and Animal Science, São Paulo State University (Unesp), Botucatu, São Paulo, Brazil
| | - Daniel S Mills
- School of Life Sciences, Joseph Bank Laboratories, University of Lincoln, Lincoln, UK
| | - Anna Zamansky
- Information Systems Department, University of Haifa, Haifa, Israel.
| |
Collapse
|
26
|
Dollion N, Grandgeorge M, Saint-Amour D, Hosein Poitras Loewen A, François N, Fontaine NMG, Champagne N, Plusquellec P. Emotion Facial Processing in Children With Autism Spectrum Disorder: A Pilot Study of the Impact of Service Dogs. Front Psychol 2022; 13:869452. [PMID: 35668968 PMCID: PMC9165718 DOI: 10.3389/fpsyg.2022.869452] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/07/2022] [Accepted: 03/31/2022] [Indexed: 12/04/2022] Open
Abstract
Processing and recognizing facial expressions are key factors in human social interaction. Past research suggests that individuals with autism spectrum disorder (ASD) have difficulty decoding facial expressions. Those difficulties are notably attributed to altered strategies in the visual scanning of expressive faces. Numerous studies have demonstrated the multiple benefits of exposure to pet dogs and service dogs on the interaction skills and psychosocial development of children with ASD. However, no study has investigated whether those benefits also extend to the processing of facial expressions. The aim of this study was to investigate whether having a service dog influences the facial expression processing skills of children with ASD. Two groups of 15 children with ASD, with and without a service dog, were compared using a facial expression recognition computer task while their eye movements were recorded with an eye-tracker. While the two groups did not differ in accuracy and reaction time, results showed that children with ASD who had a service dog directed less attention toward areas that were not relevant to facial expression processing. They also displayed a more differentiated scanning of relevant facial features according to the displayed emotion (i.e., they spent more time on the mouth for joy than for anger, and vice versa for the eye area). Results from the present study suggest that having a service dog and interacting with it on a daily basis may promote the development of specific visual exploration strategies for the processing of human faces.
Collapse
Affiliation(s)
- Nicolas Dollion
- Univ Rennes, Normandie Univ., CNRS, EthoS (Éthologie Animale et Humaine) - UMR 6552, Rennes, France.,Laboratoire d'Observation et d'Éthologie Humaine du Québec, Montréal Mental Health University Institute, Centre Intégré Universitaire de Santé et de Services Sociaux de l'Est-de-l'Île-de-Montréal (CIUSSS Est), Montréal, QC, Canada.,School of Psychoeducation, University of Montreal, Montréal, QC, Canada.,Mira Foundation Inc., Sainte-Madeleine, QC, Canada
| | - Marine Grandgeorge
- Univ Rennes, Normandie Univ., CNRS, EthoS (Éthologie Animale et Humaine) - UMR 6552, Rennes, France
| | - Dave Saint-Amour
- Department of Psychology, Centre de Recherche en Neuroscience Cognitives, NeuroQAM, Université du Quebec à Montréal, Montréal, QC, Canada
| | - Anthony Hosein Poitras Loewen
- Department of Psychology, Centre de Recherche en Neuroscience Cognitives, NeuroQAM, Université du Quebec à Montréal, Montréal, QC, Canada
| | | | - Nathalie M G Fontaine
- School of Criminology, Université de Montréal, Montréal, QC, Canada.,Centre Interdisciplinaire de Recherche sur le Cerveau et l'Apprentissage, University of Montréal, Montréal, QC, Canada
| | | | - Pierrich Plusquellec
- Laboratoire d'Observation et d'Éthologie Humaine du Québec, Montréal Mental Health University Institute, Centre Intégré Universitaire de Santé et de Services Sociaux de l'Est-de-l'Île-de-Montréal (CIUSSS Est), Montréal, QC, Canada.,School of Psychoeducation, University of Montreal, Montréal, QC, Canada.,Centre Interdisciplinaire de Recherche sur le Cerveau et l'Apprentissage, University of Montréal, Montréal, QC, Canada
| |
Collapse
|
27
|
Correia-Caeiro C, Burrows A, Wilson DA, Abdelrahman A, Miyabe-Nishiwaki T. CalliFACS: The common marmoset Facial Action Coding System. PLoS One 2022; 17:e0266442. [PMID: 35580128 PMCID: PMC9113598 DOI: 10.1371/journal.pone.0266442] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2021] [Accepted: 03/21/2022] [Indexed: 11/19/2022] Open
Abstract
Facial expressions are subtle cues, central for communication and conveying emotions in mammals. Traditionally, facial expressions have been classified as a whole (e.g. happy, angry, bared-teeth), due to automatic face processing in the human brain, i.e., humans categorise emotions globally, but are not aware of subtle or isolated cues such as an eyebrow raise. Moreover, the same facial configuration (e.g. lip corners pulled backwards exposing teeth) can convey widely different information depending on the species (e.g. humans: happiness; chimpanzees: fear). The Facial Action Coding System (FACS) is considered the gold standard for investigating human facial behaviour and avoids subjective interpretations of meaning by objectively measuring independent movements linked to facial muscles, called Action Units (AUs). Following a similar methodology, we developed the CalliFACS for the common marmoset. First, we determined the facial muscular plan of the common marmoset by examining dissections from the literature. Second, we recorded common marmosets in a variety of contexts (e.g. grooming, feeding, play, human interaction, veterinary procedures), and selected clips from online databases (e.g. YouTube) to identify their facial movements. Individual facial movements were classified according to appearance changes produced by the corresponding underlying musculature. A diverse repertoire of 33 facial movements was identified in the common marmoset (15 Action Units, 15 Action Descriptors and 3 Ear Action Descriptors). Although we observed a reduced range of facial movement compared with the human FACS, the common marmoset's range of facial movements was larger than predicted from their socio-ecology and facial morphology, which indicates the importance of these movements for social interactions. CalliFACS is a scientific tool to measure facial movements, and thus allows us to better understand the common marmoset's expressions and communication. As common marmosets have become increasingly popular laboratory animal models, from neuroscience to cognition, CalliFACS can be used as an important tool to evaluate their welfare, particularly in captivity.
Collapse
Affiliation(s)
| | - Anne Burrows
- Department of Physical Therapy, Duquesne University, Pittsburgh, Pennsylvania, United States of America
- Department of Anthropology, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
| | - Duncan Andrew Wilson
- Primate Research Institute, Kyoto University, Inuyama, Japan
- Graduate School of Letters, Kyoto University, Kyoto, Japan
| | - Abdelhady Abdelrahman
- School of Health and Life Sciences, Glasgow Caledonian University, Glasgow, United Kingdom
| | | |
Collapse
|
28
|
Inagaki M, Inoue KI, Tanabe S, Kimura K, Takada M, Fujita I. Rapid processing of threatening faces in the amygdala of nonhuman primates: subcortical inputs and dual roles. Cereb Cortex 2022; 33:895-915. [PMID: 35323915 PMCID: PMC9890477 DOI: 10.1093/cercor/bhac109] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2021] [Revised: 02/22/2022] [Accepted: 02/22/2022] [Indexed: 02/04/2023] Open
Abstract
A subcortical pathway through the superior colliculus and pulvinar has been proposed to provide the amygdala with rapid but coarse visual information about emotional faces. However, evidence for short-latency, facial expression-discriminating responses from individual amygdala neurons is lacking; even if such a response exists, how it might contribute to stimulus detection is unclear. Also, no definitive anatomical evidence is available for the assumed pathway. Here we showed that ensemble responses of amygdala neurons in monkeys carried robust information about open-mouthed, presumably threatening, faces within 50 ms after stimulus onset. This short-latency signal was not found in the visual cortex, suggesting a subcortical origin. Temporal analysis revealed that the early response contained excitatory and suppressive components. The excitatory component may be useful for sending rapid signals downstream, while the sharpening of the rising phase of later-arriving inputs (presumably from the cortex) by the suppressive component might improve the processing of facial expressions over time. Injection of a retrograde trans-synaptic tracer into the amygdala revealed presumed monosynaptic labeling in the pulvinar and disynaptic labeling in the superior colliculus, including the retinorecipient layers. We suggest that the early amygdala responses originating from the colliculo-pulvino-amygdalar pathway play dual roles in threat detection.
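One common way to test for such short-latency ensemble information is sliding-window population decoding. The sketch below illustrates that idea under stated assumptions (array layout and file names are hypothetical, and the authors' actual analysis may differ):

```python
# Minimal sketch: decode face category from pseudo-population spike counts in
# sliding windows, to ask when expression information first becomes available.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

spikes = np.load("amygdala_spike_counts.npy")  # trials x neurons x 1-ms time bins
labels = np.load("trial_labels.npy")           # 0 = neutral face, 1 = open-mouthed face

window, step = 25, 5  # ms
for start in range(0, spikes.shape[2] - window, step):
    X = spikes[:, :, start:start + window].sum(axis=2)  # spike counts per neuron in window
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5).mean()
    print(f"{start}-{start + window} ms after onset: decoding accuracy {acc:.2f}")
```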
Collapse
Affiliation(s)
- Mikio Inagaki
- Laboratory for Cognitive Neuroscience, Graduate School of Frontier Biosciences, Osaka University, 1-4 Yamadaoka, Suita, Osaka 565-0871, Japan,Center for Information and Neural Networks, National Institute of Information and Communications Technology and Osaka University, 1-4 Yamadaoka, Suita, Osaka 565-0871, Japan
| | - Ken-ichi Inoue
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, 41-2 Kanrin, Inuyama, Aichi 484-8506, Japan
| | - Soshi Tanabe
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, 41-2 Kanrin, Inuyama, Aichi 484-8506, Japan
| | - Kei Kimura
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, 41-2 Kanrin, Inuyama, Aichi 484-8506, Japan
| | - Masahiko Takada
- Systems Neuroscience Section, Primate Research Institute, Kyoto University, 41-2 Kanrin, Inuyama, Aichi 484-8506, Japan
| | - Ichiro Fujita
- Corresponding author: Laboratory for Cognitive Neuroscience, Graduate School of Frontier Biosciences, Osaka University, 1-4 Yamadaoka, Suita, Osaka 565-0871, Japan.
| |
Collapse
|
29
|
Kret ME, Massen JJM, de Waal FBM. My Fear Is Not, and Never Will Be, Your Fear: On Emotions and Feelings in Animals. AFFECTIVE SCIENCE 2022; 3:182-189. [PMID: 36042781 PMCID: PMC9382921 DOI: 10.1007/s42761-021-00099-x] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 05/19/2021] [Accepted: 12/17/2021] [Indexed: 11/26/2022]
Abstract
Do nonhuman animals (henceforth, animals) have emotions, and if so, are these similar to ours? This opinion piece aims to add to the recent debate about this question and provides a critical re-evaluation of what can be concluded about animal and human emotions. Emotions, and their cognitive interpretation, i.e., feelings, serve important survival functions. Emotions, we believe, can exist without feelings and unconsciously influence our behavior more than we think, and possibly more so than feelings do. Given that emotions are expressed in body and brain, they can be inferred from these measures. We view feelings primarily as private states, which may be similar across closely related species but remain mostly inaccessible to science. Still, combining data acquired through behavioral observation with data obtained from noninvasive techniques (e.g., eyetracking, thermography, hormonal samples) and from cognitive tasks (e.g., decision-making paradigms, cognitive bias, attentional bias) provides new information about the inner states of animals, and possibly about their feelings as well. Given that many other species show behavioral, neurophysiological, hormonal, and cognitive responses to valenced stimuli equivalent to human responses, it seems logical to speak of animal emotions and sometimes even of animal feelings. At the very least, the contemporary multi-method approach allows us to get closer than ever before. We conclude with recommendations on how the field should move forward.
Collapse
Affiliation(s)
- Mariska E. Kret
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, The Netherlands
- Comparative Psychology & Affective Neuroscience Lab, Cognitive Psychology Department, Leiden University, Leiden, The Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, The Netherlands
| | - Jorg J. M. Massen
- Animal Behaviour and Cognition, Department of Biology, Utrecht University, Utrecht, The Netherlands
| | - Frans B. M. de Waal
- Animal Behaviour and Cognition, Department of Biology, Utrecht University, Utrecht, The Netherlands
- Psychology Department, Emory University, Atlanta, GA USA
| |
Collapse
|
30
|
Waller BM, Kavanagh E, Micheletta J, Clark PR, Whitehouse J. The face is central to primate multicomponent signals. INT J PRIMATOL 2022. [DOI: 10.1007/s10764-021-00260-0] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
A wealth of experimental and observational evidence suggests that faces have become increasingly important in the communication system of primates over evolutionary time and that both the static and moveable aspects of faces convey considerable information. Therefore, whenever there is a visual component to any multicomponent signal the face is potentially relevant. However, the role of the face is not always considered in primate multicomponent communication research. We review the literature and make a case for greater focus on the face going forward. We propose that the face can be overlooked for two main reasons: first, due to methodological difficulty. Examination of multicomponent signals in primates is difficult, so scientists tend to examine a limited number of signals in combination. Detailed examination of the subtle and dynamic components of facial signals is particularly hard to achieve in studies of primates. Second, due to a common assumption that the face contains “emotional” content. A priori categorisation of facial behavior as “emotional” ignores the potentially communicative and predictive information present in the face that might contribute to signals. In short, we argue that the face is central to multicomponent signals (and also many multimodal signals) and suggest future directions for investigating this phenomenon.
Collapse
|
31
|
Escelsior A, Amadeo MB, Esposito D, Rosina A, Trabucco A, Inuggi A, Pereira da Silva B, Serafini G, Gori M, Amore M. COVID-19 and psychiatric disorders: The impact of face masks in emotion recognition face masks and emotion recognition in psychiatry. Front Psychiatry 2022; 13:932791. [PMID: 36238943 PMCID: PMC9551300 DOI: 10.3389/fpsyt.2022.932791] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/30/2022] [Accepted: 08/26/2022] [Indexed: 11/13/2022] Open
Abstract
Since the outbreak of the COVID-19 pandemic, reading facial expressions has become more complex due to face masks covering the lower part of people's faces. A history of psychiatric illness has been associated with higher rates of complications, hospitalization, and mortality due to COVID-19. Psychiatric patients have well-documented difficulties reading emotions from facial expressions; accordingly, this study assesses how using face masks, such as those worn for preventing COVID-19 transmission, impacts the emotion recognition skills of patients with psychiatric disorders. To this end, the current study asked patients with bipolar disorder, major depressive disorder, schizophrenia, and healthy individuals to identify facial emotions on face images with and without facial masks. Results demonstrate that the emotion recognition skills of all participants were negatively influenced by face masks. Moreover, the main insight of the study is that the impairment was particularly marked when patients with major depressive disorder and schizophrenia had to identify happiness at a low-intensity level. These findings have important implications for satisfactory social relationships and well-being. If emotions with positive valence are poorly recognized by particular groups of psychiatric patients, there is an even greater need for careful doctor-patient interactions in public primary care.
Collapse
Affiliation(s)
- Andrea Escelsior
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy.,Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,IRCCS Ospedale Policlinico San Martino, Genoa, Italy
| | - Maria Bianca Amadeo
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
| | - Davide Esposito
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
| | - Anna Rosina
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,IRCCS Ospedale Policlinico San Martino, Genoa, Italy
| | - Alice Trabucco
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy
| | - Alberto Inuggi
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy.,Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,IRCCS Ospedale Policlinico San Martino, Genoa, Italy
| | - Beatriz Pereira da Silva
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy.,Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,IRCCS Ospedale Policlinico San Martino, Genoa, Italy.,U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
| | - Gianluca Serafini
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy.,Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,IRCCS Ospedale Policlinico San Martino, Genoa, Italy
| | - Monica Gori
- Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
| | - Mario Amore
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica ed SPDC, Largo Rosanna Benzi, Genoa, Italy.,Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy.,IRCCS Ospedale Policlinico San Martino, Genoa, Italy
| |
Collapse
|
32
|
Terhürne P, Schwartz B, Baur T, Schiller D, Eberhardt ST, André E, Lutz W. Validation and application of the Non-Verbal Behavior Analyzer: An automated tool to assess non-verbal emotional expressions in psychotherapy. Front Psychiatry 2022; 13:1026015. [PMID: 36386975 PMCID: PMC9650367 DOI: 10.3389/fpsyt.2022.1026015] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/23/2022] [Accepted: 10/12/2022] [Indexed: 11/22/2022] Open
Abstract
BACKGROUND Emotions play a key role in psychotherapy. However, a problem with examining emotional states via self-report questionnaires is that the assessment usually takes place after the actual emotion has been experienced, which can introduce bias, and continuous human ratings are time- and cost-intensive. Using the AI-based software package Non-Verbal Behavior Analyzer (NOVA), video-based emotion recognition of arousal and valence can be applied in naturalistic psychotherapeutic settings. In this study, four emotion recognition models (ERMs), each based on a specific feature set (facial: OpenFace, OpenFace-Aureg; body: OpenPose-Activation, OpenPose-Energy), were developed and compared in their ability to predict arousal and valence scores, which were correlated with PANAS emotion scores, processes of change (interpersonal experience, coping experience, affective experience) and symptoms (depression and anxiety on the HSCL-11). MATERIALS AND METHODS A total of 183 patient therapy videos were divided into a training sample (55 patients), a test sample (50 patients), and a holdout sample (78 patients). The best ERM was selected for further analyses. Then, ERM-based arousal and valence scores were correlated with patient and therapist estimates of emotions and processes of change. Furthermore, regression models were used to examine arousal and valence as predictors of symptom severity in depression and anxiety. RESULTS The ERM based on OpenFace produced the best agreement with the human coder ratings. Arousal and valence correlated significantly with therapists' ratings of sadness, shame, anxiety, and relaxation, but not with patients' ratings of their own emotions. Furthermore, a significant negative correlation indicated that negative valence was associated with higher affective experience. Negative valence was found to significantly predict higher anxiety but not depression scores. CONCLUSION This study shows that emotion recognition with NOVA can be used to generate ERMs associated with patient emotions, affective experiences and symptoms. Nevertheless, limitations remain. It seems necessary to improve the ERMs using larger databases of sessions, and the validity of ERMs needs to be further investigated in different samples and applications. Furthermore, future research should consider using ERMs to identify emotional synchrony between patients and therapists.
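The symptom-prediction step can be illustrated with a minimal regression sketch (hypothetical column names; this is not the NOVA pipeline itself):

```python
# Minimal sketch: relate session-level valence/arousal estimates to symptom
# scores with ordinary least squares. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sessions.csv")  # columns: patient_id, valence, arousal, hscl_anxiety, hscl_depression

anxiety_model = smf.ols("hscl_anxiety ~ valence + arousal", data=df).fit()
depression_model = smf.ols("hscl_depression ~ valence + arousal", data=df).fit()
print(anxiety_model.summary())       # coefficient on valence tests the reported effect
print(depression_model.params)
```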
Collapse
Affiliation(s)
- Patrick Terhürne
- Clinical Psychology and Psychotherapy, University of Trier, Trier, Germany
| | - Brian Schwartz
- Clinical Psychology and Psychotherapy, University of Trier, Trier, Germany
| | - Tobias Baur
- Chair for Human Centered Artificial Intelligence, Augsburg University, Augsburg, Germany
| | - Dominik Schiller
- Chair for Human Centered Artificial Intelligence, Augsburg University, Augsburg, Germany
| | | | - Elisabeth André
- Chair for Human Centered Artificial Intelligence, Augsburg University, Augsburg, Germany
| | - Wolfgang Lutz
- Clinical Psychology and Psychotherapy, University of Trier, Trier, Germany
| |
Collapse
|
33
|
Automatic Recognition of Macaque Facial Expressions for Detection of Affective States. eNeuro 2021; 8:ENEURO.0117-21.2021. [PMID: 34799408 PMCID: PMC8664380 DOI: 10.1523/eneuro.0117-21.2021] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/21/2021] [Revised: 08/28/2021] [Accepted: 11/10/2021] [Indexed: 11/21/2022] Open
Abstract
Internal affective states produce external manifestations such as facial expressions. In humans, the Facial Action Coding System (FACS) is widely used to objectively quantify the elemental facial action units (AUs) that build complex facial expressions. A similar system has been developed for macaque monkeys, the Macaque FACS (MaqFACS); yet, unlike its human counterpart, which has already been partially replaced by automatic algorithms, this system still requires labor-intensive manual coding. Here, we developed and implemented the first prototype for automatic MaqFACS coding. We applied the approach to the analysis of behavioral and neural data recorded from freely interacting macaque monkeys. The method achieved high performance in the recognition of six dominant AUs, generalizing between conspecific individuals (Macaca mulatta) and even between species (Macaca fascicularis). The study lays the foundation for fully automated detection of facial expressions in animals, which is crucial for investigating the neural substrates of social and affective states.
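The cross-individual generalization test described above can be approximated with group-aware cross-validation, as in this minimal sketch (the feature extraction step is assumed; this is not the published prototype):

```python
# Minimal sketch: train an AU classifier on some individuals and test on unseen
# ones, so accuracy reflects generalization across monkeys rather than memorization.
import numpy as np
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.svm import SVC

X = np.load("frame_features.npy")       # frames x features (e.g., landmark geometry)
y = np.load("frame_au_labels.npy")      # dominant AU label per frame
individuals = np.load("frame_monkey_id.npy")

cv = GroupKFold(n_splits=5)  # each fold holds out whole individuals
scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=cv, groups=individuals)
print("Held-out-individual accuracy:", scores.mean())
```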
Collapse
|
34
|
Abstract
Understanding facial signals in humans and other species is crucial for understanding the evolution, complexity, and function of the face as a communication tool. The Facial Action Coding System (FACS) enables researchers to measure facial movements accurately, but we currently lack tools to reliably analyse data and efficiently communicate results. Network analysis can provide a way to use the information encoded in FACS datasets: by treating individual AUs (the smallest units of facial movements) as nodes in a network and their co-occurrence as connections, we can analyse and visualise differences in the use of combinations of AUs in different conditions. Here, we present ‘NetFACS’, a statistical package that uses occurrence probabilities and resampling methods to answer questions about the use of AUs, AU combinations, and the facial communication system as a whole in humans and non-human animals. Using highly stereotyped facial signals as an example, we illustrate some of the current functionalities of NetFACS. We show that very few AUs are specific to certain stereotypical contexts; that AUs are not used independently from each other; that graph-level properties of stereotypical signals differ; and that clusters of AUs allow us to reconstruct facial signals, even when blind to the underlying conditions. The flexibility and widespread use of network analysis allows us to move away from studying facial signals as stereotyped expressions, and towards a dynamic and differentiated approach to facial communication.
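The core NetFACS idea, treating AUs as nodes, their co-occurrence as edges, and benchmarking observed co-occurrence against resampled data, can be illustrated with a short sketch (an illustrative Python analogue only, not the NetFACS package itself; the data layout is assumed):

```python
# Minimal sketch: build an AU co-occurrence network and test each AU pair
# against a permutation null that preserves each AU's overall rate.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
events = np.load("au_events.npy").astype(int)  # events x AUs, binary presence/absence
au_names = [f"AU{i}" for i in range(events.shape[1])]

observed = events.T @ events                   # AU-by-AU co-occurrence counts
n_perm = 1000
exceed = np.zeros_like(observed, dtype=float)
for _ in range(n_perm):
    shuffled = np.array([rng.permutation(col) for col in events.T]).T  # break AU coupling
    exceed += (shuffled.T @ shuffled) >= observed
p_null = exceed / n_perm                       # small p: AUs co-occur more than chance

G = nx.Graph()
for i in range(len(au_names)):
    for j in range(i + 1, len(au_names)):
        if p_null[i, j] < 0.05 and observed[i, j] > 0:
            G.add_edge(au_names[i], au_names[j], weight=int(observed[i, j]))
print("Above-chance AU pairs:", list(G.edges(data=True)))
```

Graph-level measures (density, clustering, modularity) computed on such networks are what allow signals from different conditions to be compared, as the abstract describes.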
Collapse
|
35
|
Llewelyn H, Kiddie J. Can a facial action coding system (CatFACS) be used to determine the welfare state of cats with cerebellar hypoplasia? Vet Rec 2021; 190:e1079. [PMID: 34723388 DOI: 10.1002/vetr.1079] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2021] [Revised: 09/01/2021] [Accepted: 10/17/2021] [Indexed: 11/11/2022]
Abstract
BACKGROUND The impaired motor skills of cats living with cerebellar hypoplasia (CH) suggest they may be unable to express normal behaviour, one of the five welfare needs. This study aimed to explore the use of a facial action coding system (CatFACS) as a welfare assessment tool for cats with CH. METHODS Facial expressions (action units [AUs]) were defined as neutral/positive or negative by recording healthy cats (n = 89) during presumed aversive or relaxed scenarios. CH cats (n = 33) were then filmed and their facial expressions compared to those of the presumed positively- and negatively-valenced healthy cats. RESULTS Sixteen negative AUs were defined. CH cats performed more of these than healthy cats (p = 0.023) in the relaxed scenario. There was no difference in AU expression between the three levels of CH severity (mild, moderate or severe) (p = 0.461). CONCLUSION Cats perform distinct AUs when experiencing negatively valenced arousal; the presence or absence of these AUs could be used to infer the welfare of healthy and CH cats. As there was no difference in AU expression between the three levels of CH severity, the behavioural restrictions CH imposes on cats do not necessarily indicate lower welfare, and the reasons why CH cats perform more negatively associated AUs warrant further research.
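A group comparison of this kind could be run along the following lines (a minimal sketch with a hypothetical data layout; the specific test used by the authors may differ):

```python
# Minimal sketch: compare counts of negatively associated AUs between healthy
# and CH cats in the relaxed scenario with a non-parametric test.
import pandas as pd
from scipy.stats import mannwhitneyu

df = pd.read_csv("cat_au_counts.csv")  # columns: cat_id, group ('healthy'/'CH'), scenario, negative_au_count
relaxed = df[df["scenario"] == "relaxed"]

u, p = mannwhitneyu(
    relaxed.loc[relaxed["group"] == "CH", "negative_au_count"],
    relaxed.loc[relaxed["group"] == "healthy", "negative_au_count"],
    alternative="greater",  # CH cats expected to show more negative AUs
)
print(f"U = {u:.1f}, p = {p:.3f}")
```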
Collapse
Affiliation(s)
- Helen Llewelyn
- Department of Biology, School of Life Science, Anglia Ruskin University, Cambridge, UK
| | - Jenna Kiddie
- Department of Biology, School of Life Science, Anglia Ruskin University, Cambridge, UK.,Institute of Science, Natural Resources and Outdoor Studies, University of Cumbria, Carlisle, UK
| |
Collapse
|
36
|
Jarvis S, Ellis MA, Turnbull JF, Rey Planellas S, Wemelsfelder F. Qualitative Behavioral Assessment in Juvenile Farmed Atlantic Salmon ( Salmo salar): Potential for On-Farm Welfare Assessment. Front Vet Sci 2021; 8:702783. [PMID: 34557541 PMCID: PMC8453064 DOI: 10.3389/fvets.2021.702783] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2021] [Accepted: 08/12/2021] [Indexed: 11/26/2022] Open
Abstract
There is a growing scientific and legislative consensus that fish are sentient, and therefore have the capacity to experience pain and suffering. The assessment of the welfare of farmed fish is challenging due to the aquatic environment and the number of animals housed together. However, with increasing global production and intensification of aquaculture comes greater impetus for developing effective tools, suitable for the aquatic environment, to assess the emotional experience and welfare of farmed fish. This study therefore aimed to investigate the use of Qualitative Behavioral Assessment (QBA), originally developed for terrestrial farmed animals, in farmed salmon and evaluate its potential for use as a welfare monitoring tool. QBA is a “whole animal” approach based on the description and quantification of the expressive qualities of an animal's dynamic style of behaving, using descriptors such as relaxed, agitated, lethargic, or confident. A list of 20 qualitative descriptors was generated by fish farmers after viewing video footage showing behavioural expressions representative of the full repertoire of salmon in this context. A separate, non-experienced group of 10 observers subsequently watched 25 video clips of farmed salmon, and scored the 20 descriptors for each clip using a Visual Analog Scale (VAS). To assess intra-observer reliability, each observer viewed the same 25 video clips twice, in two sessions 10 days apart, with the second clip set presented in a different order. The observers were unaware that the two sets of video clips were identical. Data were analyzed using Principal Component (PC) Analysis (correlation matrix, no rotation), revealing four dimensions that together explained 79% of the variation between video clips, with PC1 (tense/anxious/skittish vs. calm/mellow/relaxed) explaining the greatest percentage of variation (56%). PC1 was the only dimension to show acceptable inter- and intra-observer reliability, and mean PC1 scores correlated significantly with durations of slow and erratic physical movements measured for the same 25 video clips. Further refinements to the methodology may be necessary, but this study is the first to provide evidence for the potential of Qualitative Behavioral Assessment to serve as a time-efficient welfare assessment tool for juvenile salmon under farmed conditions.
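The unrotated, correlation-matrix PCA described above corresponds to standardising the descriptor scores before extracting components. A minimal sketch, with hypothetical file names and not the study's analysis script:

```python
# Minimal sketch: PCA on the clip x descriptor matrix of mean VAS scores, plus
# a simple consistency check of PC1 scores between the two scoring sessions.
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

scores_s1 = np.load("qba_session1.npy")  # clips x descriptors, averaged over observers
scores_s2 = np.load("qba_session2.npy")

scaler = StandardScaler().fit(scores_s1)  # standardising = correlation-matrix PCA
pca = PCA()
pc_s1 = pca.fit_transform(scaler.transform(scores_s1))
print("Variance explained by first PCs:", np.round(pca.explained_variance_ratio_[:4], 2))

# Project the repeated clips into the same component space and correlate PC1 scores.
pc_s2 = pca.transform(scaler.transform(scores_s2))
rho, p = spearmanr(pc_s1[:, 0], pc_s2[:, 0])
print(f"PC1 session-to-session consistency: rho = {rho:.2f}, p = {p:.3f}")
```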
Collapse
Affiliation(s)
- Susan Jarvis
- The Global Academy of Agriculture and Food Security, University of Edinburgh, Edinburgh, United Kingdom
| | - Maureen A Ellis
- Institute of Aquaculture, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
| | - James F Turnbull
- Institute of Aquaculture, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
| | - Sonia Rey Planellas
- Institute of Aquaculture, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
| | - Francoise Wemelsfelder
- Animal and Veterinary Sciences, SRUC (Scotland's Rural College), Edinburgh, United Kingdom
| |
Collapse
|
37
|
Florkiewicz B, Campbell M. Chimpanzee facial gestures and the implications for the evolution of language. PeerJ 2021. [DOI: 10.7717/peerj.12237] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
Great ape manual gestures are described as communicative, flexible, intentional, and goal-oriented. These gestures are thought to be an evolutionary precursor to human language. Conversely, facial expressions are thought to be inflexible, automatic, and derived from emotion. However, great apes can make a wide range of movements with their faces, and they may possess the control needed to gesture with their faces as well as their hands. We examined whether chimpanzee facial expressions possess the four important gesture properties and how they compare to manual gestures. To do this, we quantified variables that had previously been described through largely qualitative means. Chimpanzee facial expressions met all four gesture criteria and performed remarkably similarly to manual gestures. Facial gestures have implications for the evolution of language. If other mammals also show facial gestures, then the gestural origins of language may be much older than the human/great ape lineage.
Collapse
Affiliation(s)
- Brittany Florkiewicz
- Department of Anthropology, University of California, Los Angeles, Los Angeles, CA, United States of America
| | - Matthew Campbell
- Department of Psychology, California State University, Channel Islands, Camarillo, CA, United States of America
| |
Collapse
|
38
|
Alais D, Xu Y, Wardle SG, Taubert J. A shared mechanism for facial expression in human faces and face pareidolia. Proc Biol Sci 2021; 288:20210966. [PMID: 34229489 PMCID: PMC8261219 DOI: 10.1098/rspb.2021.0966] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
Abstract
Facial expressions are vital for social communication, yet the underlying mechanisms are still being discovered. Illusory faces perceived in objects (face pareidolia) are errors of face detection that share some neural mechanisms with human face processing. However, it is unknown whether expression in illusory faces engages the same mechanisms as human faces. Here, using a serial dependence paradigm, we investigated whether illusory and human faces share a common expression mechanism. First, we found that images of face pareidolia are reliably rated for expression, within and between observers, despite varying greatly in visual features. Second, they exhibit positive serial dependence for perceived facial expression, meaning an illusory face (happy or angry) is perceived as more similar in expression to the preceding one, just as seen for human faces. This suggests illusory and human faces engage similar mechanisms of temporal continuity. Third, we found robust cross-domain serial dependence of perceived expression between illusory and human faces when they were interleaved, with serial effects larger when illusory faces preceded human faces than the reverse. Together, the results support a shared mechanism for facial expression between human faces and illusory faces and suggest that expression processing is not tightly bound to human facial features.
Collapse
Affiliation(s)
- David Alais
- School of Psychology, The University of Sydney, Sydney, New South Wales, Australia
| | - Yiben Xu
- School of Psychology, The University of Sydney, Sydney, New South Wales, Australia
| | - Susan G Wardle
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
| | - Jessica Taubert
- Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
| |
Collapse
|
39
|
Correia-Caeiro C, Guo K, Mills D. Bodily emotional expressions are a primary source of information for dogs, but not for humans. Anim Cogn 2021; 24:267-279. [PMID: 33507407 PMCID: PMC8035094 DOI: 10.1007/s10071-021-01471-x] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/25/2020] [Revised: 12/22/2020] [Accepted: 01/02/2021] [Indexed: 11/26/2022]
Abstract
Dogs have remarkable abilities to synergise their behaviour with that of people, but how dogs read facial and bodily emotional cues in comparison to humans remains unclear. Both species share the same ecological niche, are highly social and expressive, making them an ideal comparative model for intra- and inter-species emotion perception. We compared eye-tracking data from unrestrained humans and dogs when viewing dynamic and naturalistic emotional expressions in humans and dogs. Dogs attended more to the body than the head of human and dog figures, unlike humans who focused more on the head of both species. Dogs and humans also showed a clear age effect that reduced head gaze. Our results indicate a species-specific evolutionary adaptation for emotion perception, which is only partly modified for heterospecific cues. These results have important implications for managing the risk associated with human-dog interactions, where expressive and perceptual differences are crucial.
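The head-versus-body comparison reduces to dwell-time proportions per area of interest. A minimal sketch (hypothetical fixation table; not the authors' pipeline):

```python
# Minimal sketch: proportion of dwell time on head vs body areas of interest,
# split by viewer species and stimulus species.
import pandas as pd

fix = pd.read_csv("fixations.csv")  # columns: viewer_species, stimulus_species, aoi ('head'/'body'/'other'), duration_ms

dwell = (
    fix.groupby(["viewer_species", "stimulus_species", "aoi"])["duration_ms"]
    .sum()
    .unstack("aoi", fill_value=0)
)
dwell["head_bias"] = dwell["head"] / (dwell["head"] + dwell["body"])
print(dwell[["head_bias"]])  # expect > 0.5 for human viewers, < 0.5 for dog viewers
```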
Collapse
Affiliation(s)
- Catia Correia-Caeiro
- School of Psychology, University of Lincoln, Lincoln, UK.
- School of Life Sciences, University of Lincoln, Lincoln, UK.
- Primate Research Institute, Kyoto University, Inuyama, Japan.
| | - Kun Guo
- School of Psychology, University of Lincoln, Lincoln, UK
| | - Daniel Mills
- School of Life Sciences, University of Lincoln, Lincoln, UK
| |
Collapse
|
40
|
Measuring Farm Animal Emotions-Sensor-Based Approaches. SENSORS 2021; 21:s21020553. [PMID: 33466737 PMCID: PMC7830443 DOI: 10.3390/s21020553] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/14/2020] [Revised: 01/11/2021] [Accepted: 01/12/2021] [Indexed: 02/06/2023]
Abstract
Understanding animal emotions is a key to unlocking methods for improving animal welfare. Currently, there are no ‘benchmarks’ or scientific assessments available for measuring and quantifying the emotional responses of farm animals. Using sensors to collect biometric data as a means of measuring animal emotions is a topic of growing interest in agricultural technology. Here we reviewed several aspects of the use of sensor-based approaches in monitoring animal emotions, beginning with an introduction to animal emotions. We then reviewed some of the available technological systems for analyzing animal emotions. These systems include a variety of sensors, the algorithms used to process the biometric data they collect, and facial expression and sound analysis. We conclude that a single measure of emotional expression, based either on the facial features of animals or on physiological functions alone, cannot accurately capture a farm animal’s emotional changes; hence, compound expression recognition is required. We propose some novel ways to combine sensor technologies through sensor fusion into efficient systems for monitoring and measuring animals’ compound expressions of emotion. Finally, we explore future perspectives in the field, including challenges and opportunities.
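One straightforward form of the proposed sensor fusion is feature-level fusion, where channels are concatenated before classification. A purely illustrative sketch (sensor channels and labels are hypothetical):

```python
# Minimal sketch: concatenate physiological and facial features and train a
# single classifier on the combined feature vector (feature-level fusion).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

physio = np.load("physio_features.npy")   # e.g., heart rate variability, temperature
facial = np.load("facial_features.npy")   # e.g., ear/eye posture descriptors per time window
labels = np.load("valence_labels.npy")    # e.g., 0 = negative context, 1 = positive context

X = np.hstack([physio, facial])           # feature-level fusion
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```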
Collapse
|
41
|
Correia-Caeiro C, Holmes K, Miyabe-Nishiwaki T. Extending the MaqFACS to measure facial movement in Japanese macaques (Macaca fuscata) reveals a wide repertoire potential. PLoS One 2021; 16:e0245117. [PMID: 33411716 PMCID: PMC7790396 DOI: 10.1371/journal.pone.0245117] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/29/2020] [Accepted: 12/23/2020] [Indexed: 02/01/2023] Open
Abstract
Facial expressions are complex and subtle signals, central for communication and emotion in social mammals. Traditionally, facial expressions have been classified as a whole, disregarding small but relevant differences in displays. Even with the same morphological configuration, different information can be conveyed depending on the species. Due to hardwired processing of faces in the human brain, humans are quick to attribute emotion but have difficulty registering facial movement units. The well-known human FACS (Facial Action Coding System) is the gold standard for objectively measuring facial expressions, and can be adapted through anatomical investigation and functional homologies for systematic cross-species comparisons. Here we aimed to develop a FACS for Japanese macaques, following established FACS methodology: first, we considered the species' muscular facial plan; second, we ascertained functional homologies with other primate species; and finally, we categorised each independent facial movement into Action Units (AUs). Due to similarities in the facial musculature of rhesus and Japanese macaques, the MaqFACS (previously developed for rhesus macaques) was used as a basis to extend the FACS tool to Japanese macaques, while highlighting the morphological and appearance-change differences between the two species. We documented 19 AUs, 15 Action Descriptors (ADs) and 3 Ear Action Units (EAUs) in Japanese macaques, with all MaqFACS movements also found in Japanese macaques. New movements were also observed, indicating a slightly larger repertoire than in rhesus or Barbary macaques. The MaqFACS extension for Japanese macaques reported here, used together with the MaqFACS, provides a valuable objective tool for the systematic and standardised analysis of facial expressions in Japanese macaques. It will now allow investigation of the evolution of communication and emotion in primates, as well as contribute to improving the welfare of individuals, particularly in captive and laboratory settings.
Collapse
Affiliation(s)
| | - Kathryn Holmes
- School of Psychology, University of Lincoln, Lincoln, Lincolnshire, United Kingdom
| | | |
Collapse
|
42
|
Steinmair D, Löffler-Stastka H. The Emerging Role of Interdisciplinarity in Clinical Psychoanalysis. Front Psychol 2021; 12:659429. [PMID: 34025523 PMCID: PMC8131672 DOI: 10.3389/fpsyg.2021.659429] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/27/2021] [Accepted: 04/12/2021] [Indexed: 02/05/2023] Open
Abstract
Given the tight interconnections proposed between brain and psyche, psychoanalysis was conceptualized as an interdisciplinary theory right from the beginning. The diversification of knowledge across the different science and technology fields concerned with the same matter (explaining mind and brain, and connecting them) makes this interdisciplinarity even more visible. This challenges the integrative potential of psychoanalytic meta-theory.
Collapse
Affiliation(s)
- Dagmar Steinmair
- Department of Psychoanalysis and Psychotherapy, Medical University of Vienna, Vienna, Austria
- Karl Landsteiner Private University for Health Sciences, Krems an der Donau, Austria
| | - Henriette Löffler-Stastka
- Department of Psychoanalysis and Psychotherapy, Medical University of Vienna, Vienna, Austria
- Correspondence: Henriette Löffler-Stastka
| |
Collapse
|
43
|
Taubert J, Japee S. Using FACS to trace the neural specializations underlying the recognition of facial expressions: A commentary on Waller et al. (2020). Neurosci Biobehav Rev 2020; 120:75-77. [PMID: 33227326 DOI: 10.1016/j.neubiorev.2020.10.016] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/21/2020] [Revised: 10/20/2020] [Accepted: 10/30/2020] [Indexed: 02/08/2023]
Abstract
In the recent review by Waller et al. (2020) the authors discuss how the Facial Action Coding System (FACS) can be used to study the evolution of facial behaviors. This is a timely and thought-provoking review which highlights the numerous ways in which FACS could be used to compare the mechanisms responsible for the production of facial behaviors across species. We propose that FACS could also be used to study the recognition of facial behaviors in nonhuman subjects where one of the key challenges is finding suitable stimuli that convey different emotions. By using FACS-rated images in awake neuroimaging experiments, researchers could accurately identify the brain mechanisms responsible for recognizing expressions across mammalian species. This approach would reveal neural homologs and deepen our understanding of how nonverbal social communication has evolved.
Collapse
Affiliation(s)
- Jessica Taubert
- The Laboratory of Brain and Cognition, The National Institute of Mental Health, United States.
| | - Shruti Japee
- The Laboratory of Brain and Cognition, The National Institute of Mental Health, United States
| |
Collapse
|
44
|
Burrows AM, Kaminski J, Waller BM, Omstead KM, Rogers-Vizena C, Mendelson B. Dog faces exhibit anatomical differences in comparison to other domestic animals. Anat Rec (Hoboken) 2020; 304:231-241. [PMID: 32969196 DOI: 10.1002/ar.24507] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2020] [Revised: 06/24/2020] [Accepted: 06/27/2020] [Indexed: 11/11/2022]
Affiliation(s)
- Anne M Burrows
- Department of Physical Therapy, Duquesne University, Pittsburgh, Pennsylvania, USA.,Department of Anthropology, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
| | - Juliane Kaminski
- Department of Psychology, University of Portsmouth, Portsmouth, UK
| | - Bridget M Waller
- Department of Psychology, University of Portsmouth, Portsmouth, UK
| | - Kailey M Omstead
- Department of Physical Therapy, Duquesne University, Pittsburgh, Pennsylvania, USA
| | - Carolyn Rogers-Vizena
- Department of Plastic & Oral Surgery, Boston Children's Hospital, Boston, Massachusetts, USA
| | - Bryan Mendelson
- The Centre for Facial Plastic Surgery, Melbourne, Victoria, Australia
| |
Collapse
|
45
|
Emotional expressions in human and non-human great apes. Neurosci Biobehav Rev 2020; 115:378-395. [DOI: 10.1016/j.neubiorev.2020.01.027] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2019] [Revised: 01/17/2020] [Accepted: 01/22/2020] [Indexed: 11/23/2022]
|