1
Feighelstein M, Riccie-Bonot C, Hasan H, Weinberg H, Rettig T, Segal M, Distelfeld T, Shimshoni I, Mills DS, Zamansky A. Automated recognition of emotional states of horses from facial expressions. PLoS One 2024; 19:e0302893. [PMID: 39008504] [PMCID: PMC11249218] [DOI: 10.1371/journal.pone.0302893]
Abstract
Animal affective computing is an emerging field that has so far focused mainly on pain, while other emotional states remain uncharted territory, especially in horses. This study is the first to develop AI models that automatically recognize horse emotional states from facial expressions, using data collected in a controlled experiment. We explore two types of pipeline: a deep learning pipeline that takes video footage as input, and a machine learning pipeline that takes EquiFACS annotations as input. The former outperforms the latter, reaching 76% accuracy in separating four emotional states: baseline, positive anticipation, disappointment and frustration. Anticipation and frustration were difficult to separate, with only 61% accuracy.
Affiliation(s)
- Hana Hasan
- Information Systems Department, University of Haifa, Haifa, Israel
- Hallel Weinberg
- Information Systems Department, University of Haifa, Haifa, Israel
- Tidhar Rettig
- Information Systems Department, University of Haifa, Haifa, Israel
- Maya Segal
- Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Tomer Distelfeld
- Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Ilan Shimshoni
- Information Systems Department, University of Haifa, Haifa, Israel
- Daniel S. Mills
- Department of Life Sciences, Joseph Banks Laboratories, University of Lincoln, Lincoln, United Kingdom
- Anna Zamansky
- Information Systems Department, University of Haifa, Haifa, Israel
2
Feighelstein M, Ehrlich Y, Naftaly L, Alpin M, Nadir S, Shimshoni I, Pinho RH, Luna SPL, Zamansky A. Deep learning for video-based automated pain recognition in rabbits. Sci Rep 2023; 13:14679. [PMID: 37674052] [PMCID: PMC10482887] [DOI: 10.1038/s41598-023-41774-2]
Abstract
Despite the wide use of rabbits (Oryctolagus cuniculus) as experimental models for pain, and their increasing popularity as pets, pain assessment in rabbits is understudied. This study is the first to address automated detection of acute postoperative pain in rabbits. Using a dataset of video footage of n = 28 rabbits before (no pain) and after surgery (pain), we present an AI model for pain recognition that uses both the facial area and the body posture and reaches an accuracy above 87%. We apply a combination of 1-second interval sampling with Grayscale Short-Term stacking (GrayST) to incorporate temporal information for video classification at the frame level, and a frame-selection technique to better exploit the availability of video data.
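The core idea of Grayscale Short-Term stacking is that three consecutive grayscale frames can be packed into the R, G and B channels of one image, so an ordinary 2D image classifier sees short-term motion without a 3D or recurrent architecture. A minimal sketch of that stacking step (the function name and 4x4 toy frames are illustrative, not from the paper):

```python
import numpy as np

def grayst_stack(frames):
    """GrayST-style stacking: pack each run of three consecutive
    grayscale frames into the three channels of a single image,
    so a standard 2D CNN receives short-term temporal context."""
    if len(frames) < 3:
        raise ValueError("GrayST needs at least three frames")
    stacked = []
    for i in range(len(frames) - 2):
        # channel 0 = frame t, channel 1 = frame t+1, channel 2 = frame t+2
        img = np.stack([frames[i], frames[i + 1], frames[i + 2]], axis=-1)
        stacked.append(img)
    return stacked

# toy 4x4 "frames" whose pixel values equal their frame index
frames = [np.full((4, 4), i, dtype=np.uint8) for i in range(5)]
images = grayst_stack(frames)   # 5 frames -> 3 stacked images of shape (4, 4, 3)
```

In the paper's setting the frames would be sampled at 1-second intervals before stacking; the stacked images then feed the frame-level classifier.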
Affiliation(s)
- Yamit Ehrlich
- Information Systems Department, University of Haifa, Haifa, Israel
- Li Naftaly
- Information Systems Department, University of Haifa, Haifa, Israel
- Miriam Alpin
- Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Shenhav Nadir
- Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel
- Ilan Shimshoni
- Information Systems Department, University of Haifa, Haifa, Israel
- Renata H Pinho
- Faculty of Veterinary Medicine, University of Calgary, Calgary, Canada
- Stelio P L Luna
- School of Veterinary Medicine and Animal Science, São Paulo State University (UNESP), São Paulo, Brazil
- Anna Zamansky
- Information Systems Department, University of Haifa, Haifa, Israel
3
Jardat P, Liehrmann O, Reigner F, Parias C, Calandreau L, Lansade L. Horses discriminate between human facial and vocal expressions of sadness and joy. Anim Cogn 2023; 26:1733-1742. [PMID: 37543956] [DOI: 10.1007/s10071-023-01817-7]
Abstract
Communication of emotions plays a key role in intraspecific social interactions and likely in interspecific interactions. Several studies have shown that animals perceive human joy and anger, but few have examined other human emotions, such as sadness. In this study, we conducted a cross-modal experiment in which we showed 28 horses two soundless videos simultaneously, one of a sad and one of a joyful human face, accompanied by either a sad or a joyful voice. The number of horses whose first look at the video incongruent with the voice lasted longer than their first look at the congruent video was higher than expected by chance, suggesting that horses can form cross-modal representations of human joy and sadness. Moreover, horses were more attentive to the videos of joy, looking at them longer, more frequently, and more rapidly than at the videos of sadness. Their heart rates tended to increase when they heard joy and to decrease when they heard sadness. These results show that horses can discriminate facial and vocal expressions of joy and sadness and may form cross-modal representations of these emotions; they are also more attracted to joyful faces than to sad faces and seem more aroused by a joyful voice than by a sad one. Further studies are needed to better understand how horses perceive the range of human emotions, and we propose that future experiments include neutral stimuli as well as emotions with different arousal levels but the same valence.
Affiliation(s)
- Plotine Jardat
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France.
- Océane Liehrmann
- Department of Biology, University of Turku, 20500, Turku, Finland
- Céline Parias
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France
- Léa Lansade
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France.
4
Janicka W, Wilk I, Próchniak T. Does social motivation mitigate fear caused by a sudden sound in horses? Anim Cogn 2023; 26:1649-1660. [PMID: 37450226] [PMCID: PMC10442260] [DOI: 10.1007/s10071-023-01805-x]
Abstract
Living in a herd has multiple advantages for social species and is a primary survival strategy for prey animals. The presence of conspecifics, acting as a social buffer, may mitigate the individual stress response. Social isolation is therefore particularly stressful for horses, which are gregarious animals; however, they are not equally vulnerable to separation from the group. We tested whether more socially dependent horses and more independent individuals would differ in their responses to novel, sudden sounds occurring in two contexts: non-social and social motivation. Twenty warmblood horses were first exposed to two social tests, to evaluate their level of social dependence (rate of restless behaviour during social isolation) and the quantity and quality of the interactions in which they were involved (stay on a paddock). Two fear audio tests were then performed to compare responses to sudden sounds while feeding (non-social motivation; control trial) and while moving towards the herd (social motivation; experimental trial). Socially dependent horses showed more pronounced avoidance behaviour and needed much more time to resume feeding during the control trial; dependent individuals thus appeared to be more fearful. During the experimental trial, however, horses of both groups tended to ignore the sound or paid only limited attention to the stimulus, continuing to move towards their conspecifics. Thus, social motivation may mitigate fear caused by a frightening stimulus and make fearful, dependent horses more willing to face a potentially stressful event. This finding should be taken into account in horse training and management.
Affiliation(s)
- Wiktoria Janicka
- Department of Horse Breeding and Use, University of Life Sciences in Lublin, 13 Akademicka Street, Lublin, Poland
- Izabela Wilk
- Department of Horse Breeding and Use, University of Life Sciences in Lublin, 13 Akademicka Street, Lublin, Poland.
- Tomasz Próchniak
- Institute of Biological Basis of Animal Production, University of Life Sciences in Lublin, 13 Akademicka Street, Lublin, Poland
5
Explainable automated recognition of emotional states from canine facial expressions: the case of positive anticipation and frustration. Sci Rep 2022; 12:22611. [PMID: 36585439] [PMCID: PMC9803655] [DOI: 10.1038/s41598-022-27079-w]
Abstract
In animal research, automation of affective state recognition has so far mainly addressed pain in a few species. Emotional states remain uncharted territory, especially in dogs, due to the complexity of their facial morphology and expressions. This study helps to fill this gap in two respects. First, it is the first to address dog emotional states using a dataset obtained in a controlled experimental setting, including videos from (n = 29) Labrador Retrievers assumed to be in two experimentally induced emotional states: negative (frustration) and positive (anticipation). The dogs' facial expressions were measured using the Dog Facial Action Coding System (DogFACS). Two different approaches are compared in relation to our aim: (1) a DogFACS-based approach with a two-step pipeline consisting of (i) a DogFACS variable detector and (ii) a positive/negative state Decision Tree classifier; (2) an approach using deep learning techniques with no intermediate representation. The approaches reach accuracies above 71% and 89%, respectively, with the deep learning approach performing better. Second, this study is the first to examine explainability of AI models in the context of emotion in animals. The DogFACS-based approach provides decision trees, a mathematical representation that reflects previous findings by human experts relating certain facial expressions (DogFACS variables) to specific emotional states. The deep learning approach offers a different, visual form of explainability in the form of heatmaps reflecting regions of focus of the network's attention, which in some cases show focus clearly related to the nature of particular DogFACS variables. These heatmaps may hold the key to novel insights on the sensitivity of the network to nuanced pixel patterns reflecting information invisible to the human eye.
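The interpretability of the two-step DogFACS pipeline comes from the second step being a shallow decision tree over binary action-unit detections: each path through the tree is a human-readable rule. A toy stand-in for that classifier, with hand-picked splits (the specific action units and the rule structure below are illustrative assumptions, not the tree learned in the study):

```python
def classify_state(aus):
    """Toy stand-in for step (ii) of the DogFACS pipeline: a depth-2
    decision rule over binary action-unit detections produced by
    step (i). The AU names and splits are hypothetical examples."""
    if aus.get("EAD102_ears_adductor"):
        # ears drawn together plus raised inner brow -> positive (illustrative)
        if aus.get("AU101_inner_brow_raiser"):
            return "positive_anticipation"
        return "frustration"
    # no ear adduction: tongue show taken as a frustration cue (illustrative)
    if aus.get("AD19_tongue_show"):
        return "frustration"
    return "positive_anticipation"

state = classify_state({"EAD102_ears_adductor": True,
                        "AU101_inner_brow_raiser": True})
```

The point of the sketch is the representation, not the rules: unlike the end-to-end network, every decision here can be read off as "if these facial actions are present, predict this state".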
6
Going Deeper than Tracking: A Survey of Computer-Vision Based Recognition of Animal Pain and Emotions. Int J Comput Vis 2022. [DOI: 10.1007/s11263-022-01716-3]
Abstract
Advances in animal motion tracking and pose recognition have been a game changer in the study of animal behavior. Recently, an increasing number of works go 'deeper' than tracking and address automated recognition of animals' internal states, such as emotions and pain, with the aim of improving animal welfare, making this a timely moment for a systematization of the field. This paper provides a comprehensive survey of computer vision-based research on recognition of pain and emotional states in animals, addressing both facial and bodily behavior analysis. We summarize the efforts presented so far within this topic, classifying them across different dimensions, highlight challenges and research gaps, provide best-practice recommendations for advancing the field, and outline future directions for research.
7
Kowalczuk Z, Czubenko M, Żmuda-Trzebiatowska W. Categorization of emotions in dog behavior based on the deep neural network. Comput Intell 2022. [DOI: 10.1111/coin.12559]
Affiliation(s)
- Zdzisław Kowalczuk
- Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, Pomorskie, Poland
- Michał Czubenko
- Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, Pomorskie, Poland
8
Cheng WK, Leong WC, Tan JS, Hong ZW, Chen YL. Affective Recommender System for Pet Social Network. Sensors (Basel) 2022; 22:6759. [PMID: 36146109] [PMCID: PMC9504351] [DOI: 10.3390/s22186759]
Abstract
In this new era, creating a smart home environment around the household is no longer out of reach. Moreover, users are not limited to humans but also include pets such as dogs. Dogs need long-term close companionship with their owners; however, owners may occasionally need to be away from home for extended periods and can then only monitor their dogs' behaviors through home security cameras. Some dogs are sensitive and may develop separation anxiety, which can lead to disruptive behavior. Therefore, a novel smart home solution with an affective recommendation module is proposed by developing (1) an application to predict the behavior of dogs and (2) a communication platform that uses smartphones to connect dog friends from different households. To predict the dogs' behaviors, dog emotion recognition and dog bark recognition are performed: a ResNet model and a sequential model are implemented to recognize dog emotions and dog barks, respectively. A weighted average is proposed to combine the prediction values for dog emotion and dog bark and improve the prediction output, which is then forwarded to a recommendation module to respond to the dogs' conditions. In addition, a Real-Time Messaging Protocol (RTMP) server is implemented as a platform for contacting the friends on a dog's list so they can interact with each other. Various tests were carried out, and the proposed weighted average improved the prediction accuracy. Additionally, the proposed communication platform using basic smartphones successfully established connections between dog friends.
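The weighted-average fusion the abstract describes amounts to a convex combination of the two models' per-class probabilities. A minimal sketch of that step (the function name, class labels, and weight values are assumptions for illustration; the paper's tuned weights are not given here):

```python
def fuse_predictions(p_emotion, p_bark, w_emotion=0.6, w_bark=0.4):
    """Late fusion by weighted average: combine per-class probabilities
    from the visual emotion model and the audio bark model. Weights
    must sum to 1 so the result is still a probability distribution."""
    if abs(w_emotion + w_bark - 1.0) > 1e-9:
        raise ValueError("fusion weights must sum to 1")
    return {cls: w_emotion * p_emotion[cls] + w_bark * p_bark[cls]
            for cls in p_emotion}

# hypothetical per-class scores from the two models
fused = fuse_predictions({"calm": 0.8, "anxious": 0.2},
                         {"calm": 0.5, "anxious": 0.5})
```

The fused distribution would then be the input to the recommendation module; giving the visual model slightly more weight reflects the general idea of trusting the stronger modality, not a value from the paper.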
Affiliation(s)
- Wai Khuen Cheng
- Faculty of Information and Communication Technology, Universiti Tunku Abdul Rahman, Kampar 31900, Perak, Malaysia
- Wai Chun Leong
- Faculty of Information and Communication Technology, Universiti Tunku Abdul Rahman, Kampar 31900, Perak, Malaysia
- Joi San Tan
- Faculty of Information and Communication Technology, Universiti Tunku Abdul Rahman, Kampar 31900, Perak, Malaysia
- Zeng-Wei Hong
- Department of Information Engineering and Computer Science, Feng Chia University, Taichung 40724, Taiwan
- Yen-Lin Chen
- Department of Computer Science and Information Engineering, National Taipei University of Technology, Taipei 106344, Taiwan