1. Jardat P, Liehrmann O, Reigner F, Parias C, Calandreau L, Lansade L. Horses discriminate between human facial and vocal expressions of sadness and joy. Anim Cogn 2023; 26:1733-1742. PMID: 37543956. DOI: 10.1007/s10071-023-01817-7.
Abstract
Communication of emotions plays a key role in intraspecific social interactions and likely in interspecific interactions. Several studies have shown that animals perceive human joy and anger, but few have examined other human emotions, such as sadness. In this study, we conducted a cross-modal experiment in which we showed 28 horses two soundless videos simultaneously, one of a sad and one of a joyful human face, accompanied by either a sad or a joyful voice. More horses than expected by chance gave a longer first look to the video that was incongruent with the voice than to the congruent video, suggesting that horses could form cross-modal representations of human joy and sadness. Moreover, horses were more attentive to the videos of joy, looking at them longer, more frequently, and more rapidly than at the videos of sadness. Their heart rates tended to increase when they heard joy and to decrease when they heard sadness. These results show that horses are able to discriminate facial and vocal expressions of joy and sadness and may form cross-modal representations of these emotions; they are also more attracted to joyful faces than to sad faces and seem to be more aroused by a joyful voice than by a sad one. Further studies are needed to better understand how horses perceive the range of human emotions, and we propose that future experiments include neutral stimuli as well as emotions with different arousal levels but the same valence.
Affiliation(s)
- Plotine Jardat
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France.
- Océane Liehrmann
- Department of Biology, University of Turku, 20500, Turku, Finland
- Céline Parias
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France
- Léa Lansade
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France
2. Gergely A, Gábor A, Gácsi M, Kis A, Czeibert K, Topál J, Andics A. Dog brains are sensitive to infant- and dog-directed prosody. Commun Biol 2023; 6:859. PMID: 37596318. PMCID: PMC10439206. DOI: 10.1038/s42003-023-05217-y.
Abstract
When addressing preverbal infants and family dogs, people tend to use specific speech styles. While recent studies suggest acoustic parallels between infant- and dog-directed speech, it is unclear whether dogs, like infants, show enhanced neural sensitivity to the prosodic aspects of speech directed at them. Using functional magnetic resonance imaging in awake, unrestrained dogs, we identified two non-primary auditory regions, one involving the ventralmost part of the left caudal Sylvian gyrus and the temporal pole and the other at the transition of the left caudal and rostral Sylvian gyrus, which respond more to naturalistic dog- and/or infant-directed speech than to adult-directed speech, especially when spoken by female speakers. This activity increase is driven by sensitivity to the mean and variance of the fundamental frequency, with positive modulatory effects of these acoustic parameters in both non-primary auditory regions. These findings show that the dog auditory cortex, similarly to that of human infants, is sensitive to the acoustic properties of speech directed at non-speaking partners. This increased neuronal responsiveness to exaggerated prosody may be one reason why dogs outperform other animals when processing speech.
Affiliation(s)
- Anna Gergely
- Institute of Cognitive Neuroscience and Psychology, ELTE-ELKH NAP Comparative Ethology research group, Research Centre for Natural Sciences, Budapest, Hungary.
- Anna Gábor
- Department of Ethology, Eötvös Loránd University, Budapest, Hungary
- Neuroethology of Communication Lab, Department of Ethology, Eötvös Loránd University, Budapest, Hungary
- Márta Gácsi
- Department of Ethology, Eötvös Loránd University, Budapest, Hungary
- ELKH-ELTE Comparative Ethology Research Group, Budapest, Hungary
- Anna Kis
- Institute of Cognitive Neuroscience and Psychology, ELTE-ELKH NAP Comparative Ethology research group, Research Centre for Natural Sciences, Budapest, Hungary
- Kálmán Czeibert
- Department of Ethology, Eötvös Loránd University, Budapest, Hungary
- József Topál
- Institute of Cognitive Neuroscience and Psychology, ELTE-ELKH NAP Comparative Ethology research group, Research Centre for Natural Sciences, Budapest, Hungary
- Attila Andics
- Department of Ethology, Eötvös Loránd University, Budapest, Hungary
- Neuroethology of Communication Lab, Department of Ethology, Eötvös Loránd University, Budapest, Hungary
- ELTE NAP Canine Brain Research Group, Budapest, Hungary
3. Jardat P, Ringhofer M, Yamamoto S, Gouyet C, Degrande R, Parias C, Reigner F, Calandreau L, Lansade L. Horses form cross-modal representations of adults and children. Anim Cogn 2023; 26:369-377. PMID: 35962844. DOI: 10.1007/s10071-022-01667-9.
Abstract
Recently, research on domestic mammals' sociocognitive skills toward humans has been prolific, allowing us to better understand the human-animal relationship. For example, horses have been shown to distinguish human beings on the basis of photographs and voices and to have cross-modal mental representations of individual humans and human emotions. This raises the question of the extent to which horses can differentiate other human attributes, such as age. Here, we tested whether horses discriminate human adults from children. In a cross-modal paradigm, we presented 31 female horses with two simultaneous muted videos of a child and an adult saying the same neutral sentence, accompanied by the sound of an adult's or a child's voice speaking the sentence. The horses looked significantly longer at the videos that were incongruent with the heard voice than at the congruent videos. We conclude that horses can match adults' and children's faces and voices cross-modally. Moreover, their heart rates increased during children's vocalizations but not during adults'. This suggests that in addition to having mental representations of adults and children, horses have a stronger emotional response to children's voices than to adults' voices.
Affiliation(s)
- Plotine Jardat
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France.
- Monamie Ringhofer
- Department of Animal Science, Teikyo University of Science, Tokyo, Japan
- Shinya Yamamoto
- Institute for Advanced Study, Kyoto University, Kyoto, Japan
- Wildlife Research Center, Kyoto University, Kyoto, Japan
- Chloé Gouyet
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France
- Rachel Degrande
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France
- Céline Parias
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France
- Léa Lansade
- CNRS, IFCE, INRAE, Université de Tours, PRC, 37380, Nouzilly, France