1. Froesel M, Gacoin M, Clavagnier S, Hauser M, Goudard Q, Ben Hamed S. Macaque claustrum, pulvinar and putative dorsolateral amygdala support the cross-modal association of social audio-visual stimuli based on meaning. Eur J Neurosci 2024; 59:3203-3223. PMID: 38637993. DOI: 10.1111/ejn.16328.
Abstract
Social communication draws on several cognitive functions such as perception, emotion recognition and attention. The association of audio-visual information is essential to the processing of species-specific communication signals. In this study, we used functional magnetic resonance imaging to identify the subcortical areas involved in the cross-modal association of visual and auditory information based on their common social meaning. We identified three subcortical regions involved in the audio-visual processing of species-specific communicative signals: the dorsolateral amygdala, the claustrum and the pulvinar. These regions responded to visual, congruent auditory and audio-visual stimulation. However, none of them was significantly activated when the auditory stimuli were semantically incongruent with the visual context, showing an influence of the visual context on auditory processing. For example, positive vocalizations (coos) activated the three subcortical regions when presented in the context of a positive facial expression (lipsmacks) but not in the context of a negative facial expression (aggressive faces). In addition, the medial pulvinar and the amygdala showed multisensory integration, such that audio-visual stimuli produced activations significantly higher than the strongest unimodal response. Last, the pulvinar responded in a task-dependent manner, along a specific spatial sensory gradient. We propose that the dorsolateral amygdala, the claustrum and the pulvinar belong to a multisensory network that modulates the perception of visual socioemotional information and vocalizations as a function of the relevance of the stimuli in the social context. SIGNIFICANCE STATEMENT: Understanding and correctly associating socioemotional information across sensory modalities, such that happy faces predict laughter and escape scenes predict screams, is essential when living in complex social groups.
Using functional magnetic resonance imaging in the awake macaque, we identify three subcortical structures, the dorsolateral amygdala, claustrum and pulvinar, that respond to auditory information only when it matches the ongoing visual socioemotional context, such as hearing positively valenced coo calls while seeing positively valenced mutual grooming. We additionally describe task-dependent activations in the pulvinar, organized along a specific spatial sensory gradient, supporting its role as a network regulator.
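The "max criterion" for multisensory integration invoked here (the audio-visual response must significantly exceed the strongest unimodal response) can be sketched as follows. The data, effect sizes and threshold below are synthetic and purely illustrative; this is not the study's analysis pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-trial responses (e.g., trial-wise activation estimates) for one region.
auditory = rng.normal(1.0, 0.5, 40)
visual = rng.normal(1.2, 0.5, 40)
audiovisual = rng.normal(2.0, 0.5, 40)  # constructed to exceed the best unimodal response

def max_criterion(av, a, v, alpha=0.05):
    """Test the 'max criterion' for multisensory integration:
    AV responses must significantly exceed the strongest unimodal response."""
    best_unimodal = a if a.mean() >= v.mean() else v
    t, p = stats.ttest_ind(av, best_unimodal, alternative="greater")
    return p < alpha, t, p

integrates, t, p = max_criterion(audiovisual, auditory, visual)
```

With these synthetic trials the criterion is met; in practice the comparison is run per region with appropriate correction for multiple comparisons.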
Affiliation(s)
- Mathilda Froesel: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS Université de Lyon, Bron Cedex, France
- Maëva Gacoin: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS Université de Lyon, Bron Cedex, France
- Simon Clavagnier: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS Université de Lyon, Bron Cedex, France
- Marc Hauser: Risk-Eraser, West Falmouth, Massachusetts, USA
- Quentin Goudard: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS Université de Lyon, Bron Cedex, France
- Suliann Ben Hamed: Institut des Sciences Cognitives Marc Jeannerod, UMR5229 CNRS Université de Lyon, Bron Cedex, France
2. Sladky R, Kargl D, Haubensak W, Lamm C. An active inference perspective for the amygdala complex. Trends Cogn Sci 2024; 28:223-236. PMID: 38103984. DOI: 10.1016/j.tics.2023.11.004.
Abstract
The amygdala is a heterogeneous network of subcortical nuclei with central importance in cognitive and clinical neuroscience. Various experimental designs in human psychology and animal model research have mapped multiple conceptual frameworks (e.g., valence/salience and decision making) onto ever more refined amygdala circuitry. However, these predominantly bottom-up-driven accounts often rely on interpretations tailored to a specific phenomenon, thus preventing comprehensive and integrative theories. We argue here that an active inference model of amygdala function could unify these fractionated approaches into an overarching framework with clearer empirical predictions and mechanistic interpretations. This framework embeds top-down predictive models, informed by prior knowledge and belief updating, within a dynamical system distributed across amygdala circuits, in which self-regulation is implemented by continuously tracking environmental and homeostatic demands.
Affiliation(s)
- Ronald Sladky: Social, Cognitive, and Affective Neuroscience Unit, Department of Cognition, Emotion, and Methods in Psychology, Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria; Vienna Cognitive Science Hub, University of Vienna, 1010 Vienna, Austria
- Dominic Kargl: Department of Neuronal Cell Biology, Center for Brain Research, Medical University of Vienna, Spitalgasse 4, 1090 Vienna, Austria
- Wulf Haubensak: Department of Neuronal Cell Biology, Center for Brain Research, Medical University of Vienna, Spitalgasse 4, 1090 Vienna, Austria; Research Institute of Molecular Pathology (IMP), Vienna Biocenter (VBC), Campus Vienna Biocenter 1, 1030 Vienna, Austria
- Claus Lamm: Social, Cognitive, and Affective Neuroscience Unit, Department of Cognition, Emotion, and Methods in Psychology, Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria; Vienna Cognitive Science Hub, University of Vienna, 1010 Vienna, Austria
3. Iidaka T, Maesawa S, Kanayama N, Miyakoshi M, Ishizaki T, Saito R. Hemodynamic and electrophysiological responses of the human amygdala during face imitation: a study using functional MRI and intracranial EEG. Cereb Cortex 2024; 34:bhad488. PMID: 38112625. DOI: 10.1093/cercor/bhad488.
Abstract
The involvement of the human amygdala in facial mimicry remains a matter of debate. We investigated neural activity in the human amygdala during a task in which an imitation task was separated in time from an observation task involving facial expressions. Neural activity in the amygdala was measured using functional magnetic resonance imaging in 18 healthy individuals and using intracranial electroencephalography in six patients with medically refractory epilepsy. The functional magnetic resonance imaging experiment showed that mimicry of negative and positive expressions activated the amygdala more than mimicry of non-emotional facial movements. In the intracranial electroencephalography experiment, time-frequency analysis revealed emotion-related amygdala activity during mimicry as a significant neural oscillation in the high-gamma band. Furthermore, spectral event analysis of single-trial intracranial electroencephalography data revealed that the sustained gamma-band oscillation originated from an increased number and longer duration of neural events in the amygdala. Based on these findings, we conclude that during facial mimicry, visual information about expressions and feedback from facial movements are combined in the amygdalar nuclei. Considering the time difference of information reaching the amygdala, responses to facial movements are likely to modulate rather than initiate affective processing in human participants.
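A generic way to extract the high-gamma activity this abstract refers to is to bandpass the iEEG trace and take its Hilbert envelope. The sketch below uses synthetic data; the band edges (70-150 Hz) and filter order are assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0  # sampling rate (Hz), typical for iEEG
t = np.arange(0, 2.0, 1 / fs)

# Synthetic amygdala trace: a 90 Hz high-gamma burst in the second half, plus noise.
rng = np.random.default_rng(1)
sig = rng.normal(0, 1, t.size)
sig[t >= 1.0] += 3.0 * np.sin(2 * np.pi * 90 * t[t >= 1.0])

def band_envelope(x, lo, hi, fs):
    """Band-limited amplitude envelope: zero-phase bandpass, then Hilbert transform."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, x)))

env = band_envelope(sig, 70, 150, fs)   # high-gamma envelope
baseline = env[t < 1.0].mean()          # pre-burst level
burst = env[t >= 1.0].mean()            # burst level
```

The envelope during the burst clearly exceeds the baseline level, which is the kind of contrast a time-frequency analysis would quantify per trial and condition.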
Affiliation(s)
- Tetsuya Iidaka: Brain & Mind Research Center, Nagoya University, Nagoya 461-8673, Japan
- Satoshi Maesawa: Brain & Mind Research Center, Nagoya University, Nagoya 461-8673, Japan; Department of Neurosurgery, Graduate School of Medicine, Nagoya University, Nagoya 466-8550, Japan
- Noriaki Kanayama: Human Informatics Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba 305-8566, Japan
- Makoto Miyakoshi: Division of Child and Adolescent Psychiatry, Cincinnati Children's Hospital Medical Center, Cincinnati, OH 45229-3026, United States; Department of Psychiatry, University of Cincinnati College of Medicine, Cincinnati, OH 45627-0555, United States
- Tomotaka Ishizaki: Department of Neurosurgery, Graduate School of Medicine, Nagoya University, Nagoya 466-8550, Japan
- Ryuta Saito: Brain & Mind Research Center, Nagoya University, Nagoya 461-8673, Japan; Department of Neurosurgery, Graduate School of Medicine, Nagoya University, Nagoya 466-8550, Japan
4. Guex R, Ros T, Mégevand P, Spinelli L, Seeck M, Vuilleumier P, Domínguez-Borràs J. Prestimulus amygdala spectral activity is associated with visual face awareness. Cereb Cortex 2023; 33:1044-1057. PMID: 35353177. PMCID: PMC9930624. DOI: 10.1093/cercor/bhac119.
Abstract
Alpha cortical oscillations have been proposed to suppress sensory processing in the visual, auditory and tactile domains, influencing conscious stimulus perception. However, it is unknown whether oscillatory neural activity in the amygdala, a subcortical structure involved in salience detection, has a similar impact on stimulus awareness. Recording intracranial electroencephalography (EEG) from 9 human amygdalae during face detection in a continuous flash suppression task, we found increased prestimulus spectral power and phase coherence, with the most consistent effects in the alpha band, when faces were undetected relative to detected, as previously observed at the cortical level with this task using scalp EEG. Moreover, selective decreases in the alpha and gamma bands preceded face detection, with individual prestimulus alpha power correlating negatively with detection rate across patients. These findings reveal for the first time that prestimulus subcortical oscillations localized in the human amygdala may contribute to the perceptual gating mechanisms governing subsequent face detection, and offer promising insights into the role of this structure in visual awareness.
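The core measure here, prestimulus alpha power per trial, can be sketched with a Welch periodogram over a 1-s window before stimulus onset, then compared between detection outcomes. Everything below (sampling rate, trial counts, effect size) is synthetic and illustrative only.

```python
import numpy as np
from scipy.signal import welch

fs = 512.0
rng = np.random.default_rng(2)
n_trials, n_samp = 60, int(fs)  # one 1-s prestimulus window per trial

def alpha_power(trial, fs):
    """Mean 8-12 Hz power from a Welch periodogram of one prestimulus window."""
    f, pxx = welch(trial, fs=fs, nperseg=256)
    return pxx[(f >= 8) & (f <= 12)].mean()

t = np.arange(n_samp) / fs
# Hypothetical trials: undetected faces preceded by stronger 10 Hz activity.
detected = rng.normal(0, 1, (n_trials, n_samp))
undetected = rng.normal(0, 1, (n_trials, n_samp)) + 2.0 * np.sin(2 * np.pi * 10 * t)

p_det = np.mean([alpha_power(tr, fs) for tr in detected])
p_und = np.mean([alpha_power(tr, fs) for tr in undetected])
```

In this toy setup the undetected trials show the higher prestimulus alpha power, mirroring the direction of the reported effect.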
Affiliation(s)
- Raphael Guex: Department of Fundamental Neuroscience, University of Geneva, Campus Biotech, Geneva 1211, Switzerland; Department of Clinical Neuroscience, University of Geneva, HUG, Geneva 1211, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva 1202, Switzerland
- Tomas Ros: Department of Fundamental Neuroscience, Functional Brain Mapping Laboratory, Campus Biotech, University of Geneva, Geneva 1202, Switzerland; Lemanic Biomedical Imaging Centre (CIBM), Geneva 1202, Switzerland
- Pierre Mégevand: Department of Fundamental Neuroscience, University of Geneva, Campus Biotech, Geneva 1211, Switzerland; Department of Clinical Neuroscience, University of Geneva, HUG, Geneva 1211, Switzerland
- Laurent Spinelli: Department of Clinical Neuroscience, University of Geneva, HUG, Geneva 1211, Switzerland
- Margitta Seeck: Department of Clinical Neuroscience, University of Geneva, HUG, Geneva 1211, Switzerland
- Patrik Vuilleumier: Department of Fundamental Neuroscience, University of Geneva, Campus Biotech, Geneva 1211, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Geneva 1202, Switzerland
- Judith Domínguez-Borràs: Department of Fundamental Neuroscience, University of Geneva, Campus Biotech, Geneva 1211, Switzerland; Department of Clinical Psychology and Psychobiology, University of Barcelona, Barcelona 08035, Spain
5. Sonkusare S, Qiong D, Zhao Y, Liu W, Yang R, Mandali A, Manssuer L, Zhang C, Cao C, Sun B, Zhan S, Voon V. Frequency dependent emotion differentiation and directional coupling in amygdala, orbitofrontal and medial prefrontal cortex network with intracranial recordings. Mol Psychiatry 2022; 28:1636-1646. PMID: 36460724. DOI: 10.1038/s41380-022-01883-2.
Abstract
The amygdala, orbitofrontal cortex (OFC) and medial prefrontal cortex (mPFC) form a crucial part of the emotion circuit, yet their emotion-induced responses and interactions have been poorly investigated with direct intracranial recordings. Such high-fidelity signals can uncover precise spectral dynamics and frequency differences in valence processing, allowing novel insights into neuromodulation. Here, leveraging the unique spatio-temporal advantages of intracranial electroencephalography (iEEG) in a cohort of 35 patients with intractable epilepsy (71 contacts in the amygdala, 31 in the OFC and 43 in the mPFC), we assessed the spectral dynamics and interactions between the amygdala, OFC and mPFC during an emotional picture viewing task. Task-induced activity showed greater broadband gamma activity in the negative condition than in the positive condition in all three regions. Beta activity was likewise increased in the negative condition in the amygdala and OFC but decreased in the mPFC. Furthermore, amygdala beta activity showed a significant negative association with valence ratings. Critically, model-based computational analyses revealed unidirectional connectivity from the mPFC to the amygdala and bidirectional communication between OFC and amygdala and between OFC and mPFC. Our findings provide direct neurophysiological evidence for a much-posited model of top-down influence of the mPFC over the amygdala and a bidirectional influence between the OFC and the amygdala. Altogether, in a relatively large sample of human intracranial neuronal recordings, we highlight valence-dependent spectral dynamics and dyadic coupling within the amygdala-mPFC-OFC network, with implications for targeted neuromodulation in emotion processing.
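The reported negative association between amygdala beta power and valence ratings is a rank correlation of the following form. All values below are synthetic, generated only to illustrate the statistic, and the effect size is invented.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 80  # trials

# Hypothetical valence ratings (1 = very negative ... 9 = very positive).
valence = rng.integers(1, 10, n).astype(float)
# Synthetic amygdala beta power made to decrease with valence, plus noise.
beta_power = 5.0 - 0.4 * valence + rng.normal(0, 0.5, n)

# Rank correlation: robust to monotone but non-linear relationships.
rho, p = spearmanr(beta_power, valence)
```

A significantly negative rho is the pattern the study reports: lower amygdala beta power for more positively rated pictures.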
Affiliation(s)
- Saurabh Sonkusare: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Psychiatry, University of Cambridge, Cambridge, UK; Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
- Ding Qiong: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, Shanghai, China
- Yijie Zhao: Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, Shanghai, China
- Wei Liu: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Ruoqi Yang: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Alekhya Mandali: Department of Psychiatry, University of Cambridge, Cambridge, UK; Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK; MRC Brain Network Dynamics Unit, University of Oxford, Oxford, UK
- Luis Manssuer: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Psychiatry, University of Cambridge, Cambridge, UK; Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
- Chencheng Zhang: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Chunyan Cao: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Bomin Sun: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Shikun Zhan: Department of Neurosurgery, Centre for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Valerie Voon: Department of Psychiatry, University of Cambridge, Cambridge, UK; Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
6. Dong H, Li N, Fan L, Wei J, Xu J. Integrative interaction of emotional speech in audio-visual modality. Front Neurosci 2022; 16:797277. PMID: 36440282. PMCID: PMC9695733. DOI: 10.3389/fnins.2022.797277.
Abstract
Emotional cues are expressed in many ways in daily life, and the emotional information we receive is often represented across multiple modalities. Successful social interactions require combining multisensory cues to accurately determine the emotion of others. The integration mechanism of multimodal emotional information has been widely investigated: studies using different brain activity measurement methods have localized the brain regions involved in the audio-visual integration of emotional information mainly to the bilateral superior temporal regions. However, the methods adopted in these studies are relatively simple, and their materials rarely contain speech information; the integration mechanism of emotional speech in the human brain therefore needs further examination. In this paper, a functional magnetic resonance imaging (fMRI) study with an event-related design was conducted to explore the audio-visual integration mechanism of emotional speech in the human brain, using dynamic facial expressions and emotional speech to express emotions of different valences. Representational similarity analysis (RSA) based on regions of interest (ROIs), whole-brain searchlight analysis, modality conjunction analysis and supra-additive analysis were used to analyze and verify the role of relevant brain regions. In addition, a weighted RSA method was used to evaluate the contribution of each candidate model to the best-fitting model of each ROI. The results showed that only the left insula was detected by all methods, suggesting that the left insula plays an important role in the audio-visual integration of emotional speech. Whole-brain searchlight analysis, modality conjunction analysis and supra-additive analysis together revealed that the bilateral middle temporal gyrus (MTG), right inferior parietal lobule and bilateral precuneus might also be involved in the audio-visual integration of emotional speech.
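The core RSA step described above can be sketched generically: build a representational dissimilarity matrix (RDM) per modality from ROI activity patterns, then rank-correlate the RDMs. The data below are synthetic (condition and voxel counts are invented); this is not the authors' implementation.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
n_cond, n_vox = 6, 50  # e.g., 6 emotion conditions, 50 voxels in an ROI

# Hypothetical ROI patterns: audio and visual share the same condition structure.
base = rng.normal(0, 1, (n_cond, n_vox))
audio = base + rng.normal(0, 0.3, base.shape)
visual = base + rng.normal(0, 0.3, base.shape)

def rdm(patterns):
    """Condensed RDM: pairwise correlation distances between condition patterns."""
    return pdist(patterns, metric="correlation")

# A high rank correlation suggests a shared representational geometry.
rho, p = spearmanr(rdm(audio), rdm(visual))
```

A searchlight analysis simply repeats this comparison for a small sphere of voxels centred on every brain location in turn.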
Affiliation(s)
- Haibin Dong: Tianjin Key Lab of Cognitive Computing and Application, College of Intelligence and Computing, Tianjin University, Tianjin, China
- Na Li: Tianjin Key Lab of Cognitive Computing and Application, College of Intelligence and Computing, Tianjin University, Tianjin, China
- Lingzhong Fan: Brainnetome Center, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Jianguo Wei: Tianjin Key Lab of Cognitive Computing and Application, College of Intelligence and Computing, Tianjin University, Tianjin, China
- Junhai Xu (corresponding author): Tianjin Key Lab of Cognitive Computing and Application, College of Intelligence and Computing, Tianjin University, Tianjin, China
7. Weisholtz DS, Kreiman G, Silbersweig DA, Stern E, Cha B, Butler T. Localized task-invariant emotional valence encoding revealed by intracranial recordings. Soc Cogn Affect Neurosci 2022; 17:549-558. PMID: 34941992. PMCID: PMC9164208. DOI: 10.1093/scan/nsab134.
Abstract
The ability to distinguish between negative, positive and neutral valence is a key part of emotion perception. Emotional valence has conceptual meaning that supersedes any particular type of stimulus, although it is typically captured experimentally in association with particular tasks. We sought to identify neural encoding of task-invariant emotional valence. We evaluated whether high-gamma responses (HGRs) to visually displayed words conveying emotions could be used to decode emotional valence from HGRs to facial expressions. Intracranial electroencephalography was recorded from 14 individuals while they performed two tasks, one involving reading words with positive, negative and neutral valence, and the other involving viewing faces with positive, negative and neutral expressions. Quadratic discriminant analysis was used to identify information in the HGR that differentiates the three emotion conditions. A classifier was trained on the emotional valence labels from one task and was cross-validated on data from the same task (within-task classifier) as well as from the other task (between-task classifier). Emotional valence could be decoded in the left medial orbitofrontal cortex and middle temporal gyrus using both within-task and between-task classifiers. These observations suggest the presence of task-independent emotional valence information in the signals from these regions.
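The between-task decoding logic (train on word-evoked features, test on face-evoked features) can be sketched with scikit-learn's quadratic discriminant analysis. The feature construction below is entirely hypothetical: three valence classes share a code across tasks, with a small task-specific shift.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(5)

def make_task(n_per_class, shift):
    """Synthetic high-gamma features: one cluster per valence class (neg/neutral/pos),
    identical across tasks except for a small task-specific offset."""
    X, y = [], []
    for label, centre in enumerate([-1.0, 0.0, 1.0]):
        X.append(rng.normal(centre + shift, 0.3, (n_per_class, 4)))
        y += [label] * n_per_class
    return np.vstack(X), np.array(y)

X_words, y_words = make_task(30, shift=0.0)   # word-reading task
X_faces, y_faces = make_task(30, shift=0.05)  # face-viewing task

# Train on one task, test on the other: above-chance accuracy indicates
# a valence code that generalizes across stimulus type.
clf = QuadraticDiscriminantAnalysis().fit(X_words, y_words)
between_task_acc = clf.score(X_faces, y_faces)
```

Chance level here is 1/3, so accuracies well above that would count as successful between-task decoding.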
Affiliation(s)
- Daniel S Weisholtz: Department of Neurology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
- Gabriel Kreiman: Boston Children's Hospital, Harvard Medical School, Boston, MA 02115, USA
- David A Silbersweig: Department of Neurology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
- Emily Stern: Department of Neurology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA; Ceretype Neuromedicine, Inc.
- Brannon Cha: University of California San Diego School of Medicine; Department of Neurology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
- Tracy Butler: Department of Radiology, Weill Cornell Medical Center, New York 10065, USA
8. Domínguez-Borràs J, Vuilleumier P. Amygdala function in emotion, cognition, and behavior. Handb Clin Neurol 2022; 187:359-380. PMID: 35964983. DOI: 10.1016/b978-0-12-823493-8.00015-8.
Abstract
The amygdala is a core structure in the anterior medial temporal lobe, with an important role in several brain functions involving memory, emotion, perception, social cognition, and even awareness. As a key brain structure for saliency detection, it triggers and controls widespread modulatory signals onto multiple areas of the brain, with a great impact on numerous aspects of adaptive behavior. Here we discuss the neural mechanisms underlying these functions, as established by animal and human research, including insights provided in both healthy and pathological conditions.
Affiliation(s)
- Judith Domínguez-Borràs: Department of Clinical Psychology and Psychobiology & Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Patrik Vuilleumier: Department of Neuroscience and Center for Affective Sciences, University of Geneva, Geneva, Switzerland
9. Taffou M, Suied C, Viaud-Delmon I. Auditory roughness elicits defense reactions. Sci Rep 2021; 11:956. PMID: 33441758. PMCID: PMC7806762. DOI: 10.1038/s41598-020-79767-0.
Abstract
Auditory roughness elicits aversion and heightened activation in cerebral areas involved in threat processing, but its link with defensive behavior is unknown. Defensive behaviors are triggered by intrusions into the space immediately surrounding the body, called peripersonal space (PPS). Integrating multisensory information in PPS is crucial to ensure the protection of the body. Here, we assessed the behavioral effects of roughness on auditory-tactile integration, which reflects the monitoring of this multisensory region of space. Healthy human participants had to detect as fast as possible a tactile stimulation delivered to their hand while an irrelevant sound approached them from the rear hemifield. The sound was either a simple harmonic sound or a rough sound, processed through binaural rendering so that the virtual sound source loomed towards the participants. The rough sound speeded tactile reaction times at a farther distance from the body than the non-rough sound, indicating that PPS, as estimated here via auditory-tactile integration, is sensitive to auditory roughness. Auditory roughness thus modifies the behavioral relevance of simple auditory events in relation to the body: even without emotional or social contextual information, it constitutes an innate threat cue that elicits defensive responses.
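The farther-distance effect can be made concrete with a toy reaction-time sketch: estimate the PPS boundary as the farthest sound distance at which tactile RTs are already speeded relative to a far-distance baseline. All RT values, distances and the 10 ms threshold below are invented for illustration.

```python
import numpy as np

# Hypothetical mean tactile reaction times (ms) at five looming-sound distances,
# far to near, for a rough vs a non-rough (harmonic) sound.
distances = np.array([2.0, 1.5, 1.0, 0.5, 0.2])  # metres from the body
rt_harmonic = np.array([310, 308, 295, 280, 270], float)
rt_rough = np.array([309, 290, 278, 268, 262], float)

def pps_boundary(rts, distances, baseline, drop=10.0):
    """Farthest distance at which the looming sound already speeds tactile RTs
    by at least `drop` ms relative to the far-distance baseline."""
    faster = baseline - rts >= drop
    return distances[faster].max() if faster.any() else None

baseline = rt_harmonic[0]  # shared far-distance reference
b_harm = pps_boundary(rt_harmonic, distances, baseline)
b_rough = pps_boundary(rt_rough, distances, baseline)
```

In this toy data the rough sound's facilitation begins farther from the body than the harmonic sound's, which is the signature of an expanded PPS.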
Affiliation(s)
- Marine Taffou: Institut de Recherche Biomédicale des Armées, 91220 Brétigny-sur-Orge, France
- Clara Suied: Institut de Recherche Biomédicale des Armées, 91220 Brétigny-sur-Orge, France
- Isabelle Viaud-Delmon: CNRS, Ircam, Sorbonne Université, Ministère de la Culture, Sciences et Technologies de la Musique et du Son (STMS), 75004 Paris, France