1
Stocke S, Samuelsen CL. Multisensory Integration Underlies the Distinct Representation of Odor-Taste Mixtures in the Gustatory Cortex of Behaving Rats. J Neurosci 2024; 44:e0071242024. [PMID: 38548337 PMCID: PMC11097261 DOI: 10.1523/jneurosci.0071-24.2024]
Abstract
The perception of food relies on the integration of olfactory and gustatory signals originating from the mouth. This multisensory process generates robust associations between odors and tastes, significantly influencing the perceptual judgment of flavors. However, the specific neural substrates underlying this integrative process remain unclear. Previous electrophysiological studies identified the gustatory cortex as a site of convergent olfactory and gustatory signals, but whether neurons represent multimodal odor-taste mixtures as distinct from their unimodal odor and taste components is unknown. To investigate this, we recorded single-unit activity in the gustatory cortex of behaving female rats during the intraoral delivery of individual odors, individual tastes, and odor-taste mixtures. Our results demonstrate that chemoselective neurons in the gustatory cortex are broadly responsive to intraoral chemosensory stimuli, exhibiting time-varying multiphasic changes in activity. In a subset of these chemoselective neurons, odor-taste mixtures elicit nonlinear cross-modal responses that distinguish them from their olfactory and gustatory components. These findings provide novel insights into multimodal chemosensory processing by the gustatory cortex, highlighting the distinct representation of unimodal and multimodal intraoral chemosensory signals. Overall, our findings suggest that olfactory and gustatory signals interact nonlinearly in the gustatory cortex to enhance the identity coding of both unimodal and multimodal chemosensory stimuli.
Affiliation(s)
- Sanaya Stocke
- Department of Biology, University of Louisville, Louisville, Kentucky 40292
- Chad L Samuelsen
- Department of Anatomical Sciences and Neurobiology, University of Louisville, Louisville, Kentucky 40292
2
Sharma KK, Diltz MA, Lincoln T, Albuquerque ER, Romanski LM. Neuronal Population Encoding of Identity in Primate Prefrontal Cortex. J Neurosci 2024; 44:e0703232023. [PMID: 37963766 PMCID: PMC10860606 DOI: 10.1523/jneurosci.0703-23.2023]
Abstract
The ventrolateral prefrontal cortex (VLPFC) shows robust activation during the perception of faces and voices. However, little is known about what categorical features of social stimuli drive neural activity in this region. Since perception of identity and expression are critical social functions, we examined whether neural responses to naturalistic stimuli were driven by these two categorical features in the prefrontal cortex. We recorded single neurons in the VLPFC, while two male rhesus macaques (Macaca mulatta) viewed short audiovisual videos of unfamiliar conspecifics making expressions of aggressive, affiliative, and neutral valence. Of the 285 neurons responsive to the audiovisual stimuli, 111 neurons had a main effect (two-way ANOVA) of identity, expression, or their interaction in their stimulus-related firing rates; however, decoding of expression and identity using single-unit firing rates yielded poor accuracy. Interestingly, when decoding from pseudo-populations of recorded neurons, the accuracy for both expression and identity increased with population size, suggesting that the population transmitted information relevant to both variables. Principal components analysis of mean population activity across time revealed that population responses to the same identity followed similar trajectories in the response space, facilitating segregation from other identities. Our results suggest that identity is a critical feature of social stimuli that dictates the structure of population activity in the VLPFC, during the perception of vocalizations and their corresponding facial expressions. These findings enhance our understanding of the role of the VLPFC in social behavior.
Affiliation(s)
- K K Sharma
- Department of Neuroscience, School of Medicine and Dentistry, University of Rochester, Rochester, New York 14620
- M A Diltz
- Department of Neuroscience, School of Medicine and Dentistry, University of Rochester, Rochester, New York 14620
- T Lincoln
- Department of Neuroscience, School of Medicine and Dentistry, University of Rochester, Rochester, New York 14620
- E R Albuquerque
- Department of Neuroscience, School of Medicine and Dentistry, University of Rochester, Rochester, New York 14620
- L M Romanski
- Department of Neuroscience, School of Medicine and Dentistry, University of Rochester, Rochester, New York 14620
3
Shan L, Yuan L, Zhang B, Ma J, Xu X, Gu F, Jiang Y, Dai J. Neural Integration of Audiovisual Sensory Inputs in Macaque Amygdala and Adjacent Regions. Neurosci Bull 2023; 39:1749-1761. [PMID: 36920645 PMCID: PMC10661144 DOI: 10.1007/s12264-023-01043-8]
Abstract
Integrating multisensory inputs to generate accurate perception and guide behavior is among the most critical functions of the brain. Subcortical regions such as the amygdala are involved in sensory processing, including vision and audition, yet their roles in multisensory integration remain unclear. In this study, we systematically investigated how neurons in the amygdala and adjacent regions integrate audiovisual sensory inputs, using a semi-chronic multi-electrode array and multiple combinations of audiovisual stimuli. In a sample of 332 neurons, we characterized diverse response patterns to audiovisual stimuli and the neural characteristics of bimodal relative to unimodal modulation, which could be classified into four types with distinct regional origins. Using hierarchical clustering, neurons were further grouped into five clusters associated with different integrative functions and subregions. Finally, regions distinguishing congruent from incongruent bimodal sensory inputs were identified. Overall, visual processing dominates audiovisual integration in the amygdala and adjacent regions. Our findings shed new light on the neural mechanisms of multisensory integration in the primate brain.
Affiliation(s)
- Liang Shan
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- Liu Yuan
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
- Bo Zhang
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Key Laboratory of Brain Science, Zunyi Medical University, Zunyi, 563000, China
- Jian Ma
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Xiao Xu
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Fei Gu
- University of Chinese Academy of Sciences, Beijing, 100049, China
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Yi Jiang
- University of Chinese Academy of Sciences, Beijing, 100049, China
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Chinese Institute for Brain Research, Beijing, 102206, China
- Ji Dai
- CAS Key Laboratory of Brain Connectome and Manipulation, the Brain Cognition and Brain Disease Institute (BCBDI), Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China
- Shenzhen-Hong Kong Institute of Brain Science-Shenzhen Fundamental Research Institutions, Shenzhen, 518055, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
- Shenzhen Technological Research Center for Primate Translational Medicine, Shenzhen, 518055, China
4
Romanski LM, Sharma KK. Multisensory interactions of face and vocal information during perception and memory in ventrolateral prefrontal cortex. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220343. [PMID: 37545305 PMCID: PMC10404928 DOI: 10.1098/rstb.2022.0343]
Abstract
The ventral frontal lobe is a critical node in the circuit that underlies communication, a multisensory process where sensory features of faces and vocalizations come together. The neural basis of face and vocal integration is a topic of great importance since the integration of multiple sensory signals is essential for the decisions that govern our social interactions. Investigations have shown that the macaque ventrolateral prefrontal cortex (VLPFC), a proposed homologue of the human inferior frontal gyrus, is involved in the processing, integration and remembering of audiovisual signals. Single neurons in VLPFC encode and integrate species-specific faces and corresponding vocalizations. During working memory, VLPFC neurons maintain face and vocal information online and exhibit selective activity for face and vocal stimuli. Population analyses indicate that identity, a critical feature of social stimuli, is encoded by VLPFC neurons and dictates the structure of dynamic population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These studies suggest that VLPFC may play a primary role in integrating face and vocal stimuli with contextual information, in order to support decision making during social communication. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Lizabeth M. Romanski
- Department of Neuroscience, University of Rochester School of Medicine, Rochester, NY 14642, USA
- Keshov K. Sharma
- Department of Neuroscience, University of Rochester School of Medicine, Rochester, NY 14642, USA
5
Diehl MM, Plakke B, Albuquerque E, Romanski LM. Representation of expression and identity by ventral prefrontal neurons. Neuroscience 2022; 496:243-260. [PMID: 35654293 PMCID: PMC10363293 DOI: 10.1016/j.neuroscience.2022.05.033]
Abstract
Evidence has suggested that the ventrolateral prefrontal cortex (VLPFC) processes social stimuli, including faces and vocalizations, which are essential for communication. Features embedded within audiovisual stimuli, including emotional expression and caller identity, provide abundant information about an individual's intention, emotional state, motivation, and social status, which are important to encode in a social exchange. However, it is unknown to what extent the VLPFC encodes such features. To investigate the role of VLPFC during social communication, we recorded single-unit activity while rhesus macaques (Macaca mulatta) performed a nonmatch-to-sample task using species-specific face-vocalization stimuli that differed in emotional expression or caller identity. Seventy-five percent of recorded cells were task-related, and of these >70% were responsive during the nonmatch period. A larger proportion of nonmatch cells encoded the stimulus rather than the context of the trial type. A subset of responsive neurons was modulated most commonly by the identity of the nonmatch stimulus, less often by its emotional expression, or by both features within the face-vocalization stimuli presented during the nonmatch period. Neurons encoding identity were found across a broader region of VLPFC than expression-related cells, which were confined to the anterolateral portion of the recording chamber. These findings suggest that, within a working memory paradigm, VLPFC processes features of face and vocal stimuli, such as emotional expression and identity, in addition to task and contextual information. Thus, stimulus and contextual information may be integrated by VLPFC during social communication.
6
Cui J, Sawamura D, Sakuraba S, Saito R, Tanabe Y, Miura H, Sugi M, Yoshida K, Watanabe A, Tokikuni Y, Yoshida S, Sakai S. Effect of Audiovisual Cross-Modal Conflict during Working Memory Tasks: A Near-Infrared Spectroscopy Study. Brain Sci 2022; 12:349. [PMID: 35326305 PMCID: PMC8946709 DOI: 10.3390/brainsci12030349]
Abstract
Cognitive conflict effects are well characterized within a single sensory modality. However, little is known about cross-modal conflicts and their neural bases. This study characterized the two directions of audiovisual cross-modal conflict using working memory tasks and measurements of brain activity. The participants were 31 healthy, right-handed, young male adults. The Paced Auditory Serial Addition Test (PASAT) and the Paced Visual Serial Addition Test (PVSAT) were performed under distractor and no-distractor conditions. The distractor conditions comprised two conditions in which either the PASAT or the PVSAT was the target task and the other served as the distractor stimulus. Additionally, oxygenated hemoglobin (Oxy-Hb) concentration changes in the frontoparietal regions were measured during the tasks. The results showed significantly lower PASAT performance under the distractor condition than under the no-distractor condition, but no such difference for the PVSAT. Oxy-Hb changes in the bilateral ventrolateral prefrontal cortex (VLPFC) and inferior parietal cortex (IPC) increased significantly in the PASAT with a distractor compared with no distractor, but not in the PVSAT. Furthermore, there were significant positive correlations between Δtask performance accuracy and ΔOxy-Hb in the bilateral IPC only in the PASAT. Visual cross-modal conflict thus significantly impairs auditory task performance, and the bilateral VLPFC and IPC are key regions in inhibiting visual cross-modal distractors.
Affiliation(s)
- Jiahong Cui
- Graduate School of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Daisuke Sawamura
- Department of Rehabilitation Science, Faculty of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Satoshi Sakuraba
- Department of Rehabilitation Sciences, Health Sciences University of Hokkaido, Sapporo 061-0293, Japan
- Ryuji Saito
- Graduate School of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Yoshinobu Tanabe
- Department of Rehabilitation, Shinsapporo Paulo Hospital, Sapporo 004-0002, Japan
- Hiroshi Miura
- Graduate School of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Masaaki Sugi
- Department of Rehabilitation, Tokeidai Memorial Hospital, Sapporo 060-0031, Japan
- Kazuki Yoshida
- Department of Rehabilitation Science, Faculty of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Akihiro Watanabe
- Graduate School of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Yukina Tokikuni
- Graduate School of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
- Susumu Yoshida
- Department of Rehabilitation Sciences, Health Sciences University of Hokkaido, Sapporo 061-0293, Japan
- Shinya Sakai
- Department of Rehabilitation Science, Faculty of Health Sciences, Hokkaido University, Sapporo 060-0812, Japan
7
Murray EA, Fellows LK. Prefrontal cortex interactions with the amygdala in primates. Neuropsychopharmacology 2022; 47:163-179. [PMID: 34446829 PMCID: PMC8616954 DOI: 10.1038/s41386-021-01128-w]
Abstract
This review addresses functional interactions between the primate prefrontal cortex (PFC) and the amygdala, with emphasis on their contributions to behavior and cognition. The interplay between these two telencephalic structures contributes to adaptive behavior and to the evolutionary success of all primate species. In our species, dysfunction in this circuitry creates vulnerabilities to psychopathologies. Here, we describe amygdala-PFC contributions to behaviors that have direct relevance to Darwinian fitness: learned approach and avoidance, foraging, predator defense, and social signaling, which have in common the need for flexibility and sensitivity to specific and rapidly changing contexts. Examples include the prediction of positive outcomes, such as food availability, food desirability, and various social rewards, or of negative outcomes, such as threats of harm from predators or conspecifics. To promote fitness optimally, these stimulus-outcome associations need to be rapidly updated when an associative contingency changes or when the value of a predicted outcome changes. We review evidence from nonhuman primates implicating the PFC, the amygdala, and their functional interactions in these processes, with links to experimental work and clinical findings in humans where possible.
Affiliation(s)
- Lesley K Fellows
- Department of Neurology and Neurosurgery, Montreal Neurological Institute, McGill University, Montreal, QC, Canada
8
Bigelow J, Morrill RJ, Olsen T, Hasenstaub AR. Visual modulation of firing and spectrotemporal receptive fields in mouse auditory cortex. Curr Res Neurobiol 2022; 3:100040. [DOI: 10.1016/j.crneur.2022.100040]
9
Merrikhi Y, Kok MA, Carrasco A, Meredith MA, Lomber SG. Multisensory responses in a belt region of the dorsal auditory cortical pathway. Eur J Neurosci 2021; 55:589-610. [PMID: 34927294 DOI: 10.1111/ejn.15573]
Abstract
A basic function of the cerebral cortex is to receive and integrate information from different sensory modalities into a comprehensive percept of the environment. Neurons that demonstrate multisensory convergence occur across the neocortex, but are especially prevalent in higher-order, association areas. However, a recent study of a cat higher-order auditory area, the dorsal zone (DZ) of auditory cortex, did not observe any multisensory features. Therefore, the goal of the present investigation was to address this conflict using recording and testing methodologies that are established for exposing and studying multisensory neuronal processing. Among the 482 neurons studied, we found that 76.6% were influenced by non-auditory stimuli. Of these neurons, 99% were affected by visual stimulation, but only 11% by somatosensory. Furthermore, a large proportion of the multisensory neurons showed integrated responses to multisensory stimulation, constituted a majority of the excitatory and inhibitory neurons encountered (as identified by the duration of their waveshape), and exhibited a distinct spatial distribution within DZ. These findings demonstrate that the dorsal zone of auditory cortex robustly exhibits multisensory properties and that the proportions of multisensory neurons encountered are consistent with those identified in other higher-order cortices.
Affiliation(s)
- Yaser Merrikhi
- Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Melanie A Kok
- Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
- Andres Carrasco
- Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
- M Alex Meredith
- Department of Anatomy and Neurobiology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia, USA
- Stephen G Lomber
- Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
10
Predictive Feedback, Early Sensory Representations, and Fast Responses to Predicted Stimuli Depend on NMDA Receptors. J Neurosci 2021; 41:10130-10147. [PMID: 34732525 DOI: 10.1523/jneurosci.1311-21.2021]
Abstract
Learned associations between stimuli allow us to model the world and make predictions, crucial for efficient behavior (e.g., hearing a siren, we expect to see an ambulance and quickly make way). While there are theoretical and computational frameworks for prediction, the circuit and receptor-level mechanisms are unclear. Using high-density EEG, Bayesian modeling, and machine learning, we show that inferred "causal" relationships between stimuli and frontal alpha activity account for reaction times (a proxy for predictions) on a trial-by-trial basis in an audiovisual delayed match-to-sample task which elicited predictions. Predictive β feedback activated sensory representations in advance of predicted stimuli. Low-dose ketamine, an NMDAR blocker, but not the control drug dexmedetomidine, perturbed behavioral indices of predictions, their representation in higher-order cortex, feedback to posterior cortex, and pre-activation of sensory templates in higher-order sensory cortex. This study suggests that predictions depend on alpha activity in higher-order cortex, β feedback, and NMDARs, and ketamine blocks access to learned predictive information. SIGNIFICANCE STATEMENT We learn the statistical regularities around us, creating associations between sensory stimuli. These associations can be exploited by generating predictions, which enable fast and efficient behavior. When predictions are perturbed, it can negatively influence perception and even contribute to psychiatric disorders, such as schizophrenia. Here we show that the frontal lobe generates predictions and sends them to posterior brain areas, to activate representations of predicted sensory stimuli before their appearance. Oscillations in neural activity (α and β waves) are vital for these predictive mechanisms. The drug ketamine blocks predictions and the underlying mechanisms. This suggests that the generation of predictions in the frontal lobe, and the feedback pre-activating sensory representations in advance of stimuli, depend on NMDARs.
11
Neuronal activity in the monkey prefrontal cortex during a duration discrimination task with visual and auditory cues. Sci Rep 2021; 11:17520. [PMID: 34471190 PMCID: PMC8410858 DOI: 10.1038/s41598-021-97094-w]
Abstract
To investigate neuronal processing involved in the integration of auditory and visual signals for time perception, we examined neuronal activity in prefrontal cortex (PFC) of macaque monkeys during a duration discrimination task with auditory and visual cues. In the task, two cues were consecutively presented for different durations between 0.2 and 1.8 s. Each cue was either auditory or visual and was followed by a delay period. After the second delay, subjects indicated whether the first or the second cue was longer. Cue- and delay-responsive neurons were found in PFC. Cue-responsive neurons mostly responded to either the auditory or the visual cue, and to either the first or the second cue. The neurons responsive to the first delay showed activity that changed depending on the first cue duration and were mostly sensitive to cue modality. The neurons responsive to the second delay exhibited activity that represented which cue, the first or second cue, was presented longer. Nearly half of this activity representing order-based duration was sensitive to cue modality. These results suggest that temporal information with visual and auditory signals was separately processed in PFC in the early stage of duration discrimination and integrated for the final decision.
12
Rezaul Karim AKM, Proulx MJ, de Sousa AA, Likova LT. Neuroplasticity and Crossmodal Connectivity in the Normal, Healthy Brain. Psychol Neurosci 2021; 14:298-334. [PMID: 36937077 PMCID: PMC10019101 DOI: 10.1037/pne0000258]
Abstract
Objective: Neuroplasticity enables the brain to establish new crossmodal connections or reorganize old connections, which are essential to perceiving a multisensorial world. The intent of this review is to identify and summarize current developments in neuroplasticity and crossmodal connectivity, and to deepen understanding of how crossmodal connectivity develops in the normal, healthy brain, highlighting novel perspectives about the principles that guide this connectivity.
Methods: To the above end, a narrative review was carried out. Data documented in prior relevant studies in neuroscience, psychology, and other related fields, available in a wide range of prominent electronic databases, were critically assessed, synthesized, interpreted with qualitative rather than quantitative elements, and linked together to form new propositions and hypotheses about neuroplasticity and crossmodal connectivity.
Results: Three major themes were identified. First, neuroplasticity appears to operate by following eight fundamental principles, and crossmodal integration by following three. Second, two different forms of crossmodal connectivity, namely direct crossmodal connectivity and indirect crossmodal connectivity, are suggested to operate in both unisensory and multisensory perception. Third, three principles possibly guide the development of crossmodal connectivity into adulthood: the principle of innate crossmodality, the principle of evolution-driven 'neuromodular' reorganization, and the principle of multimodal experience. These principles are combined to develop a three-factor interaction model of crossmodal connectivity.
Conclusions: The hypothesized principles and the proposed model together advance understanding of neuroplasticity, the nature of crossmodal connectivity, and how such connectivity develops in the normal, healthy brain.
13
Ainsworth M, Sallet J, Joly O, Kyriazis D, Kriegeskorte N, Duncan J, Schüffelgen U, Rushworth MFS, Bell AH. Viewing Ambiguous Social Interactions Increases Functional Connectivity between Frontal and Temporal Nodes of the Social Brain. J Neurosci 2021; 41:6070-6086. [PMID: 34099508 PMCID: PMC8276745 DOI: 10.1523/jneurosci.0870-20.2021]
Abstract
Social behavior is coordinated by a network of brain regions, including those involved in the perception of social stimuli and those involved in complex functions, such as inferring perceptual and mental states and controlling social interactions. The properties and function of many of these regions in isolation are relatively well understood, but less is known about how these regions interact while processing dynamic social interactions. To investigate whether the functional connectivity between brain regions is modulated by social context, we collected fMRI data from male monkeys (Macaca mulatta) viewing videos of social interactions labeled as "affiliative," "aggressive," or "ambiguous." We show activation related to the perception of social interactions along both banks of the superior temporal sulcus, parietal cortex, medial and lateral frontal cortex, and the caudate nucleus. Within this network, we show that fronto-temporal functional connectivity is significantly modulated by social context. Crucially, we link the observation of specific behaviors to changes in functional connectivity within our network. Viewing aggressive behavior was associated with a limited increase in temporo-temporal and a weak increase in cingulate-temporal connectivity. By contrast, viewing interactions where the outcome was uncertain was associated with a pronounced increase in temporo-temporal, and cingulate-temporal functional connectivity. We hypothesize that this widespread network synchronization occurs when cingulate and temporal areas coordinate their activity when more difficult social inferences are being made. SIGNIFICANCE STATEMENT Processing social information from our environment requires the activation of several brain regions, which are concentrated within the frontal and temporal lobes. However, little is known about how these areas interact to facilitate the processing of different social interactions. Here we show that functional connectivity within and between the frontal and temporal lobes is modulated by social context. Specifically, we demonstrate that viewing social interactions where the outcome was unclear is associated with increased synchrony within and between the cingulate cortex and temporal cortices. These findings suggest that the coordination between the cingulate and temporal cortices is enhanced when more difficult social inferences are being made.
Affiliation(s)
- Matthew Ainsworth
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, United Kingdom, CB2 7EF
- Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom, OX2 6GG
| | - Jérôme Sallet
- Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom, OX2 6GG
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, United Kingdom, OX3 9DU
- Inserm, Stem Cell and Brain Research Institute U1208, Université Lyon 1, 69500 Bron, France
| | - Olivier Joly
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, United Kingdom, CB2 7EF
| | - Diana Kyriazis
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, United Kingdom, CB2 7EF
| | - Nikolaus Kriegeskorte
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, United Kingdom, CB2 7EF
- Zuckerman Mind Brain Institute, Columbia University, New York, New York, NY 10027
| | - John Duncan
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, United Kingdom, CB2 7EF
- Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom, OX2 6GG
| | - Urs Schüffelgen
- Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom, OX2 6GG
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, United Kingdom, OX3 9DU
| | - Matthew F S Rushworth
- Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom, OX2 6GG
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, United Kingdom, OX3 9DU
| | - Andrew H Bell
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, United Kingdom, CB2 7EF
- Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom, OX2 6GG
- Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford, United Kingdom, OX3 9DU
|
14
|
Phillips JM, Kambi NA, Redinbaugh MJ, Mohanta S, Saalmann YB. Disentangling the influences of multiple thalamic nuclei on prefrontal cortex and cognitive control. Neurosci Biobehav Rev 2021; 128:487-510. [PMID: 34216654 DOI: 10.1016/j.neubiorev.2021.06.042] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Revised: 04/13/2021] [Accepted: 06/09/2021] [Indexed: 10/21/2022]
Abstract
The prefrontal cortex (PFC) has a complex relationship with the thalamus, involving many nuclei which occupy predominantly medial zones along its anterior-to-posterior extent. Thalamocortical neurons in most of these nuclei are modulated by the affective and cognitive signals which funnel through the basal ganglia. We review how PFC-connected thalamic nuclei likely contribute to all aspects of cognitive control: from the processing of information on internal states and goals, facilitating its interactions with mnemonic information and learned values of stimuli and actions, to their influence on high-level cognitive processes, attentional allocation and goal-directed behavior. This includes contributions to transformations such as rule-to-choice (parvocellular mediodorsal nucleus), value-to-choice (magnocellular mediodorsal nucleus), mnemonic-to-choice (anteromedial nucleus) and sensory-to-choice (medial pulvinar). Common mechanisms appear to be thalamic modulation of cortical gain and cortico-cortical functional connectivity. The anatomy also implies a unique role for medial PFC in modulating processing in thalamocortical circuits involving other orbital and lateral PFC regions. We further discuss how cortico-basal ganglia circuits may provide a mechanism through which PFC controls cortico-cortical functional connectivity.
Affiliation(s)
- Jessica M Phillips
- Department of Psychology, University of Wisconsin-Madison, 1202 W Johnson St., Madison, WI 53706, United States.
| | - Niranjan A Kambi
- Department of Psychology, University of Wisconsin-Madison, 1202 W Johnson St., Madison, WI 53706, United States
| | - Michelle J Redinbaugh
- Department of Psychology, University of Wisconsin-Madison, 1202 W Johnson St., Madison, WI 53706, United States
| | - Sounak Mohanta
- Department of Psychology, University of Wisconsin-Madison, 1202 W Johnson St., Madison, WI 53706, United States
| | - Yuri B Saalmann
- Department of Psychology, University of Wisconsin-Madison, 1202 W Johnson St., Madison, WI 53706, United States; Wisconsin National Primate Research Center, University of Wisconsin-Madison, 1202 Capitol Ct., Madison, WI 53715, United States.
|
15
|
Visual response of ventrolateral prefrontal neurons and their behavior-related modulation. Sci Rep 2021; 11:10118. [PMID: 33980932 PMCID: PMC8115110 DOI: 10.1038/s41598-021-89500-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2020] [Accepted: 04/26/2021] [Indexed: 11/08/2022] Open
Abstract
The ventral part of the lateral prefrontal cortex (VLPF) of the monkey receives strong visual input, mainly from inferotemporal cortex. VLPF neurons have been shown to exhibit visual responses during paradigms requiring arbitrary visual cues to be associated with behavioral reactions. Further studies showed that there are also VLPF neurons responding to the presentation of specific visual stimuli, such as objects and faces. However, it is largely unknown whether VLPF neurons respond to and differentiate between stimuli belonging to different categories, even in the absence of a specific requirement to actively categorize these stimuli or to exploit them for choosing a given behavior. The first aim of the present study is to evaluate and map the responses of neurons in a large sector of the VLPF to a wide set of visual stimuli when monkeys simply observe them. Recent studies showed that visual responses to objects are also present in VLPF neurons coding action execution, when these objects are the target of the action. Thus, the second aim of the present study is to compare the visual responses of VLPF neurons when the same objects are simply observed or become the target of a grasping action. Our results indicate that (1) some visually responsive VLPF neurons respond specifically to one stimulus or to a small set of stimuli, but there is no indication of “passive” categorical coding; and (2) VLPF visual responses to objects are often modulated by the task conditions in which the object is observed, with the strongest response occurring when the object is the target of an action. These data indicate that the VLPF performs an early, passive description of several types of visual stimuli, which can then be used for organizing and planning behavior. This could explain the modulation of visual responses both in associative learning and in natural behavior.
|
16
|
Khandhadia AP, Murphy AP, Romanski LM, Bizley JK, Leopold DA. Audiovisual integration in macaque face patch neurons. Curr Biol 2021; 31:1826-1835.e3. [PMID: 33636119 PMCID: PMC8521527 DOI: 10.1016/j.cub.2021.01.102] [Citation(s) in RCA: 26] [Impact Index Per Article: 8.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2020] [Revised: 12/29/2020] [Accepted: 01/28/2021] [Indexed: 12/03/2022]
Abstract
Primate social communication depends on the perceptual integration of visual and auditory cues, reflected in the multimodal mixing of sensory signals in certain cortical areas. The macaque cortical face patch network, identified through visual, face-selective responses measured with fMRI, is assumed to contribute to visual social interactions. However, whether face patch neurons are also influenced by acoustic information, such as the auditory component of a natural vocalization, remains unknown. Here, we recorded single-unit activity in the anterior fundus (AF) face patch, in the superior temporal sulcus, and anterior medial (AM) face patch, on the undersurface of the temporal lobe, in macaques presented with audiovisual, visual-only, and auditory-only renditions of natural movies of macaques vocalizing. The results revealed that 76% of neurons in face patch AF were significantly influenced by the auditory component of the movie, most often through enhancement of visual responses but sometimes in response to the auditory stimulus alone. By contrast, few neurons in face patch AM exhibited significant auditory responses or modulation. Control experiments in AF used an animated macaque avatar to demonstrate, first, that the structural elements of the face were often essential for audiovisual modulation and, second, that the temporal modulation of the acoustic stimulus was more important than its frequency spectrum. Together, these results identify a striking contrast between two face patches and specifically identify AF as playing a potential role in the integration of audiovisual cues during natural modes of social communication.
Affiliation(s)
- Amit P Khandhadia
- Laboratory of Neuropsychology, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA; Ear Institute, University College London, 332 Gray's Inn Road, London WC1X 8EE, UK.
| | - Aidan P Murphy
- Laboratory of Neuropsychology, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA; Neurophysiology Imaging Facility, National Institute of Mental Health, National Institute of Neurological Disorders and Stroke, National Eye Institute, NIH, Bethesda, MD 20892, USA
| | - Lizabeth M Romanski
- Department of Neuroscience, University of Rochester School of Medicine, Rochester, NY 14642, USA
| | - Jennifer K Bizley
- Ear Institute, University College London, 332 Gray's Inn Road, London WC1X 8EE, UK
| | - David A Leopold
- Laboratory of Neuropsychology, National Institute of Mental Health, NIH, Bethesda, MD 20892, USA; Neurophysiology Imaging Facility, National Institute of Mental Health, National Institute of Neurological Disorders and Stroke, National Eye Institute, NIH, Bethesda, MD 20892, USA.
|
17
|
Konoike N, Iwaoki H, Nakamura K. Potent and Quick Responses to Conspecific Faces and Snakes in the Anterior Cingulate Cortex of Monkeys. Front Behav Neurosci 2020; 14:156. [PMID: 33132857 PMCID: PMC7552906 DOI: 10.3389/fnbeh.2020.00156] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2020] [Accepted: 08/07/2020] [Indexed: 11/13/2022] Open
Abstract
Appropriate processing of others’ facial emotions is a fundamental ability of primates in social situations. Several mood and anxiety disorders, such as depression, cause a negative bias in the perception of facial emotions. Depressive patients show abnormalities of activity and gray matter volume in the perigenual portion of the anterior cingulate cortex (ACC) and increased activation in the amygdala. However, it is not known whether neurons in the ACC have a function in the processing of facial emotions. Furthermore, quickly detecting predators and taking avoidance action are matters of life and death for wild monkeys. In the present study, we recorded the activity of single neurons from the monkey ACC and examined the responsiveness of ACC neurons to various visual stimuli, including monkey faces, snakes, foods, and artificial objects. About one-fourth of the recorded neurons showed a significant change in activity in response to the stimuli. The ACC neurons exhibited high selectivity to certain stimuli, and more neurons exhibited their maximal response to monkey faces and snakes than to foods and objects. The responses to monkey faces and snakes were faster and stronger than those to foods and objects. Almost all of the neurons that responded to video stimuli responded strongly to negative facial stimuli, threats, and screams. Most of the responsive neurons were located in the cingulate gyrus or the ventral bank of the cingulate sulcus just above or anterior to the genu of the corpus callosum, that is, the perigenual portion of the ACC, which has a strong mutual connection with the amygdala. These results suggest that the perigenual portion of the ACC, in addition to the amygdala, processes emotional information, especially negative life-and-death information such as conspecifics’ faces and snakes.
Affiliation(s)
- Naho Konoike
- Section of Cognitive Neuroscience, Primate Research Institute, Kyoto University, Inuyama, Japan
| | - Haruhiko Iwaoki
- Section of Cognitive Neuroscience, Primate Research Institute, Kyoto University, Inuyama, Japan
| | - Katsuki Nakamura
- Section of Cognitive Neuroscience, Primate Research Institute, Kyoto University, Inuyama, Japan
|
18
|
Murphy LE, Bachevalier J. Damage to Orbitofrontal Areas 12 and 13, but Not Area 14, Results in Blunted Attention and Arousal to Socioemotional Stimuli in Rhesus Macaques. Front Behav Neurosci 2020; 14:150. [PMID: 33093825 PMCID: PMC7506161 DOI: 10.3389/fnbeh.2020.00150] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2020] [Accepted: 08/03/2020] [Indexed: 12/12/2022] Open
Abstract
An earlier study in monkeys indicated that lesions to the mid-portion of the ventral orbitofrontal cortex (OFC), including Walker’s areas 11 and 13 (OFC11/13), altered the spontaneous scanning of still pictures of primate faces (neutral and emotional) and the modulation of arousal. Yet, these conclusions were limited by several shortcomings, including the lesion approach, use of static rather than dynamic stimuli, and manual data analyses. To confirm and extend these earlier findings, we compared attention and arousal to social and nonsocial scenes in three groups of rhesus macaques with restricted lesions to one of three OFC areas (OFC12, OFC13, or OFC14) and a sham-operated control group using eye-tracking to capture scanning patterns, focal attention and pupil size. Animals with damage to the lateral OFC areas (OFC12 and OFC13) showed decreased attention specifically to the eyes of negative (threatening) social stimuli and increased arousal (increased pupil diameter) to positive social scenes. In contrast, animals with damage to the ventromedial OFC area (OFC14) displayed no differences in attention or arousal in the presence of social stimuli compared to controls. These findings support the notion that areas of the lateral OFC are critical for directing attention and modulating arousal to emotional social cues. Together with the existence of face-selective neurons in these lateral OFC areas, the data suggest that the lateral OFC may set the stage for multidimensional information processing related to faces and emotion and may be involved in social judgments.
Affiliation(s)
- Lauren E Murphy
- Department of Psychology, Emory College of Arts and Sciences, Emory University, Atlanta, GA, United States
| | - Jocelyne Bachevalier
- Department of Psychology, Emory College of Arts and Sciences, Emory University, Atlanta, GA, United States.,Yerkes National Primate Research Center, Emory University, Atlanta, GA, United States
|
19
|
Ferreiro DN, Amaro D, Schmidtke D, Sobolev A, Gundi P, Belliveau L, Sirota A, Grothe B, Pecka M. Sensory Island Task (SIT): A New Behavioral Paradigm to Study Sensory Perception and Neural Processing in Freely Moving Animals. Front Behav Neurosci 2020; 14:576154. [PMID: 33100981 PMCID: PMC7546252 DOI: 10.3389/fnbeh.2020.576154] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/25/2020] [Accepted: 08/27/2020] [Indexed: 11/17/2022] Open
Abstract
A central function of sensory systems is the gathering of information about dynamic interactions with the environment during self-motion. Determining whether the modulation of a sensory cue was externally caused or a result of self-motion is fundamental to perceptual invariance and requires sensory processing to be continuously updated with information about recent movements. This process is highly context-dependent and crucial for perceptual performances such as decision-making and sensory object formation. Yet despite its fundamental ecological role, voluntary self-motion is rarely incorporated into perceptual or neurophysiological investigations of sensory processing in animals. Here, we present the Sensory Island Task (SIT), a new freely moving search paradigm to study sensory processing and perception. In SIT, animals explore an open-field arena to find a sensory target relying solely on changes in the presented stimulus, which is controlled by closed-loop position tracking in real time. Within a few sessions, animals are trained via positive reinforcement to search for a particular area in the arena (“target island”), which triggers the presentation of the target stimulus. The location of the target island is randomized across trials, making the modulated stimulus feature the only informative cue for task completion. Animals report detection of the target stimulus by remaining within the island for a defined time (“sit-time”). Multiple “non-target” islands can be incorporated to test psychometric discrimination and identification performance. We exemplify the suitability of SIT for rodents (Mongolian gerbil, Meriones unguiculatus) and small primates (mouse lemur, Microcebus murinus) and for studying various sensory perceptual performances (auditory frequency discrimination, sound source localization, visual orientation discrimination).
Furthermore, we show that pairing SIT with chronic electrophysiological recordings reveals neuronal signatures of sensory processing under ecologically relevant conditions during goal-oriented behavior. In conclusion, SIT represents a flexible and easily implementable behavioral paradigm for mammals that combines self-motion and natural exploratory behavior to study sensory sensitivity, decision-making, and their underlying neuronal processing.
Affiliation(s)
- Dardo N Ferreiro
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Munich, Germany.,Department of General Psychology and Education, Ludwig-Maximilians-Universität München, Munich, Germany
| | - Diana Amaro
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Munich, Germany.,Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Munich, Germany
| | - Daniel Schmidtke
- Institute of Zoology, University of Veterinary Medicine Hannover, Hanover, Germany
| | - Andrey Sobolev
- Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Munich, Germany
| | - Paula Gundi
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Munich, Germany.,Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Munich, Germany
| | - Lucile Belliveau
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Munich, Germany
| | - Anton Sirota
- Faculty of Medicine, Bernstein Center for Computational Neuroscience Munich, Munich Cluster of Systems Neurology (SyNergy), Ludwig-Maximilians-Universität München, Munich, Germany
| | - Benedikt Grothe
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Munich, Germany.,Graduate School of Systemic Neurosciences, Ludwig-Maximilians-Universität München, Munich, Germany
| | - Michael Pecka
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Munich, Germany
|
20
|
Fu D, Weber C, Yang G, Kerzel M, Nan W, Barros P, Wu H, Liu X, Wermter S. What Can Computational Models Learn From Human Selective Attention? A Review From an Audiovisual Unimodal and Crossmodal Perspective. Front Integr Neurosci 2020; 14:10. [PMID: 32174816 PMCID: PMC7056875 DOI: 10.3389/fnint.2020.00010] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2019] [Accepted: 02/11/2020] [Indexed: 11/13/2022] Open
Abstract
Selective attention plays an essential role in information acquisition and utilization from the environment. In the past 50 years, research on selective attention has been a central topic in cognitive science. Compared with unimodal studies, crossmodal studies are more complex but necessary to solve real-world challenges in both human experiments and computational modeling. Although an increasing number of findings on crossmodal selective attention have shed light on humans' behavioral patterns and neural underpinnings, a much better understanding is still necessary to yield the same benefit for intelligent computational agents. This article reviews studies of selective attention in unimodal visual and auditory and crossmodal audiovisual setups from the multidisciplinary perspectives of psychology and cognitive neuroscience, and evaluates different ways to simulate analogous mechanisms in computational models and robotics. We discuss the gaps between these fields in this interdisciplinary review and provide insights about how to use psychological findings and theories in artificial intelligence from different perspectives.
Affiliation(s)
- Di Fu
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Department of Informatics, University of Hamburg, Hamburg, Germany
| | - Cornelius Weber
- Department of Informatics, University of Hamburg, Hamburg, Germany
| | - Guochun Yang
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
| | - Matthias Kerzel
- Department of Informatics, University of Hamburg, Hamburg, Germany
| | - Weizhi Nan
- Department of Psychology, Center for Brain and Cognitive Sciences, School of Education, Guangzhou University, Guangzhou, China
| | - Pablo Barros
- Department of Informatics, University of Hamburg, Hamburg, Germany
| | - Haiyan Wu
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
| | - Xun Liu
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
| | - Stefan Wermter
- Department of Informatics, University of Hamburg, Hamburg, Germany
|
21
|
Multisensory Neurons in the Primate Amygdala. J Neurosci 2019; 39:3663-3675. [PMID: 30858163 DOI: 10.1523/jneurosci.2903-18.2019] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/11/2018] [Revised: 02/12/2019] [Accepted: 02/13/2019] [Indexed: 11/21/2022] Open
Abstract
Animals identify, interpret, and respond to complex, natural signals that are often multisensory. The ability to integrate signals across sensory modalities depends on the convergence of sensory inputs at the level of single neurons. Neurons in the amygdala are expected to be multisensory because they respond to complex, natural stimuli, and the amygdala receives inputs from multiple sensory areas. We recorded activity from the amygdala of 2 male monkeys (Macaca mulatta) in response to visual, tactile, and auditory stimuli. Although the stimuli were devoid of inherent emotional or social significance and were not paired with rewards or punishments, the majority of neurons that responded to these stimuli were multisensory. Selectivity for sensory modality was stronger and emerged earlier than selectivity for individual items within a sensory modality. Modality and item selectivity were expressed via three main spike-train metrics: (1) response magnitude, (2) response polarity, and (3) response duration. None of these metrics were unique to a particular sensory modality; rather, each neuron responded with distinct combinations of spike-train metrics to discriminate sensory modalities and items within a modality. The relative proportion of multisensory neurons was similar across the nuclei of the amygdala. The convergence of inputs from multiple sensory modalities at the level of single neurons in the amygdala lies at the foundation of multisensory integration. The integration of visual, auditory, and tactile inputs in the amygdala may serve social communication by binding together social signals carried by facial expressions, vocalizations, and social grooming.
SIGNIFICANCE STATEMENT Our brain continuously decodes information detected by multiple sensory systems. The emotional and social significance of the incoming signals is likely extracted by the amygdala, which receives input from all sensory domains.
Here we show that a large portion of neurons in the amygdala respond to stimuli from two or more sensory modalities. The convergence of visual, tactile, and auditory signals at the level of individual neurons in the amygdala establishes a foundation for multisensory integration within this structure. The ability to integrate signals across sensory modalities is critical for social communication and other high-level cognitive functions.
|
22
|
Peiffer-Smadja N, Cohen L. The cerebral bases of the bouba-kiki effect. Neuroimage 2019; 186:679-689. [PMID: 30503933 DOI: 10.1016/j.neuroimage.2018.11.033] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2018] [Revised: 11/18/2018] [Accepted: 11/21/2018] [Indexed: 11/30/2022] Open
Abstract
The crossmodal correspondence between some speech sounds and some geometrical shapes, known as the bouba-kiki (BK) effect, constitutes a remarkable exception to the general arbitrariness of the links between word meaning and word sounds. We have analyzed the association of shapes and sounds in order to determine whether it occurs at a perceptual or at a decisional level, and whether it takes place in sensory cortices or in supramodal regions. First, using an Implicit Association Test (IAT), we have shown that the BK effect may occur without participants making any explicit decision relative to sound-shape associations. Second, looking for the brain correlates of implicit BK matching, we have found that intermodal matching influences activations in both auditory and visual sensory cortices. Moreover, we found stronger prefrontal activation to mismatching than to matching stimuli, presumably reflecting a modulation of executive processes by crossmodal correspondence. Thus, through its roots in the physiology of object categorization and crossmodal matching, the BK effect provides a unique insight into some non-linguistic components of word formation.
Affiliation(s)
- Nathan Peiffer-Smadja
- Institut du Cerveau et de la Moelle épinière, ICM, Inserm U 1127, CNRS UMR 7225, Sorbonne Université, F-75013, Paris, France
| | - Laurent Cohen
- Institut du Cerveau et de la Moelle épinière, ICM, Inserm U 1127, CNRS UMR 7225, Sorbonne Université, F-75013, Paris, France; Département de Neurologie 1, Hôpital de la Pitié Salpêtrière, AP-HP, F-75013, Paris, France.
|
23
|
Abstract
Perceiving social and emotional information from faces is a critical primate skill. For this purpose, primates evolved dedicated cortical architecture, especially in occipitotemporal areas, utilizing face-selective cells. Face-selective neurons are also present in the orbitofrontal cortex (OFC), where they are less well understood; these neurons are our object of study. We examined 179 face-selective cells in the lateral sulcus of the OFC by characterizing their responses to a rich set of photographs of conspecific faces varying in age, gender, and facial expression. Principal component analysis and unsupervised cluster analysis of the stimulus space both revealed that face cells encode face dimensions for social categories and emotions. Strongly represented categories were facial expressions (grin and threat versus lip smack), juvenile monkeys, and female monkeys. Cluster analyses of a control population of nearby cells lacking face selectivity did not categorize face stimuli in a meaningful way, suggesting that only face-selective cells directly support face categorization in the OFC. Time course analyses of face cell activity from stimulus onset showed that faces were discriminated from nonfaces early, followed by within-face categorization of social and emotional content (i.e., age and facial expression). Face cells showed no response to acoustic stimuli such as vocalizations and were poorly modulated by vocalizations added to faces. Neuronal responses remained stable when paired with positive or negative reinforcement, implying that face cells encode social information but not learned reward value associated with faces. Overall, our results shed light on a substantial role of the OFC in the characterization of facial information bearing on social and emotional behavior.
|
24
|
Aboitiz F. A Brain for Speech. Evolutionary Continuity in Primate and Human Auditory-Vocal Processing. Front Neurosci 2018; 12:174. [PMID: 29636657 PMCID: PMC5880940 DOI: 10.3389/fnins.2018.00174] [Citation(s) in RCA: 25] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2017] [Accepted: 03/05/2018] [Indexed: 12/27/2022] Open
Abstract
In this review article, I propose a continuous evolution from the auditory-vocal apparatus and its mechanisms of neural control in non-human primates to the peripheral organs and neural control of human speech. Although there is an overall conservatism both in peripheral systems and in central neural circuits, a few changes were critical for the expansion of vocal plasticity and the elaboration of proto-speech in early humans. Two of the most relevant changes were the acquisition of direct cortical control of the vocal fold musculature and the consolidation of an auditory-vocal articulatory circuit, encompassing auditory areas in the temporoparietal junction and prefrontal and motor areas in the frontal cortex. This articulatory loop, also referred to as the phonological loop, enhanced vocal working memory capacity, enabling early humans to learn increasingly complex utterances. The auditory-vocal circuit became progressively coupled to multimodal systems conveying information about objects and events, which gradually led to the acquisition of modern speech. Gestural communication has accompanied the development of vocal communication since very early in human evolution, and although the two systems initially co-evolved tightly, at some point speech became the main channel of communication.
Affiliation(s)
- Francisco Aboitiz
- Centro Interdisciplinario de Neurociencias, Escuela de Medicina, Pontificia Universidad Católica de Chile, Santiago, Chile
| |
|
25
|
Borra E, Ferroni CG, Gerbella M, Giorgetti V, Mangiaracina C, Rozzi S, Luppino G. Rostro-caudal Connectional Heterogeneity of the Dorsal Part of the Macaque Prefrontal Area 46. Cereb Cortex 2017; 29:485-504. [DOI: 10.1093/cercor/bhx332] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/31/2017] [Accepted: 11/20/2017] [Indexed: 11/13/2022] Open
Affiliation(s)
- Elena Borra
- Department of Medicine and Surgery, Neuroscience Unit, University of Parma, via Volturno 39, Parma, Italy
| | - Carolina Giulia Ferroni
- Department of Medicine and Surgery, Neuroscience Unit, University of Parma, via Volturno 39, Parma, Italy
| | - Marzio Gerbella
- Istituto Italiano di Tecnologia (IIT), Center for Biomolecular Nanotechnologies, via Eugenio Barsanti, Arnesano, Lecce, Italy
| | - Valentina Giorgetti
- Department of Medicine and Surgery, Neuroscience Unit, University of Parma, via Volturno 39, Parma, Italy
| | - Chiara Mangiaracina
- Department of Medicine and Surgery, Neuroscience Unit, University of Parma, via Volturno 39, Parma, Italy
| | - Stefano Rozzi
- Department of Medicine and Surgery, Neuroscience Unit, University of Parma, via Volturno 39, Parma, Italy
| | - Giuseppe Luppino
- Department of Medicine and Surgery, Neuroscience Unit, University of Parma, via Volturno 39, Parma, Italy
| |
|
26
|
Ohshiro T, Angelaki DE, DeAngelis GC. A Neural Signature of Divisive Normalization at the Level of Multisensory Integration in Primate Cortex. Neuron 2017; 95:399-411.e8. [PMID: 28728025 DOI: 10.1016/j.neuron.2017.06.043] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2017] [Revised: 06/19/2017] [Accepted: 06/26/2017] [Indexed: 10/19/2022]
Abstract
Studies of multisensory integration by single neurons have traditionally emphasized empirical principles that describe nonlinear interactions between inputs from two sensory modalities. We previously proposed that many of these empirical principles could be explained by a divisive normalization mechanism operating in brain regions where multisensory integration occurs. This normalization model makes a critical diagnostic prediction: a non-preferred sensory input from one modality, which activates the neuron on its own, should suppress the response to a preferred input from another modality. We tested this prediction by recording from neurons in macaque area MSTd that integrate visual and vestibular cues regarding self-motion. We show that many MSTd neurons exhibit the diagnostic form of cross-modal suppression, whereas unisensory neurons in area MT do not. The normalization model also fits population responses better than a model based on subtractive inhibition. These findings provide strong support for a divisive normalization mechanism in multisensory integration.
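The cross-modal suppression that the normalization model predicts can be reproduced in a toy simulation. The sketch below is illustrative only: the exponent `n`, semisaturation constant `alpha`, and the linear trade-off of modality weights are assumed values, not parameters fitted in the study.

```python
import numpy as np

# Toy normalization pool: 11 model neurons whose visual/vestibular input
# weights trade off linearly (w_vis from 0 to 1, w_ves = 1 - w_vis).
W_VIS = np.linspace(0.0, 1.0, 11)
W_VES = 1.0 - W_VIS

def responses(vis, ves, n=2.0, alpha=1.0):
    """Divisive normalization: each neuron's weighted drive is raised to a
    power and divided by the population-averaged activity plus a
    semisaturation term."""
    drive = W_VIS * vis + W_VES * ves
    return drive**n / (alpha**n + np.mean(drive**n))

# A strongly visual neuron (index -1, w_vis = 1): its preferred input is visual,
# and the vestibular stimulus is non-preferred.
r_vis_only = responses(vis=1.0, ves=0.0)[-1]
r_bimodal = responses(vis=1.0, ves=1.0)[-1]

# Adding the non-preferred vestibular stimulus leaves this neuron's own drive
# unchanged but inflates the normalization pool, so the combined response
# falls below the unisensory one: cross-modal suppression.
assert r_bimodal < r_vis_only
```

Under these assumed parameters the bimodal response drops from roughly 0.74 to 0.5 in arbitrary units, illustrating why a non-preferred input that drives the pool more than the neuron itself produces suppression rather than facilitation.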
Affiliation(s)
- Tomokazu Ohshiro
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14611, USA; Department of Physiology, Tohoku University School of Medicine, Sendai 980-8575, Japan
| | - Dora E Angelaki
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
| | - Gregory C DeAngelis
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14611, USA.
| |
|
27
|
Rozzi S, Fogassi L. Neural Coding for Action Execution and Action Observation in the Prefrontal Cortex and Its Role in the Organization of Socially Driven Behavior. Front Neurosci 2017; 11:492. [PMID: 28936159 PMCID: PMC5594103 DOI: 10.3389/fnins.2017.00492] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2017] [Accepted: 08/22/2017] [Indexed: 11/13/2022] Open
Abstract
The lateral prefrontal cortex (LPF) plays a fundamental role in planning, organizing, and optimizing behavioral performance. Neuroanatomical and neurophysiological studies have suggested that in this cortical sector, information processing becomes more abstract when moving from caudal to rostral and that such processing involves parietal and premotor areas. We review studies that have shown that the LPF, in addition to its involvement in implementing rules and setting behavioral goals, activates during the execution of forelimb movements even in the absence of a learned relationship between an instruction and its associated motor output. Thus, we propose that the prefrontal cortex is involved in exploiting contextual information for planning and guiding behavioral responses, also in natural situations. Among contextual cues, those provided by others' actions are particularly relevant for social interactions. Functional studies of macaques have demonstrated that the LPF is activated by the observation of biological stimuli, in particular those related to goal-directed actions. We review these studies and discuss the idea that the prefrontal cortex codes high-order representations of observed actions rather than simple visual descriptions of them. Based on evidence that the same sector of the LPF contains both neurons coding own action goals and neurons coding others' goals, we propose that this sector is involved in the selection of own actions appropriate for reacting in a particular social context and for the creation of new action sequences in imitative learning.
Affiliation(s)
- Stefano Rozzi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
| | - Leonardo Fogassi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
| |
|
28
|
Gothard KM, Mosher CP, Zimmerman PE, Putnam PT, Morrow JK, Fuglevand AJ. New perspectives on the neurophysiology of primate amygdala emerging from the study of naturalistic social behaviors. WILEY INTERDISCIPLINARY REVIEWS. COGNITIVE SCIENCE 2017; 9. [PMID: 28800678 DOI: 10.1002/wcs.1449] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/14/2017] [Revised: 06/03/2017] [Accepted: 06/05/2017] [Indexed: 11/07/2022]
Abstract
A major challenge of primate neurophysiology, particularly in the domain of social neuroscience, is to adopt more natural behaviors without compromising the ability to relate patterns of neural activity to specific actions or sensory inputs. Traditional approaches have identified neural activity patterns in the amygdala in response to simplified versions of social stimuli, such as static images of faces. As a departure from this reduced approach, single images of faces were replaced with arrays of images or videos of conspecifics. These stimuli elicited more natural behaviors and new types of neural responses: (1) attention-gated responses to faces, (2) selective responses to eye contact, and (3) selective responses to touch and somatosensory feedback during the production of facial expressions. An additional advance toward more natural social behaviors in the laboratory was the implementation of dyadic social interactions. Under these conditions, neurons similarly encoded rewards that monkeys delivered to themselves and to their social partner. These findings reinforce the value of bringing natural, ethologically valid behavioral tasks under neurophysiological scrutiny. This article is categorized under: Psychology > Emotion and Motivation; Neuroscience > Cognition; Neuroscience > Physiology.
Affiliation(s)
- Katalin M Gothard
- Department of Physiology, College of Medicine, University of Arizona, Tucson, AZ, USA
| | - Clayton P Mosher
- Department of Physiology, College of Medicine, University of Arizona, Tucson, AZ, USA
| | - Prisca E Zimmerman
- Department of Physiology, College of Medicine, University of Arizona, Tucson, AZ, USA
| | - Philip T Putnam
- Department of Physiology, College of Medicine, University of Arizona, Tucson, AZ, USA
| | - Jeremiah K Morrow
- Department of Physiology, College of Medicine, University of Arizona, Tucson, AZ, USA
| | - Andrew J Fuglevand
- Department of Physiology, College of Medicine, University of Arizona, Tucson, AZ, USA
| |
|
29
|
Nieder A. Magnitude Codes for Cross-Modal Working Memory in the Primate Frontal Association Cortex. Front Neurosci 2017; 11:202. [PMID: 28439225 PMCID: PMC5383665 DOI: 10.3389/fnins.2017.00202] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2017] [Accepted: 03/24/2017] [Indexed: 11/13/2022] Open
Abstract
Quantitative features of stimuli may be ordered along a magnitude continuum, or line. Magnitude refers to parameters of different types of stimulus properties. For instance, the frequency of a sound relates to a sensory and continuous stimulus property, whereas the number of items in a set is an abstract and discrete property. In addition, within a stimulus property, magnitudes need to be processed not only in one modality but across multiple modalities. In the sensory domain, for example, magnitude applies both to the frequency of auditory sounds and to that of tactile vibrations. Similarly, both the number of visual items and the number of acoustic events constitute numerical quantity, or numerosity. To support goal-directed behavior and executive functions across time, magnitudes need to be held in working memory, the ability to briefly retain and manipulate information in mind. How different types of magnitudes across multiple modalities are represented in working memory by single neurons has only recently been explored in primates. These studies show that neurons in the frontal lobe can encode the same magnitude type across sensory modalities. However, while multimodal sensory magnitude in relative comparison tasks is represented by monotonically increasing or decreasing response functions ("summation code"), multimodal numerical quantity in absolute matching tasks is encoded by neurons tuned to preferred numerosities ("labeled-line code"). These findings indicate that there is most likely not a single type of cross-modal working-memory code for magnitudes, but rather a flexible code that depends on the stimulus dimension as well as on the task requirements.
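The two coding schemes contrasted in this abstract can be sketched as simple response functions. This is an illustrative sketch only: the gain, baseline, tuning width, and preferred numerosity below are assumed values, not recorded tuning parameters.

```python
import numpy as np

def summation_code(magnitude, gain=0.5, baseline=0.1):
    # Monotonic rate code: firing rate rises (or falls) with magnitude,
    # so magnitude is read out from overall activity level.
    return baseline + gain * np.asarray(magnitude, dtype=float)

def labeled_line_code(magnitude, preferred, sigma=1.0):
    # Tuned code: each neuron fires maximally at its preferred numerosity
    # and falls off for neighboring values (Gaussian tuning curve).
    m = np.asarray(magnitude, dtype=float)
    return np.exp(-((m - preferred) ** 2) / (2.0 * sigma**2))

magnitudes = np.arange(1, 9)
mono = summation_code(magnitudes)                    # strictly increasing
tuned = labeled_line_code(magnitudes, preferred=4)   # peaks at numerosity 4
```

A population of such tuned neurons with different `preferred` values forms the "labeled line", whereas a summation-code population signals magnitude through its summed monotonic output.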
Affiliation(s)
- Andreas Nieder
- Animal Physiology Unit, Institute of Neurobiology, University of Tübingen, Tübingen, Germany
| |
|
30
|
Hage SR, Nieder A. Dual Neural Network Model for the Evolution of Speech and Language. Trends Neurosci 2016; 39:813-829. [DOI: 10.1016/j.tins.2016.10.006] [Citation(s) in RCA: 89] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2016] [Revised: 10/14/2016] [Accepted: 10/20/2016] [Indexed: 12/31/2022]
|
31
|
Plakke B, Romanski LM. Neural circuits in auditory and audiovisual memory. Brain Res 2016; 1640:278-88. [PMID: 26656069 PMCID: PMC4868791 DOI: 10.1016/j.brainres.2015.11.042] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/09/2015] [Revised: 10/28/2015] [Accepted: 11/25/2015] [Indexed: 01/01/2023]
Abstract
Working memory is the ability to employ recently seen or heard stimuli and apply them to a changing cognitive context. Although much is known about language processing and visual working memory, the neurobiological basis of auditory working memory is less clear. Historically, part of the problem has been the difficulty of obtaining a robust animal model for studying auditory short-term memory. In recent years, neurophysiological and lesion studies have indicated a cortical network involving both temporal and frontal cortices. Studies specifically targeting the role of the prefrontal cortex (PFC) in auditory working memory have suggested that dorsal and ventral prefrontal regions perform different roles during the processing of auditory mnemonic information, with the dorsolateral PFC performing similar functions for both auditory and visual working memory. In contrast, the ventrolateral PFC (VLPFC), which contains cells that respond robustly to auditory stimuli and that process both face and vocal stimuli, may be an essential locus for both auditory and audiovisual working memory. These findings suggest a critical role for the VLPFC in processing, integrating, and retaining communication information. This article is part of a Special Issue entitled SI: Auditory working memory.
Affiliation(s)
- B Plakke
- University of Rochester School of Medicine & Dentistry, Department of Neurobiology & Anatomy, United States.
| | - L M Romanski
- University of Rochester School of Medicine & Dentistry, Department of Neurobiology & Anatomy, United States.
| |
|
32
|
Tsilionis E, Vatakis A. Multisensory binding: is the contribution of synchrony and semantic congruency obligatory? Curr Opin Behav Sci 2016. [DOI: 10.1016/j.cobeha.2016.01.002] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
33
|
Bizley JK, Maddox RK, Lee AKC. Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms. Trends Neurosci 2016; 39:74-85. [PMID: 26775728 PMCID: PMC4738154 DOI: 10.1016/j.tins.2015.12.007] [Citation(s) in RCA: 50] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2015] [Revised: 12/03/2015] [Accepted: 12/11/2015] [Indexed: 11/30/2022]
Abstract
Crossmodal integration is a term applicable to many phenomena in which one sensory modality influences task performance or perception in another sensory modality. We distinguish the term binding as one that should be reserved specifically for the process that underpins perceptual object formation. To unambiguously differentiate binding from other types of integration, behavioral and neural studies must investigate perception of a feature orthogonal to the features that link the auditory and visual stimuli. We argue that supporting true perceptual binding (as opposed to other processes such as decision-making) is one role for cross-sensory influences in early sensory cortex. These early multisensory interactions may therefore form a physiological substrate for the bottom-up grouping of auditory and visual stimuli into auditory-visual (AV) objects.
- Crossmodal integration and binding have been treated as synonymous in the literature, with no clear delineation between perceptual changes and other interactions such as decision-making.
- Crossmodal binding is proposed as a distinct form of integration leading to multisensory object formation.
- Multisensory stimuli are most beneficial in noisy situations, but few studies use stimulus competition to investigate the processes underpinning multisensory integration.
- Evidence suggests that both visual and auditory attention are object-based: all features within an object are enhanced, and there is a cost to attending features across versus within objects.
- Multisensory interactions can be observed throughout the brain, including early sensory cortex. The role of early sensory cortex in multisensory integration is unknown but may underlie crossmodal binding.
Affiliation(s)
- Jennifer K Bizley
- University College London (UCL) Ear Institute, 332 Gray's Inn Road, London, WC1X 8EE, UK.
| | - Ross K Maddox
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA
| | - Adrian K C Lee
- Institute for Learning and Brain Sciences, University of Washington, 1715 NE Columbia Road, Portage Bay Building, Box 357988, Seattle, WA 98195, USA; Department of Speech and Hearing Sciences, University of Washington, 1417 NE 42nd Street, Eagleson Hall, Box 354875, Seattle, WA 98105, USA.
| |
|
34
|
Abstract
Complex audio-vocal integration systems depend on a strong interconnection between the auditory and the vocal motor system. To gain cognitive control over audio-vocal interaction during vocal motor control, the PFC needs to be involved. Neurons in the ventrolateral PFC (VLPFC) have been shown to separately encode the sensory perceptions and motor production of vocalizations. It is unknown, however, whether single neurons in the PFC reflect audio-vocal interactions. We therefore recorded single-unit activity in the VLPFC of rhesus monkeys (Macaca mulatta) while they produced vocalizations on command or passively listened to monkey calls. We found that 12% of randomly selected neurons in VLPFC modulated their discharge rate in response to acoustic stimulation with species-specific calls. Almost three-fourths of these auditory neurons showed an additional modulation of their discharge rates either before and/or during the monkeys' motor production of vocalization. Based on these audio-vocal interactions, the VLPFC might be well positioned to combine higher order auditory processing with cognitive control of the vocal motor output. Such audio-vocal integration processes in the VLPFC might constitute a precursor for the evolution of complex learned audio-vocal integration systems, ultimately giving rise to human speech.
|
35
|
Lee M, Blake R, Kim S, Kim CY. Melodic sound enhances visual awareness of congruent musical notes, but only if you can read music. Proc Natl Acad Sci U S A 2015; 112:8493-8. [PMID: 26077907 PMCID: PMC4500286 DOI: 10.1073/pnas.1509529112] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Predictive influences of auditory information on resolution of visual competition were investigated using music, whose visual symbolic notation is familiar only to those with musical training. Results from two experiments using different experimental paradigms revealed that melodic congruence between what is seen and what is heard impacts perceptual dynamics during binocular rivalry. This bisensory interaction was observed only when the musical score was perceptually dominant, not when it was suppressed from awareness, and it was observed only in people who could read music. Results from two ancillary experiments showed that this effect of congruence cannot be explained by differential patterns of eye movements or by differential response sluggishness associated with congruent score/melody combinations. Taken together, these results demonstrate robust audiovisual interaction based on high-level, symbolic representations and its predictive influence on perceptual dynamics during binocular rivalry.
Affiliation(s)
- Minyoung Lee
- Department of Psychology, Korea University, Seoul 136701, Korea
| | - Randolph Blake
- Department of Psychological Sciences, Vanderbilt Vision Research Center, Vanderbilt University, Nashville, TN 37240; Department of Brain and Cognitive Sciences, Seoul National University, Seoul 151742, Korea
| | - Sujin Kim
- Department of Psychology, Korea University, Seoul 136701, Korea;
| | - Chai-Youn Kim
- Department of Psychology, Korea University, Seoul 136701, Korea;
| |
|
36
|
Plakke B, Hwang J, Romanski LM. Inactivation of Primate Prefrontal Cortex Impairs Auditory and Audiovisual Working Memory. J Neurosci 2015; 35:9666-75. [PMID: 26134649 PMCID: PMC4571503 DOI: 10.1523/jneurosci.1218-15.2015] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/27/2015] [Revised: 05/21/2015] [Accepted: 05/27/2015] [Indexed: 11/21/2022] Open
Abstract
The prefrontal cortex is associated with cognitive functions that include planning, reasoning, decision-making, working memory, and communication. Neurophysiology and neuropsychology studies have established that the dorsolateral prefrontal cortex is essential in spatial working memory while the ventral frontal lobe processes language and communication signals. Single-unit recordings in nonhuman primates have shown that ventrolateral prefrontal (VLPFC) neurons integrate face and vocal information and are active during audiovisual working memory. However, whether VLPFC is essential in remembering face and voice information is unknown. We therefore trained nonhuman primates in an audiovisual working memory paradigm using naturalistic face-vocalization movies as memoranda. We inactivated VLPFC with reversible cortical cooling and examined performance when faces, vocalizations, or both faces and vocalizations had to be remembered. We found that VLPFC inactivation impaired subjects' performance in audiovisual and auditory-alone versions of the task. In contrast, VLPFC inactivation did not disrupt visual working memory. Our studies demonstrate the importance of VLPFC in auditory and audiovisual working memory for social stimuli but suggest a different role for VLPFC in unimodal visual processing. SIGNIFICANCE STATEMENT The ventral frontal lobe, or inferior frontal gyrus, plays an important role in audiovisual communication in the human brain. Studies with nonhuman primates have found that neurons within ventral prefrontal cortex (VLPFC) encode both faces and vocalizations and that VLPFC is active when animals need to remember these social stimuli. In the present study, we temporarily inactivated VLPFC by cooling the cortex while nonhuman primates performed a working memory task. This impaired the ability of subjects to remember a face and vocalization pair or just the vocalization alone. Our work highlights the importance of the primate VLPFC in the processing of faces and vocalizations in a manner that is similar to the inferior frontal gyrus in the human brain.
Affiliation(s)
- Bethany Plakke
- University of Rochester School of Medicine and Dentistry, Department of Neurobiology and Anatomy, Rochester, New York 14642
| | - Jaewon Hwang
- University of Rochester School of Medicine and Dentistry, Department of Neurobiology and Anatomy, Rochester, New York 14642
| | - Lizabeth M Romanski
- University of Rochester School of Medicine and Dentistry, Department of Neurobiology and Anatomy, Rochester, New York 14642
| |
|
37
|
Ortiz-Rios M, Kuśmierek P, DeWitt I, Archakov D, Azevedo FAC, Sams M, Jääskeläinen IP, Keliris GA, Rauschecker JP. Functional MRI of the vocalization-processing network in the macaque brain. Front Neurosci 2015; 9:113. [PMID: 25883546 PMCID: PMC4381638 DOI: 10.3389/fnins.2015.00113] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2014] [Accepted: 03/17/2015] [Indexed: 12/12/2022] Open
Abstract
Using functional magnetic resonance imaging in awake behaving monkeys we investigated how species-specific vocalizations are represented in auditory and auditory-related regions of the macaque brain. We found clusters of active voxels along the ascending auditory pathway that responded to various types of complex sounds: inferior colliculus (IC), medial geniculate nucleus (MGN), auditory core, belt, and parabelt cortex, and other parts of the superior temporal gyrus (STG) and sulcus (STS). Regions sensitive to monkey calls were most prevalent in the anterior STG, but some clusters were also found in frontal and parietal cortex on the basis of comparisons between responses to calls and environmental sounds. Surprisingly, we found that spectrotemporal control sounds derived from the monkey calls (“scrambled calls”) also activated the parietal and frontal regions. Taken together, our results demonstrate that species-specific vocalizations in rhesus monkeys activate preferentially the auditory ventral stream, and in particular areas of the antero-lateral belt and parabelt.
Affiliation(s)
- Michael Ortiz-Rios
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, USA; Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany; IMPRS for Cognitive and Systems Neuroscience, Tübingen, Germany
| | - Paweł Kuśmierek
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, USA
| | - Iain DeWitt
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, USA
| | - Denis Archakov
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, USA; Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Aalto, Finland
| | - Frederico A C Azevedo
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany; IMPRS for Cognitive and Systems Neuroscience, Tübingen, Germany
| | - Mikko Sams
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Aalto, Finland
| | - Iiro P Jääskeläinen
- Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Aalto, Finland
| | - Georgios A Keliris
- Department of Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Tübingen, Germany; Bernstein Centre for Computational Neuroscience, Tübingen, Germany; Department of Biomedical Sciences, University of Antwerp, Wilrijk, Belgium
| | - Josef P Rauschecker
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, USA; Brain and Mind Laboratory, Department of Neuroscience and Biomedical Engineering, Aalto University School of Science, Aalto, Finland; Institute for Advanced Study and Department of Neurology, Klinikum Rechts der Isar, Technische Universität München, München, Germany
| |
|
38
|
Abstract
During communication we combine auditory and visual information. Neurophysiological research in nonhuman primates has shown that single neurons in ventrolateral prefrontal cortex (VLPFC) exhibit multisensory responses to faces and vocalizations presented simultaneously. However, whether VLPFC is also involved in maintaining those communication stimuli in working memory or combining stored information across different modalities is unknown, although its human homolog, the inferior frontal gyrus, is known to be important in integrating verbal information from auditory and visual working memory. To address this question, we recorded from VLPFC while rhesus macaques (Macaca mulatta) performed an audiovisual working memory task. Unlike traditional match-to-sample/nonmatch-to-sample paradigms, which use unimodal memoranda, our nonmatch-to-sample task used dynamic movies consisting of both facial gestures and the accompanying vocalizations. In the nonmatch conditions, subjects detected a change in the auditory component (vocalization), the visual component (face), or both components. Our results show that VLPFC neurons are activated by stimulus and task factors: while some neurons simply responded to a particular face or a vocalization regardless of the task period, others exhibited activity patterns typically related to working memory, such as sustained delay activity and match enhancement/suppression. In addition, we found neurons that detected the component change during the nonmatch period. Interestingly, some of these neurons were sensitive to the change of both components and therefore combined information from auditory and visual working memory. These results suggest that VLPFC is involved not only in the perceptual processing of faces and vocalizations but also in their mnemonic processing.
|