1. Takahashi M, Veale R. Pathways for Naturalistic Looking Behavior in Primate I: Behavioral Characteristics and Brainstem Circuits. Neuroscience 2023; 532:133-163. PMID: 37776945. DOI: 10.1016/j.neuroscience.2023.09.009.
Abstract
Organisms control their visual worlds by moving their eyes, heads, and bodies. This control of "gaze" or "looking" is key to survival and intelligence, but our investigation of the underlying neural mechanisms in natural conditions is hindered by technical limitations. Recent advances have enabled measurement of both brain and behavior in freely moving animals in complex environments, expanding on historical head-fixed laboratory investigations. We juxtapose looking behavior as traditionally measured in the laboratory against looking behavior in naturalistic conditions, finding that behavior changes when animals are free to move or when stimuli have depth or sound. We specifically focus on the brainstem circuits driving gaze shifts and gaze stabilization. The overarching goal of this review is to reconcile the historical understanding of differential neural circuits for different "classes" of gaze shift with two inconvenient truths: (1) "classes" of gaze behavior are artificial, and (2) the neural circuits historically identified to control each "class" of behavior do not operate in isolation during natural behavior. Instead, multiple pathways combine adaptively and non-linearly depending on individual experience. While the neural circuits for reflexive and voluntary gaze behaviors traverse somewhat independent brainstem and spinal cord circuits, both can be modulated by feedback, meaning that most gaze behaviors are learned rather than hardcoded. Despite this flexibility, there are broadly enumerable neural pathways commonly adopted among primate gaze systems. Parallel pathways carrying simultaneous evolutionary and homeostatic drives converge in the superior colliculus, a layered midbrain structure which integrates and relays these volitional signals to brainstem gaze-control circuits.
Affiliations
- Mayu Takahashi, Department of Systems Neurophysiology, Graduate School of Medical and Dental Sciences, Tokyo Medical and Dental University, Japan
- Richard Veale, Department of Neurobiology, Graduate School of Medicine, Kyoto University, Japan
2. Vittek AL, Juan C, Nowak LG, Girard P, Cappe C. Multisensory integration in neurons of the medial pulvinar of macaque monkey. Cereb Cortex 2022; 33:4202-4215. PMID: 36068947. PMCID: PMC10110443. DOI: 10.1093/cercor/bhac337.
Abstract
The pulvinar is a heterogeneous thalamic nucleus, which is well developed in primates. One of its subdivisions, the medial pulvinar, is connected to many cortical areas, including the visual, auditory, and somatosensory cortices, as well as with multisensory and premotor areas. However, except for the visual modality, little is known about its sensory functions. A hypothesis is that, as a region of convergence of information from different sensory modalities, the medial pulvinar plays a role in multisensory integration. To test this hypothesis, two macaque monkeys were trained on a fixation task, and the responses of single units to visual, auditory, and audiovisual stimuli were examined. Analysis revealed auditory, visual, and multisensory neurons in the medial pulvinar. It also revealed multisensory integration in this structure, mainly suppressive (the audiovisual response is less than the strongest unisensory response) and subadditive (the audiovisual response is less than the sum of the auditory and visual responses). These findings suggest that the medial pulvinar is involved in multisensory integration.
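The two criteria defined in parentheses above translate directly into code. A minimal sketch (hypothetical helper, not the study's analysis code), with responses as mean evoked firing rates:

```python
def classify_integration(aud, vis, av):
    """Label an audiovisual response against the two criteria above.

    aud, vis, av: mean evoked firing rates (spikes/s) for auditory,
    visual, and audiovisual stimulation. Names are illustrative."""
    strongest = max(aud, vis)
    # Suppressive: the audiovisual response falls below the strongest
    # unisensory response; otherwise call it enhanced.
    enhancement = "suppressive" if av < strongest else "enhanced"
    # Subadditive: the audiovisual response falls below the sum of the
    # two unisensory responses.
    additivity = "subadditive" if av < aud + vis else "additive or superadditive"
    return enhancement, additivity
```

For example, an auditory response of 12 spikes/s, a visual response of 20 spikes/s, and an audiovisual response of 16 spikes/s yields ("suppressive", "subadditive"), the pattern the study reports as most common.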
Affiliations
- Anne-Laure Vittek, Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
- Cécile Juan, Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
- Lionel G Nowak, Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
- Pascal Girard, Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France; INSERM, CHU Purpan, BP 3028, 31024 Toulouse Cedex 3, France
- Céline Cappe, Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Université de Toulouse, UPS, Toulouse, France
3. Merrikhi Y, Kok MA, Carrasco A, Meredith MA, Lomber SG. Multisensory responses in a belt region of the dorsal auditory cortical pathway. Eur J Neurosci 2021; 55:589-610. PMID: 34927294. DOI: 10.1111/ejn.15573.
Abstract
A basic function of the cerebral cortex is to receive and integrate information from different sensory modalities into a comprehensive percept of the environment. Neurons that demonstrate multisensory convergence occur across the neocortex, but are especially prevalent in higher-order, association areas. However, a recent study of a cat higher-order auditory area, the dorsal zone (DZ) of auditory cortex, did not observe any multisensory features. Therefore, the goal of the present investigation was to address this conflict using recording and testing methodologies that are established for exposing and studying multisensory neuronal processing. Among the 482 neurons studied, we found that 76.6% were influenced by non-auditory stimuli. Of these neurons, 99% were affected by visual stimulation, but only 11% by somatosensory stimulation. Furthermore, a large proportion of the multisensory neurons showed integrated responses to multisensory stimulation, constituted a majority of both the excitatory and inhibitory neurons encountered (as identified by the duration of their waveshape), and exhibited a distinct spatial distribution within DZ. These findings demonstrate that the dorsal zone of auditory cortex robustly exhibits multisensory properties and that the proportions of multisensory neurons encountered are consistent with those identified in other higher-order cortices.
Affiliations
- Yaser Merrikhi, Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
- Melanie A Kok, Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
- Andres Carrasco, Graduate Program in Neuroscience, University of Western Ontario, London, Ontario, Canada
- M Alex Meredith, Department of Anatomy and Neurobiology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia, USA
- Stephen G Lomber, Department of Physiology, Faculty of Medicine, McGill University, Montreal, Quebec, Canada
4. Smith ES, Crawford TJ. Memory-Guided Saccades in Psychosis: Effects of Medication and Stimulus Location. Brain Sci 2021; 11:1071. PMID: 34439693. PMCID: PMC8393375. DOI: 10.3390/brainsci11081071.
Abstract
The memory-guided saccade task requires remembering a peripheral target location whilst inhibiting the urge to make a saccade ahead of an auditory cue. The literature has explored the endophenotypic deficits associated with differences in target laterality, but less is known about target amplitude. The data presented came from Crawford et al. (1995), employing a memory-guided saccade task among neuroleptically medicated and non-medicated patients with schizophrenia (n = 31, n = 12), neuroleptically medicated and non-medicated patients with bipolar affective disorder (n = 12, n = 17), and neurotypical controls (n = 30). The current analyses explore the relationships between memory-guided saccades toward targets at different eccentricities (7.5° and 15°), the behaviour exhibited by the diagnostic groups, and cohorts distinguished on the basis of psychotic symptomatology. Saccade gain control and final eye position were reduced among medicated schizophrenia patients. These metrics were reduced further for targets at the greater amplitude (15°), indicating a greater deficit. The medicated cohort exhibited reduced gain control and final eye positions at both amplitudes compared to the non-medicated cohort, with deficits most marked for the furthest targets. No group differences in symptomatology (positive and negative) were found; however, a greater deficit was observed toward the larger amplitude. This suggests that, within the memory-guided saccade paradigm, diagnostic classification is more prominent in characterising disparities in saccade performance than symptomatology.
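Saccade gain, one of the two metrics above, is conventionally the ratio of primary saccade amplitude to target eccentricity, so hypometria appears as a gain below 1. A minimal sketch of that convention (illustrative, not the study's code):

```python
def saccade_gain(saccade_amplitude_deg, target_eccentricity_deg):
    """Primary saccade gain: amplitude of the saccade divided by the
    eccentricity of the remembered target. Gain 1.0 = landed on target;
    gain < 1.0 = hypometric undershoot."""
    return saccade_amplitude_deg / target_eccentricity_deg
```

A 12° saccade to the 15° target gives a gain of 0.8, an undershoot of the kind reported for the medicated cohort at the larger amplitude.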
Affiliations
- Eleanor S. Smith, Department of Psychology, University of Cambridge, Cambridge CB2 3EB, UK
- Trevor J. Crawford, Department of Psychology, Centre for Ageing Research, Lancaster University, Lancaster LA1 4YF, UK
5. Chaplin TA, Allitt BJ, Hagan MA, Rosa MGP, Rajan R, Lui LL. Auditory motion does not modulate spiking activity in the middle temporal and medial superior temporal visual areas. Eur J Neurosci 2018; 48:2013-2029. PMID: 30019438. DOI: 10.1111/ejn.14071.
Abstract
The integration of multiple sensory modalities is a key aspect of brain function, allowing animals to take advantage of concurrent sources of information to make more accurate perceptual judgments. For many years, multisensory integration in the cerebral cortex was deemed to occur only in high-level "polysensory" association areas. However, more recent studies have suggested that cross-modal stimulation can also influence neural activity in areas traditionally considered to be unimodal. In particular, several human neuroimaging studies have reported that extrastriate areas involved in visual motion perception are also activated by auditory motion, and may integrate audiovisual motion cues. However, the exact nature and extent of the effects of auditory motion on the visual cortex have not been studied at the single neuron level. We recorded the spiking activity of neurons in the middle temporal (MT) and medial superior temporal (MST) areas of anesthetized marmoset monkeys upon presentation of unimodal stimuli (moving auditory or visual patterns), as well as bimodal stimuli (concurrent audiovisual motion). Despite robust, direction selective responses to visual motion, none of the sampled neurons responded to auditory motion stimuli. Moreover, concurrent moving auditory stimuli had no significant effect on the ability of single MT and MST neurons, or populations of simultaneously recorded neurons, to discriminate the direction of motion of visual stimuli (moving random dot patterns with varying levels of motion noise). Our findings do not support the hypothesis that direct interactions between MT, MST, and early areas of the auditory hierarchy underlie audiovisual motion integration.
Affiliations
- Tristan A Chaplin, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Benjamin J Allitt, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia
- Maureen A Hagan, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Marcello G P Rosa, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Ramesh Rajan, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
- Leo L Lui, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, Victoria, Australia; ARC Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, Victoria, Australia
6. Multisensory integration in orienting behavior: Pupil size, microsaccades, and saccades. Biol Psychol 2017; 129:36-44. DOI: 10.1016/j.biopsycho.2017.07.024.
7. Khan AZ, Munoz DP, Takahashi N, Blohm G, McPeek RM. Effects of a pretarget distractor on saccade reaction times across space and time in monkeys and humans. J Vis 2017; 16:5. PMID: 27148697. PMCID: PMC5833323. DOI: 10.1167/16.7.5.
Abstract
Previous studies have shown that the influence of a behaviorally irrelevant distractor on saccade reaction times (SRTs) varies depending on the temporal and spatial relationship between the distractor and the saccade target. We measured distractor influence on SRTs to a subsequently presented target, varying the spatial location and the timing between the distractor and the target. The distractor appeared at one of four equally eccentric locations, followed by a target (either 50 ms or 200 ms after) at one of 136 different locations encompassing an area of 20° square. We extensively tested two humans and two monkeys on this task to determine interspecies similarities and differences, since monkey neurophysiology is often used to interpret human behavioral findings. Results were similar across species; for the short interval (50 ms), SRTs were shortest to a target presented close to or at the distractor location and increased primarily as a function of the distance from the distractor. There was also an effect of distractor-target direction and visual field. For the long interval (200 ms), the results were inverted; SRTs were longest for short distances between the distractor and target and decreased as a function of distance from the distractor. Both SRT patterns were well captured by a two-dimensional dynamic field model with short-distance excitation and long-distance inhibition, based upon known functional connectivity found in the superior colliculus that includes widespread excitation and inhibition. Based on these findings, we posit that the different time-dependent patterns of distractor-related SRTs can emerge from the same underlying neuronal mechanisms common to both species.
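The "short-distance excitation and long-distance inhibition" of such dynamic field models is commonly implemented as a difference-of-Gaussians interaction kernel. A generic sketch (parameter values are illustrative, not the fitted values from this study):

```python
import numpy as np

def interaction_kernel(dist_deg, a_exc=1.0, s_exc=2.0, a_inh=0.4, s_inh=8.0):
    """Difference-of-Gaussians ("Mexican hat") kernel over the distance
    between two field sites: a narrow excitatory Gaussian minus a broad
    inhibitory one, giving net excitation at short distances and net
    inhibition at long distances."""
    exc = a_exc * np.exp(-dist_deg**2 / (2.0 * s_exc**2))
    inh = a_inh * np.exp(-dist_deg**2 / (2.0 * s_inh**2))
    return exc - inh
```

The kernel is positive near zero separation and crosses into inhibition at larger separations, which is what lets a nearby distractor facilitate and a remote one suppress the same target representation.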
8.
Abstract
Goal-directed behavior can be characterized as a dynamic link between a sensory stimulus and a motor act. Neural correlates of many of the intermediate events of goal-directed behavior are found in the posterior parietal cortex. Although the parietal cortex’s role in guiding visual behaviors has received considerable attention, relatively little is known about its role in mediating auditory behaviors. Here, the authors review recent studies that have focused on how neurons in the lateral intraparietal area (area LIP) differentially process auditory and visual stimuli. These studies suggest that area LIP contains a modality-dependent representation that is highly dependent on behavioral context.
Affiliations
- Yale E Cohen, Department of Psychological and Brain Sciences, Center for Cognitive Neuroscience, Dartmouth College, Hanover, NH
9. Sliwa J, Planté A, Duhamel JR, Wirth S. Independent Neuronal Representation of Facial and Vocal Identity in the Monkey Hippocampus and Inferotemporal Cortex. Cereb Cortex 2014; 26:950-966. DOI: 10.1093/cercor/bhu257.
10. Identifying and quantifying multisensory integration: a tutorial review. Brain Topogr 2014; 27:707-730. PMID: 24722880. DOI: 10.1007/s10548-014-0365-7.
Abstract
We process information from the world through multiple senses, and the brain must decide what information belongs together and what information should be segregated. One challenge in studying such multisensory integration is how to quantify the multisensory interactions, a challenge that is amplified by the host of methods that are now used to measure neural, behavioral, and perceptual responses. Many of the measures that have been developed to quantify multisensory integration (and which have been derived from single-unit analyses) have been applied to these different measures without much consideration for the nature of the process being studied. Here, we provide a review focused on the means with which experimenters quantify multisensory processes and integration across a range of commonly used experimental methodologies. We emphasize the most commonly employed measures, including single- and multiunit responses, local field potentials, functional magnetic resonance imaging, and electroencephalography, along with behavioral measures of detection, accuracy, and response times. In each section, we discuss the different metrics commonly used to quantify multisensory interactions, including the rationale for their use, their advantages, and the drawbacks and caveats associated with them. Also discussed are possible alternatives to the most commonly used metrics.
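Two of the single-unit metrics this kind of review covers are the classic enhancement index (percent change of the multisensory response relative to the best unisensory response) and an additivity comparison against the summed unisensory responses. A generic sketch of both (standard definitions, not code from the review):

```python
def enhancement_index(best_unisensory, multisensory):
    """Percent change of the multisensory response relative to the
    largest unisensory response; positive values indicate enhancement,
    negative values indicate suppression."""
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

def additivity_ratio(resp_a, resp_b, resp_multi):
    """Multisensory response as a percentage of the summed unisensory
    responses; values below 100 indicate subadditive integration."""
    return 100.0 * resp_multi / (resp_a + resp_b)
```

For example, a multisensory response of 15 spikes/s against a best unisensory response of 10 spikes/s gives 50% enhancement, yet the same response can still be subadditive relative to the unisensory sum.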
11. Dynamic faces speed up the onset of auditory cortical spiking responses during vocal detection. Proc Natl Acad Sci U S A 2013; 110:E4668-E4677. PMID: 24218574. DOI: 10.1073/pnas.1312518110.
Abstract
How low-level sensory areas help mediate the detection and discrimination advantages of integrating faces and voices is the subject of intense debate. To gain insights, we investigated the role of the auditory cortex in face/voice integration in macaque monkeys performing a vocal-detection task. Behaviorally, subjects were slower to detect vocalizations as the signal-to-noise ratio decreased, but seeing mouth movements associated with vocalizations sped up detection. Paralleling this behavioral relationship, as the signal-to-noise ratio decreased, the onset of spiking responses was delayed and response magnitudes decreased. However, when mouth motion accompanied the vocalization, these responses were uniformly faster. Conversely, and at odds with previous assumptions regarding the neural basis of face/voice integration, changes in the magnitude of neural responses were not related consistently to audiovisual behavior. Taken together, our data reveal that facilitation of spike latency is a means by which the auditory cortex partially mediates the reaction time benefits of combining faces and voices.
12. Buchholz VN, Goonetilleke SC, Medendorp WP, Corneil BD. Greater benefits of multisensory integration during complex sensorimotor transformations. J Neurophysiol 2012; 107:3135-3143. PMID: 22457453. DOI: 10.1152/jn.01188.2011.
Abstract
Multisensory integration enables rapid and accurate behavior. To orient in space, sensory information registered initially in different reference frames has to be integrated with the current postural information to produce an appropriate motor response. In some postures, multisensory integration requires convergence of sensory evidence across hemispheres, which would presumably lessen or hinder integration. Here, we examined orienting gaze shifts in humans to visual, tactile, or visuotactile stimuli when the hands were either in a default uncrossed posture or a crossed posture requiring convergence across hemispheres. Surprisingly, we observed the greatest benefits of multisensory integration in the crossed posture, as indexed by reaction time (RT) decreases. Moreover, such shortening of RTs to multisensory stimuli did not come at the cost of increased error propensity. To explain these results, we propose that two accepted principles of multisensory integration, the spatial principle and inverse effectiveness, dynamically interact to aid the rapid and accurate resolution of complex sensorimotor transformations. First, early mutual inhibition of initial visual and tactile responses registered in different hemispheres reduces error propensity. Second, inverse effectiveness in the integration of the weakened visual response with the remapped tactile representation expedites the generation of the correct motor response. Our results imply that the concept of inverse effectiveness, which is usually associated with external stimulus properties, might extend to internal spatial representations that are more complex given certain body postures.
Affiliations
- Verena N Buchholz, Radboud University Nijmegen, Donders Institute for Brain, Cognition, and Behavior, Nijmegen, The Netherlands
13. Kajikawa Y, Falchier A, Musacchia G, Lakatos P, Schroeder C. Audiovisual Integration in Nonhuman Primates. Front Neurosci 2011. DOI: 10.1201/9781439812174-8.
15. Current perspectives and methods in studying neural mechanisms of multisensory interactions. Neurosci Biobehav Rev 2011; 36:111-133. PMID: 21569794. DOI: 10.1016/j.neubiorev.2011.04.015.
Abstract
In the past decade neuroscience has witnessed major advances in the field of multisensory interactions. A large body of research has revealed several new types of cross-sensory interactions. In addition, multisensory interactions have been reported at temporal and spatial system levels previously thought of as strictly unimodal. We review the findings that have led to the current broad consensus that most, if not all, higher- as well as lower-level neural processes are in some form multisensory. We continue by outlining the progress that has been made in identifying the functional significance of different types of interactions, for example, in subserving stimulus binding and enhancement of perceptual certainty. Finally, we provide a critical introduction to cutting-edge methods, from Bayes-optimal integration to multivoxel pattern analysis, as applied to multisensory research at different system levels.
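For two cues with independent Gaussian noise, the Bayes-optimal integration mentioned above reduces to reliability-weighted averaging, with the fused estimate at least as reliable as the better cue alone. A textbook-form sketch (not code from this review):

```python
def optimal_combination(mu1, var1, mu2, var2):
    """Reliability-weighted (maximum-likelihood) fusion of two cues with
    independent Gaussian noise. Each cue is weighted by its inverse
    variance; the combined variance is lower than either input variance."""
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    w2 = 1.0 - w1
    mu = w1 * mu1 + w2 * mu2
    var = (var1 * var2) / (var1 + var2)
    return mu, var
```

Combining cues at 0 and 4 with equal variance 1 yields the midpoint estimate 2 with the variance halved to 0.5.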
16. Satel J, Wang Z, Trappenberg T, Klein R. Modeling inhibition of return as short-term depression of early sensory input to the superior colliculus. Vision Res 2011; 51:987-996. DOI: 10.1016/j.visres.2011.02.013.
17. Katyal S, Zughni S, Greene C, Ress D. Topography of covert visual attention in human superior colliculus. J Neurophysiol 2010; 104:3074-3083. PMID: 20861435. DOI: 10.1152/jn.00283.2010.
Abstract
Experiments were performed to examine the topography of covert visual attention signals in human superior colliculus (SC), both across its surface and in its depth. We measured the retinotopic organization of SC to direct visual stimulation using a 90° wedge of moving dots that slowly rotated around fixation. Subjects (n = 5) were cued to perform a difficult speed-discrimination task in the rotating region. To measure the retinotopy of covert attention, we used a full-field array of similarly moving dots. Subjects were cued to perform the same speed-discrimination task within a 90° wedge-shaped region, and only the cue rotated around fixation. High-resolution functional magnetic resonance imaging (fMRI, 1.2 mm voxels) data were acquired throughout SC. These data were then aligned to a high-resolution T1-weighted reference volume. The SC was segmented in this volume to permit computational modeling of its surface and calculation of a depth map for laminar analysis. Retinotopic maps were obtained for both direct visual stimulation and covert attention. These maps showed a similar spatial distribution to visual stimulation maps observed in rhesus macaque and were in registration with each other. Within the depth of SC, both visual attention and stimulation produced activity primarily in the superficial and intermediate layers, but stimulation activity extended significantly more deeply than attention.
Affiliations
- Sucharit Katyal, University of Texas at Austin, Center for Perceptual Systems, Section for Neurobiology, and Department of Psychology, Austin, Texas 78712, USA
18. Wakita M, Shibasaki M, Ishizuka T, Schnackenberg J, Fujiawara M, Masataka N. Measurement of neuronal activity in a macaque monkey in response to animate images using near-infrared spectroscopy. Front Behav Neurosci 2010; 4:31. PMID: 20676236. PMCID: PMC2912168. DOI: 10.3389/fnbeh.2010.00031.
Abstract
Near-infrared spectroscopy (NIRS) has been used extensively for functional neuroimaging over the past decade, in part because it is considered a powerful tool for investigating brain function in human infants and young children, for whom other neuroimaging techniques are not suitable. In particular, several studies have measured hemodynamic responses in the occipital region in infants upon exposure to visual stimuli. In the present study, we used multi-channel NIRS to measure neuronal activity in a macaque monkey that was trained to watch videos showing various circus animals performing acrobatic activities, without fixing the monkey's head position. Cortical activity from the occipital region was measured first by placing a probe comprising a 3 × 5 array of emitters and detectors (2 × 4 cm) over area 17, and the robustness and stability of the results were confirmed across sessions. Cortical responses were then measured from the dorsofrontal region. The oxygenated hemoglobin signals increased in area 9 and decreased in area 8b in response to viewing the videos. The results suggest that these regions are involved in cognitive processing of visually presented stimuli. The monkey showed positive responsiveness to the stimuli from the affective standpoint, but its attentional response to them was an inhibitory one.
Affiliations
- Masumi Wakita, Primate Research Institute, Kyoto University, Inuyama, Japan
19. Rhesus monkeys' valuation of vocalizations during a free-choice task. PLoS One 2009; 4:e7834. PMID: 19924223. PMCID: PMC2771902. DOI: 10.1371/journal.pone.0007834.
Abstract
Adaptive behavior requires that animals integrate current and past information into their decision-making. One important type of information is auditory-communication signals (i.e., species-specific vocalizations). Here, we tested how rhesus monkeys incorporate the opportunity to listen to different species-specific vocalizations into their decision-making processes. In particular, we tested how monkeys value these vocalizations relative to the opportunity to receive a juice reward. To do so, monkeys chose one of two targets to get a varying juice reward; at one of those targets, in addition to the juice reward, a vocalization was presented. By titrating the juice amounts at the two targets, we quantified the monkeys' juice choices relative to the opportunity to listen to a vocalization. We found that rhesus monkeys were not willing to give up a large juice reward to listen to vocalizations, indicating that, relative to a juice reward, listening to vocalizations has a low value.
20. Chandrasekaran C, Ghazanfar AA. Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus. J Neurophysiol 2008; 101:773-788. PMID: 19036867. DOI: 10.1152/jn.90843.2008.
Abstract
The integration of auditory and visual information is required for the default mode of speech: face-to-face communication. As revealed by functional magnetic resonance imaging and electrophysiological studies, the regions in and around the superior temporal sulcus (STS) are implicated in this process. To provide greater insights into the network-level dynamics of the STS during audiovisual integration, we used a macaque model system to analyze the different frequency bands of local field potential (LFP) responses to the auditory and visual components of vocalizations. These vocalizations (like human speech) have a natural time delay between the onset of visible mouth movements and the onset of the voice (the "time-to-voice" or TTV). We show that the LFP responses to faces and voices elicit distinct bands of activity in the theta (4-8 Hz), alpha (8-14 Hz), and gamma (>40 Hz) frequency ranges. Along with single neuron responses, the gamma band activity was greater for face stimuli than voice stimuli. Surprisingly, the opposite was true for the low-frequency bands, where auditory responses were of greater magnitude. Furthermore, gamma band responses in STS were sustained for dynamic faces but not for voices (the opposite was true for auditory cortex). These data suggest that visual and auditory stimuli are processed in fundamentally different ways in the STS. Finally, we show that the three bands integrate faces and voices differently: theta band activity showed weak multisensory behavior regardless of TTV, alpha band activity was enhanced for calls with short TTVs but showed little integration for longer TTVs, and gamma band activity was consistently enhanced for all TTVs. These data demonstrate that LFP activity from the STS can be segregated into distinct frequency bands which integrate audiovisual communication signals in an independent manner. These different bands may reflect different spatial scales of network processing during face-to-face communication.
Affiliation(s)
- Chandramouli Chandrasekaran
- Neuroscience Institute and Department of Psychology, Princeton University, Green Hall, Princeton, NJ 08540, USA
21. Morgan ML, DeAngelis GC, Angelaki DE. Multisensory integration in macaque visual cortex depends on cue reliability. Neuron 2008;59:662-73. PMID: 18760701. DOI: 10.1016/j.neuron.2008.06.024.
Abstract
Responses of multisensory neurons to combinations of sensory cues are generally enhanced or depressed relative to single cues presented alone, but the rules that govern these interactions have remained unclear. We examined integration of visual and vestibular self-motion cues in macaque area MSTd in response to unimodal as well as congruent and conflicting bimodal stimuli in order to evaluate hypothetical combination rules employed by multisensory neurons. Bimodal responses were well fit by weighted linear sums of unimodal responses, with weights typically less than one (subadditive). Surprisingly, our results indicate that weights change with the relative reliabilities of the two cues: visual weights decrease and vestibular weights increase when visual stimuli are degraded. Moreover, both modulation depth and neuronal discrimination thresholds improve for matched bimodal compared to unimodal stimuli, which might allow for increased neural sensitivity during multisensory stimulation. These findings establish important new constraints for neural models of cue integration.
Affiliation(s)
- Michael L Morgan
- Department of Anatomy and Neurobiology, Washington University School of Medicine, St. Louis, MO 63110, USA
22. Wang Y, Celebrini S, Trotter Y, Barone P. Visuo-auditory interactions in the primary visual cortex of the behaving monkey: electrophysiological evidence. BMC Neurosci 2008;9:79. PMID: 18699988. PMCID: PMC2527609. DOI: 10.1186/1471-2202-9-79.
Abstract
Background: Visual, tactile and auditory information is processed from the periphery to the cortical level through separate channels that target primary sensory cortices, from which it is further distributed to functionally specialized areas. Multisensory integration is classically assigned to higher hierarchical cortical areas, but there is growing electrophysiological evidence in man and monkey of multimodal interactions in areas thought to be unimodal, interactions that can occur at very short latencies. Such fast timing of multisensory interactions rules out an origin in the polymodal areas mediated through back projections, and instead favors heteromodal connections such as the direct projections observed in the monkey from auditory areas (including the primary auditory cortex AI) to the primary visual cortex V1. Based on the existence of such AI-to-V1 projections, we looked for modulation of neuronal visual responses in V1 by an auditory stimulus in the awake behaving monkey.
Results: Behavioral and electrophysiological data were obtained from two behaving monkeys. One monkey was trained to maintain passive central fixation while a peripheral visual (V) or visuo-auditory (AV) stimulus was presented. In a population of 45 V1 neurons, there was no difference in the mean latencies or strength of visual responses between the V and AV conditions. In a second, active task, the monkey was required to orient his gaze toward the visual or visuo-auditory stimulus. In a population of 49 cells recorded during this saccadic task, we observed a significant reduction in response latencies in the visuo-auditory condition compared with the visual condition (mean 61.0 vs. 64.5 ms), but only when the visual stimulus was at mid-level contrast. No effect was observed at high contrast.
Conclusion: Our data show that single neurons from a primary sensory cortex such as V1 can integrate sensory information of a different modality, a result that argues against a strict hierarchical model of multisensory integration. Multisensory interaction in V1 is, in our experiment, expressed by a significant reduction in visual response latencies, specifically in suboptimal conditions and depending on task demand. This suggests that neuronal mechanisms of multisensory integration are specific and adapted to the perceptual features of behavior.
Affiliation(s)
- Ye Wang
- Centre de Recherche Cerveau & Cognition, UMR CNRS 5549, Faculté de Médecine de Rangueil, 31062 Toulouse Cedex 9, France
23. Allman BL, Bittencourt-Navarrete RE, Keniston LP, Medina AE, Wang MY, Meredith MA. Do cross-modal projections always result in multisensory integration? Cereb Cortex 2008;18:2066-76. PMID: 18203695. DOI: 10.1093/cercor/bhm230.
Abstract
Convergence of afferents from different sensory modalities has generally been thought to produce bimodal (and trimodal) neurons (i.e., neurons exhibiting suprathreshold excitation to more than one sensory modality). Consequently, studies identifying cross-modal connections assume that such convergence results in bimodal (or trimodal) neurons that produce familiar forms of multisensory integration: response enhancement or depression. The present study questioned that assumption by anatomically identifying a projection from ferret auditory cortex to visual cortex Area 21. However, electrophysiological recording within Area 21 not only failed to identify a single bimodal neuron but also failed to reveal the familiar forms of multisensory integration. Instead, a small proportion of neurons (9%; 27/296) showed subthreshold multisensory integration, in which visual responses were significantly modulated by auditory inputs. Such subthreshold multisensory effects were enhanced by gamma-aminobutyric acid antagonism, whereby a majority of neurons (87%; 20/23) then participated in a significant multisensory population effect. Thus, multisensory convergence does not de facto result in bimodal (or trimodal) neurons or the traditional forms of multisensory integration. However, the fact that unimodal neurons exhibited a subthreshold form of multisensory integration not only affirms the relationship between convergence and integration but also expands our understanding of the functional repertoire of multisensory processing itself.
Affiliation(s)
- Brian L Allman
- Department of Anatomy and Neurobiology, Virginia Commonwealth University, School of Medicine, Richmond, VA 23298, USA
24. Kayser C, Petkov CI, Logothetis NK. Visual modulation of neurons in auditory cortex. Cereb Cortex 2008;18:1560-74. PMID: 18180245. DOI: 10.1093/cercor/bhm187.
Abstract
Our brain integrates the information provided by the different sensory modalities into a coherent percept, and recent studies suggest that this process is not restricted to higher association areas. Here we evaluate the hypothesis that auditory cortical fields are involved in cross-modal processing by probing individual neurons for audiovisual interactions. We find that visual stimuli modulate auditory processing both at the level of field potentials and single-unit activity and already in primary and secondary auditory fields. These interactions strongly depend on a stimulus' efficacy in driving the neurons but occur independently of stimulus category and for naturalistic as well as artificial stimuli. In addition, interactions are sensitive to the relative timing of audiovisual stimuli and are strongest when visual stimuli lead by 20-80 msec. Exploring the underlying mechanisms, we find that enhancement correlates with the resetting of slow (approximately 10 Hz) oscillations to a phase angle of optimal excitability. These results demonstrate that visual stimuli can modulate the firing of neurons in auditory cortex in a manner that depends on stimulus efficacy and timing. These neurons thus meet the criteria for sensory integration and provide the auditory modality with multisensory contextual information about co-occurring environmental events.
Affiliation(s)
- Christoph Kayser
- Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tübingen, Germany
25. Lucchetti C, Lanzilotto M, Bon L. Auditory-motor and cognitive aspects in area 8B of macaque monkey's frontal cortex: a premotor ear-eye field (PEEF). Exp Brain Res 2007;186:131-41. PMID: 18038127. DOI: 10.1007/s00221-007-1216-5.
Abstract
In previous reports, we showed the involvement of area 8B neurons in both spontaneous ear and eye movement and in auditory information processing. Audition-related cells responded to complex environmental stimuli, but not to pure tones, and their activity changed during visual fixation, possibly reflecting inhibition due to the engagement of attention. We observed auditory, auditory-motor and motor cells for both eye and ear movements. This finding suggests that area 8B may be involved in the integration of auditory input with ear and eye motor output. In this paper, we extended these previous studies by examining area 8B activity in relation to auditory orienting behaviour, as well as the ocular orientation (i.e., visual fixation) studied previously. Visual fixation led to inhibition of activity in auditory and auditory-motor cells, which suggests that attention may be involved both in maintaining eye position and in reducing the response of these cell types. Accordingly, during a given task or natural behaviour, spatial attention seems to affect more than one sensorimotor channel simultaneously. These data add to our understanding of how the neural network, through a two-channel attentive process, accomplishes switching between two effectors, namely the eyes and ears. Considering the functional, anatomical and cytoarchitectonic differences among the frontal eye field (FEF), the supplementary eye field (SEF) and area 8B, we propose to consider area 8B a separate premotor ear-eye field (PEEF).
Affiliation(s)
- C Lucchetti
- Department of Biomedical Sciences Section of Physiology and Animal Facilities Centre Section of Policlinic, University of Modena and Reggio Emilia, Via Campi 287, 41100, Modena, Italy
26. Operant reflex-related neuronal activity in the tectum of the superior colliculus and mesencephalic reticular formation of the cat. Neurophysiology 2007. DOI: 10.1007/s11062-007-0028-3.
27. Avillac M, Ben Hamed S, Duhamel JR. Multisensory integration in the ventral intraparietal area of the macaque monkey. J Neurosci 2007;27:1922-32. PMID: 17314288. PMCID: PMC6673547. DOI: 10.1523/jneurosci.2646-06.2007.
Abstract
The goal of this study was to characterize multisensory interaction patterns in cortical ventral intraparietal area (VIP). We recorded single-unit activity in two alert monkeys during the presentation of visual (drifting gratings) and tactile (low-pressure air puffs) stimuli. One stimulus was always positioned inside the receptive field of the neuron. The other stimulus was defined so as to manipulate the spatial and temporal disparity between the two stimuli. More than 70% of VIP cells showed a significant modulation of their response by bimodal stimulations. These cells included both bimodal cells, i.e., cells responsive to both tested modalities, and seemingly unimodal cells, i.e., cells responding to only one of the two tested modalities. This latter observation suggests that postsynaptic latent mechanisms are involved in multisensory integration. In both cell categories, neuronal responses are either enhanced or depressed and reflect nonlinear sub-, super-, or additive mechanisms. The occurrence of these observations is maximum when stimuli are in temporal synchrony and spatially congruent. Interestingly, introducing spatial or temporal disparities between stimuli does not affect the sign or the magnitude of interactions but rather their occurrence. Multisensory stimulation also affects the neuronal response latencies of bimodal stimuli. For a given neuron, these are on average intermediate between the two unimodal response latencies, again suggesting latent postsynaptic mechanisms. In summary, we show that the majority of VIP neurons perform multisensory integration, following general rules (e.g., spatial congruency and temporal synchrony) that are closely similar to those described in other cortical and subcortical regions.
Affiliation(s)
- Marie Avillac
- Institut des Sciences Cognitives, Centre National de la Recherche Scientifique, Université de Lyon 1, F-69675 Bron, France
28. Kuraoka K, Nakamura K. Responses of single neurons in monkey amygdala to facial and vocal emotions. J Neurophysiol 2006;97:1379-87. PMID: 17182913. DOI: 10.1152/jn.00464.2006.
Abstract
The face and voice can independently convey the same information about emotion. When we see an angry face or hear an angry voice, we can perceive a person's anger; in this sense, the two sensory cues are interchangeable. However, it is still unclear whether the same group of neurons processes signals for facial and vocal emotions. We recorded neuronal activity in the amygdala of monkeys while they watched nine video clips of species-specific emotional expressions: three monkeys each showing three emotional expressions (aggressive threat, scream, and coo). Of the 227 amygdala neurons tested, 116 (51%) responded to at least one of the emotional expressions. These "monkey-responsive" neurons (i.e., neurons that responded to monkey-specific emotional expression) preferred the scream over the other emotional expressions, irrespective of identity. To determine the element crucial to neuronal responses, the activity of 79 monkey-responsive neurons was recorded while a facial or vocal element of a stimulus was presented alone. Although most neurons (61/79, 77%) responded strongly to the visual but not the auditory element, about one fifth (16/79, 20%) maintained a good response when either the facial or the vocal element was presented. Moreover, these neurons maintained their stimulus-preference profiles under both facial and vocal conditions. These neurons were found in the central nucleus of the amygdala, the nucleus that receives inputs from other amygdala nuclei and in turn sends outputs to other emotion-related brain areas. Such supramodal responses to emotion would be of use in generating appropriate responses to information regarding either facial or vocal emotion.
Affiliation(s)
- Koji Kuraoka
- National Institute of Neuroscience, National Center of Neurology and Psychiatry, 4-1-1 Ogawa-Higashi, Kodaira, Tokyo 187-8502, Japan
29. Bizley JK, Nodal FR, Bajo VM, Nelken I, King AJ. Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cereb Cortex 2006;17:2172-89. PMID: 17135481. PMCID: PMC7116518. DOI: 10.1093/cercor/bhl128.
Abstract
Recent studies, conducted almost exclusively in primates, have shown that several cortical areas usually associated with modality-specific sensory processing are subject to influences from other senses. Here we demonstrate using single-unit recordings and estimates of mutual information that visual stimuli can influence the activity of units in the auditory cortex of anesthetized ferrets. In many cases, these units were also acoustically responsive and frequently transmitted more information in their spike discharge patterns in response to paired visual-auditory stimulation than when either modality was presented by itself. For each stimulus, this information was conveyed by a combination of spike count and spike timing. Even in primary auditory areas (primary auditory cortex [A1] and anterior auditory field [AAF]), approximately 15% of recorded units were found to have nonauditory input. This proportion increased in the higher level fields that lie ventral to A1/AAF and was highest in the anterior ventral field, where nearly 50% of the units were found to be responsive to visual stimuli only and a further quarter to both visual and auditory stimuli. Within each field, the pure-tone response properties of neurons sensitive to visual stimuli did not differ in any systematic way from those of visually unresponsive neurons. Neural tracer injections revealed direct inputs from visual cortex into auditory cortex, indicating a potential source of origin for the visual responses. Primary visual cortex projects sparsely to A1, whereas higher visual areas innervate auditory areas in a field-specific manner. These data indicate that multisensory convergence and integration are features common to all auditory cortical areas but are especially prevalent in higher areas.
Affiliation(s)
- Jennifer K Bizley
- Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford OX1 3PT, UK
30. Ghazanfar AA, Maier JX, Hoffman KL, Logothetis NK. Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. J Neurosci 2005;25:5004-12. PMID: 15901781. PMCID: PMC6724848. DOI: 10.1523/jneurosci.0799-05.2005.
Abstract
In the social world, multiple sensory channels are used concurrently to facilitate communication. Among human and nonhuman primates, faces and voices are the primary means of transmitting social signals (Adolphs, 2003; Ghazanfar and Santos, 2004). Primates recognize the correspondence between species-specific facial and vocal expressions (Massaro, 1998; Ghazanfar and Logothetis, 2003; Izumi and Kojima, 2004), and these visual and auditory channels can be integrated into unified percepts to enhance detection and discrimination. Where and how such communication signals are integrated at the neural level are poorly understood. In particular, it is unclear what role "unimodal" sensory areas, such as the auditory cortex, may play. We recorded local field potential activity, the signal that best correlates with human imaging and event-related potential signals, in both the core and lateral belt regions of the auditory cortex in awake behaving rhesus monkeys while they viewed vocalizing conspecifics. We demonstrate unequivocally that the primate auditory cortex integrates facial and vocal signals through enhancement and suppression of field potentials in both the core and lateral belt regions. The majority of these multisensory responses were specific to face/voice integration, and the lateral belt region shows a greater frequency of multisensory integration than the core region. These multisensory processes in the auditory cortex likely occur via reciprocal interactions with the superior temporal sulcus.
Affiliation(s)
- Asif A Ghazanfar
- Max Planck Institute for Biological Cybernetics, 72076 Tuebingen, Germany
31. Meredith MA, Keniston LR, Dehner LR, Clemo HR. Crossmodal projections from somatosensory area SIV to the auditory field of the anterior ectosylvian sulcus (FAES) in cat: further evidence for subthreshold forms of multisensory processing. Exp Brain Res 2006;172:472-84. PMID: 16501962. DOI: 10.1007/s00221-006-0356-3.
Abstract
To date, evaluation of the neuronal basis for multisensory processing has focused on the convergence pattern that provides excitation from more than one sensory modality. However, a recent study (Dehner et al. in Cereb Cortex 14:387-401, 2004) has demonstrated excitatory-inhibitory multisensory effects that do not follow this conventional pattern and the present investigation documented a similar example of subthreshold cross-modal effects. Neuroanatomical tracers revealed that pyramidal neurons of the somatosensory area SIV project to the auditory field of the anterior ectosylvian sulcus (FAES), but subsequent electrophysiological tests showed that stimulation of SIV failed to elicit the expected orthodromic responses in FAES. Instead, combined auditory-SIV stimulation significantly suppressed FAES responses to auditory cues in approximately 25% of the neurons tested, and facilitated responses in another 5%. These modulatory responses in auditory FAES were similar in kind to those observed in somatosensory SIV and, as such, comprise further evidence for subthreshold forms of multisensory processing in cortex. Consequently, it seems likely that subthreshold cross-modal effects may impact other apparently 'unimodal' areas of the brain.
Affiliation(s)
- M Alex Meredith
- Department of Anatomy and Neurobiology, Virginia Commonwealth University School of Medicine, Richmond, VA 23298, USA
32. Ross RG, Heinlein S, Zerbe GO, Radant A. Saccadic eye movement task identifies cognitive deficits in children with schizophrenia, but not in unaffected child relatives. J Child Psychol Psychiatry 2005;46:1354-62. PMID: 16313436. DOI: 10.1111/j.1469-7610.2005.01437.x.
Abstract
BACKGROUND: The delayed oculomotor response (DOR) task requires response inhibition followed by movement of gaze towards a known spatial location without a current stimulus. Abnormalities in response inhibition and in the spatial accuracy of the eye movement are found in individuals with schizophrenia and in many of their relatives, supporting the use of these saccadic abnormalities as endophenotypes in genetic studies. It is unknown whether school-age children, either with psychosis or as relatives of a schizophrenic proband, can be included. METHOD: One hundred eighty-seven children aged 5.8-16.0 years (45 children with childhood-onset schizophrenia, 64 children with a first-degree relative with schizophrenia, and 84 typically developing children) completed DOR tasks with 1- and 3-second delays. RESULTS: Children with childhood-onset schizophrenia demonstrated impaired response inhibition and impaired spatial accuracy compared to both relatives and typically developing children; however, relatives and typically developing children did not differ from each other. CONCLUSIONS: Children with childhood-onset schizophrenia have saccadic abnormalities similar to those found in adults with schizophrenia, supporting the continuity of executive function deficits from childhood-onset to adolescent- and adult-onset schizophrenia. However, saccadic tasks are not sensitive to genetic risk in non-psychotic children, and 6-15-year-old children should not be included in genetic studies utilizing this endophenotype.
Affiliation(s)
- Randal G Ross
- University of Colorado Health Sciences Center, CO, USA
33. Bon L, Lucchetti C. Auditory environmental cells and visual fixation effect in area 8B of macaque monkey. Exp Brain Res 2005;168:441-9. PMID: 16317576. DOI: 10.1007/s00221-005-0197-5.
Abstract
Area 8B may be treated as part of either the prefrontal cortex or the premotor cortex. Previous investigations showed an involvement of area 8B in both eye and ear motor control and in auditory perception. In this report, we studied 139 neurons in three macaque monkeys; of these, 32 neurons showed activity related to environmental auditory stimuli. Fifteen of these auditory cells (15/32) had a firing discharge that was inhibited during the execution of visual fixation. The remaining 107 units presented complex or indefinable behaviour. The presence of auditory environmental cells whose activity is related predominantly to the voices of persons (the researchers) suggests that area 8B may be involved in auditory cross-modal association during natural behaviour. The inhibitory effects during visual fixation suggest that area 8B is part of the inhibitory network preventing gaze shifts toward an auditory stimulus. This may be a consequence of the engagement of attention during fixation, which may affect auditory perception. Both aspects indicate that area 8B is involved in high-level cognitive processes in audition and orienting.
Affiliation(s)
- Leopoldo Bon
- Department of Biomedical Sciences, Section of Physiology and Animal Facilities Center, University of Modena and Reggio Emilia, Via Campi 287, 41100, Modena, Italy
34. Schenberg LC, Póvoa RMF, Costa ALP, Caldellas AV, Tufik S, Bittencourt AS. Functional specializations within the tectum defense systems of the rat. Neurosci Biobehav Rev 2005;29:1279-98. PMID: 16087233. DOI: 10.1016/j.neubiorev.2005.05.006.
Abstract
Here we review the differential contribution of the periaqueductal gray matter (PAG) and superior colliculus (SC) to the generation of rat defensive behaviors. The results of studies involving sine-wave and rectangular pulse electrical stimulation and chemical (NMDA) stimulation are summarized. Stimulation of SC and PAG produced freezing and flight behaviors along with exophthalmus (fully opened bulged eyes), micturition and defecation. The columnar organization of the PAG was evident in the results obtained. Defecation was elicited primarily by lateral PAG stimulation, while the remaining defensive behaviors were similarly elicited by lateral and dorsolateral PAG stimulation, although with the lowest thresholds in the dorsolateral column. Conversely, the ventrolateral PAG did not appear to participate in unconditioned defensive behaviors, which were only elicited by high intensity stimulation likely to encroach on adjacent regions. In the SC, the most important differences relative to the PAG were the lack of stimulation-evoked jumping in both intermediate and deep layers, and of NMDA-evoked galloping in intermediate layers. Therefore, we conclude that the SC may be involved only in the increased attentiveness (exophthalmus, immobility) and restlessness (trotting) of prey species exposed to the cues of a nearby predator. These responses may be distinct from the full-blown flight reaction that is mediated by the dorsolateral and lateral PAG. However, other evidence suggests that stimulation schedule, environment dimensions and rat strain may also influence outcomes. Overall, our results suggest a dynamically organized representation of defensive behaviors in the midbrain tectum.
Affiliation(s)
- L C Schenberg
- Departamento de Ciências Fisiológicas--Centro Biomédico (Edifício do Programa de Pós-Graduação em Ciências Fisiológicas), Universidade Federal do Espírito Santo, Av. Marechal Campos 1468 (Maruípe), 29043-125, Vitória, ES, Brazil
35. Bell AH, Meredith MA, Van Opstal AJ, Munoz DP. Crossmodal integration in the primate superior colliculus underlying the preparation and initiation of saccadic eye movements. J Neurophysiol 2005;93:3659-73. PMID: 15703222. DOI: 10.1152/jn.01214.2004.
Abstract
Saccades to combined audiovisual stimuli often have reduced saccadic reaction times (SRTs) compared with those to unimodal stimuli. Neurons in the intermediate/deep layers of the superior colliculus (dSC) are capable of integrating converging sensory inputs to influence the time to saccade initiation. To identify how neural processing in the dSC contributes to reducing SRTs to audiovisual stimuli, we recorded activity from dSC neurons while monkeys generated saccades to visual or audiovisual stimuli. To evoke crossmodal interactions of varying strength, we used auditory and visual stimuli of different intensities, presented either in spatial alignment or to opposite hemifields. Spatially aligned audiovisual stimuli evoked the shortest SRTs. In the case of low-intensity stimuli, the response to the auditory component of the aligned audiovisual target increased the activity preceding the response to the visual component, accelerating the onset of the visual response and facilitating the generation of shorter-latency saccades. In the case of high-intensity stimuli, the auditory and visual responses occurred much closer together in time and so there was little opportunity for the auditory stimulus to influence previsual activity. Instead, the reduction in SRT for high-intensity, aligned audiovisual stimuli was correlated with increased premotor activity (activity after visual burst but preceding saccade-aligned burst). These data provide a link between changes in neural activity related to stimulus modality with changes in behavior. They further demonstrate how crossmodal interactions are not limited to the initial sensory activity but can also influence premotor activity in the SC.
Affiliation(s)
- Andrew H Bell
- Centre for Neuroscience Studies, Queen's University, Kingston, Ontario, Canada K7L 3N6
36. Barraclough NE, Xiao D, Baker CI, Oram MW, Perrett DI. Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. J Cogn Neurosci 2005;17:377-91. PMID: 15813999. DOI: 10.1162/0898929053279586.
Abstract
Processing of complex visual stimuli comprising facial movements, hand actions, and body movements is known to occur in the superior temporal sulcus (STS) of humans and nonhuman primates. The STS is also thought to play a role in the integration of multimodal sensory input. We investigated whether STS neurons coding the sight of actions also integrated the sound of those actions. For 23% of neurons responsive to the sight of an action, the sound of that action significantly modulated the visual response. The sound of the action increased or decreased the visually evoked response for an equal number of neurons. In the neurons whose visual response was increased by the addition of sound (but not those neurons whose responses were decreased), the audiovisual integration was dependent upon the sound of the action matching the sight of the action. These results suggest that neurons in the STS form multisensory representations of observed actions.
|
37
|
Bell AH, Fecteau JH, Munoz DP. Using auditory and visual stimuli to investigate the behavioral and neuronal consequences of reflexive covert orienting. J Neurophysiol 2003; 91:2172-84. [PMID: 14702335 DOI: 10.1152/jn.01080.2003] [Citation(s) in RCA: 72] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Reflexively orienting toward a peripheral cue can influence subsequent responses to a target, depending on when and where the cue and target appear relative to each other. At short delays between the cue and target [cue-target onset asynchrony (CTOA)], subjects are faster to respond when the two appear at the same location, an effect referred to as reflexive attentional capture. At longer CTOAs, subjects are slower to respond when the two appear at the same location, an effect referred to as inhibition of return (IOR). Recent evidence suggests that these phenomena originate from sensory interactions between the cue- and target-related responses. The capture of attention originates from a strong target-related response, derived from the overlap of the cue- and target-related activities, whereas IOR corresponds to a weaker target-aligned response. If such interactions are responsible, then modifying their nature should impact the neuronal and behavioral outcome. Monkeys performed a cue-target saccade task featuring visual and auditory cues while neural activity was recorded from the superior colliculus (SC). Compared with visually evoked responses, auditory responses are weaker and occur earlier, thereby decreasing the likelihood of interactions between these signals. Similar to previous studies, visual stimuli evoked reflexive attentional capture at a short CTOA (60 ms) and IOR at longer CTOAs (160 and 610 ms), with corresponding changes in the target-aligned activity in the SC. Auditory cues used in this study failed to elicit either a behavioral effect or a modification of SC activity at any CTOA, supporting the hypothesis that reflexive orienting is mediated by sensory interactions between the cue and target stimuli.
Affiliation(s)
- Andrew H Bell
- Centre for Neuroscience Studies, Canadian Institutes of Health Research Group in Sensory-Motor Systems, Department of Physiology, Queen's University, Kingston, Ontario K7L 3N6, Canada
|