1. Park WJ, Fine I. The perception of auditory motion in sighted and early blind individuals. Proc Natl Acad Sci U S A 2023; 120:e2310156120. PMID: 38015842; PMCID: PMC10710053; DOI: 10.1073/pnas.2310156120.
Abstract
Motion perception is a fundamental sensory task that plays a critical evolutionary role. In vision, motion processing is classically described using a motion energy model with spatiotemporally nonseparable filters suited for capturing the smooth continuous changes in spatial position over time afforded by moving objects. However, it is still not clear whether the filters underlying auditory motion discrimination are also continuous motion detectors or infer motion from comparing discrete sound locations over time (spatiotemporally separable). We used a psychophysical reverse correlation paradigm, where participants discriminated the direction of a motion signal in the presence of spatiotemporal noise, to determine whether the filters underlying auditory motion discrimination were spatiotemporally separable or nonseparable. We then examined whether these auditory motion filters were altered as a result of early blindness. We found that both sighted and early blind individuals have separable filters. However, early blind individuals show increased sensitivity to auditory motion, with reduced susceptibility to noise and filters that were more accurate in detecting motion onsets/offsets. Model simulations suggest that this reliance on separable filters is optimal given the limited spatial resolution of auditory input.
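The separable/nonseparable distinction at the heart of this study can be made concrete with a toy space-time filter pair. The sketch below (numpy, with illustrative parameters only, not the filters estimated in the paper) uses the fact that a separable filter factorizes into a temporal profile times a spatial profile, so its space-time matrix has rank 1, whereas a motion-energy-style filter oriented in space-time does not.

```python
import numpy as np

# Toy space-time filters on a 64 (time) x 64 (space) grid.
t = np.linspace(-1, 1, 64)[:, None]   # time axis (column vector)
x = np.linspace(-1, 1, 64)[None, :]   # space axis (row vector)

# Separable: outer product of a temporal kernel and a spatial Gaussian.
separable = (t * np.exp(-t**2 / 0.1)) * np.exp(-x**2 / 0.1)

# Nonseparable: a Gabor tilted in space-time; the tilt encodes a velocity.
velocity = 1.0  # illustrative value
nonseparable = np.exp(-(x**2 + t**2) / 0.2) * np.cos(2 * np.pi * 3 * (x - velocity * t))

# Separability is equivalent to the space-time matrix having rank 1.
rank_sep = np.linalg.matrix_rank(separable, tol=1e-8)
rank_nonsep = np.linalg.matrix_rank(nonseparable, tol=1e-8)
print(rank_sep, rank_nonsep)  # separable: rank 1; nonseparable: rank 2
```

The rank test is a convenient proxy for the fitted-filter analysis in the paper: a reverse-correlation kernel that factorizes into space and time components cannot signal direction by itself and must compare discrete locations over time.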
Affiliations: Woon Ju Park and Ione Fine, Department of Psychology, University of Washington, Seattle, WA 98195, USA.
2. Fine I, Park WJ. Do you hear what I see? How do early blind individuals experience object motion? Philos Trans R Soc Lond B Biol Sci 2023; 378:20210460. PMID: 36511418; PMCID: PMC9745882; DOI: 10.1098/rstb.2021.0460.
Abstract
One of the most important tasks for 3D vision is tracking the movement of objects in space. The ability of early blind individuals to understand motion in the environment from noisy and unreliable auditory information is an impressive example of cortical adaptation that is only just beginning to be understood. Here, we compare visual and auditory motion processing, and discuss the effect of early blindness on the perception of auditory motion. Blindness leads to cross-modal recruitment of the visual motion area hMT+ for auditory motion processing. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion. We discuss how this dramatic shift in the cortical basis of motion processing might influence the perceptual experience of motion in early blind individuals. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliations: Ione Fine and Woon Ju Park, Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA.
3. Representation of Auditory Motion Directions and Sound Source Locations in the Human Planum Temporale. J Neurosci 2019; 39:2208-2220. PMID: 30651333; DOI: 10.1523/jneurosci.2289-18.2018.
Abstract
The ability to compute the location and direction of sounds is a crucial perceptual skill to efficiently interact with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis of motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were significantly distinct. Altogether, our results demonstrate that the hPT codes for auditory motion and location but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.
Significance Statement
Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study, therefore, sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.
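The logic of the cross-condition decoding reported here (train a decoder on one condition, test it on another) can be sketched with a toy nearest-centroid classifier on simulated voxel patterns. Everything below is synthetic, and the shared left/right pattern geometry is built in by construction, so this illustrates only the analysis logic, not the study's result.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials = 50, 40

# Hypothetical shared geometry: "left" and "right" evoke the same voxel
# patterns whether the sound is moving or static, plus trial noise.
left_sig = rng.normal(0, 1, n_voxels)
right_sig = rng.normal(0, 1, n_voxels)

def trials(sig, n, noise=1.0):
    return sig + rng.normal(0, noise, (n, n_voxels))

motion_left, motion_right = trials(left_sig, n_trials), trials(right_sig, n_trials)
static_left, static_right = trials(left_sig, n_trials), trials(right_sig, n_trials)

# Cross-condition decoding: fit class centroids on MOTION trials...
c_left, c_right = motion_left.mean(0), motion_right.mean(0)

def classify(X):
    d_left = np.linalg.norm(X - c_left, axis=1)
    d_right = np.linalg.norm(X - c_right, axis=1)
    return d_left < d_right  # True -> classified "left"

# ...and test on STATIC trials. Above-chance accuracy indicates a
# left/right pattern geometry shared across conditions.
acc = (classify(static_left).mean() + (~classify(static_right)).mean()) / 2
print(round(acc, 2))
```

In real MVPA pipelines the classifier is typically a linear SVM or logistic regression with cross-validation, but the nearest-centroid version keeps the generalization-across-conditions idea visible in a few lines.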
4. Chaplin TA, Rosa MGP, Lui LL. Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex. Front Neural Circuits 2018; 12:93. PMID: 30416431; PMCID: PMC6212655; DOI: 10.3389/fncir.2018.00093.
Abstract
The ability of animals to detect motion is critical for survival, and errors or even delays in motion perception may prove costly. In the natural world, moving objects in the visual field often produce concurrent sounds. Thus, it can be highly advantageous to detect motion from sensory signals of either modality, and to integrate them to produce more reliable motion perception. A great deal of progress has been made in understanding how visual motion perception is governed by the activity of single neurons in the primate cerebral cortex, but far less progress has been made in understanding both auditory motion and audiovisual motion integration. Here, we review the key cortical regions for motion processing, focusing on translational motion. We compare the representations of space and motion in the visual and auditory systems, and examine how single neurons in these two sensory systems encode the direction of motion. We also discuss the way in which humans integrate audio and visual motion cues, and the regions of the cortex that may mediate this process.
Affiliations: Tristan A Chaplin, Marcello G P Rosa, and Leo L Lui, Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, VIC, Australia, and Australian Research Council (ARC) Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, VIC, Australia.
5. Neural tracking of auditory motion is reflected by delta phase and alpha power of EEG. Neuroimage 2018; 181:683-691. PMID: 30053517; DOI: 10.1016/j.neuroimage.2018.07.054.
Abstract
It is of increasing practical interest to be able to decode the spatial characteristics of an auditory scene from electrophysiological signals. However, the cortical representation of auditory space is not well characterized, and it is unclear how cortical activity reflects the time-varying location of a moving sound. Recently, we demonstrated that cortical response measures to discrete noise bursts can be decoded to determine their origin in space. Here we build on these findings to investigate the cortical representation of a continuously moving auditory stimulus using scalp recorded electroencephalography (EEG). In a first experiment, subjects listened to pink noise over headphones which was spectro-temporally modified to be perceived as randomly moving on a semi-circular trajectory in the horizontal plane. While subjects listened to the stimuli, we recorded their EEG using a 128-channel acquisition system. The data were analysed by 1) building a linear regression model (decoder) mapping the relationship between the stimulus location and a training set of EEG data, and 2) using the decoder to reconstruct an estimate of the time-varying sound source azimuth from the EEG data. The results showed that we can decode sound trajectory with a reconstruction accuracy significantly above chance level. Specifically, we found that the phase of delta (<2 Hz) and power of alpha (8-12 Hz) EEG track the dynamics of a moving auditory object. In a follow-up experiment, we replaced the noise with pulse train stimuli containing only interaural level and time differences (ILDs and ITDs respectively). This allowed us to investigate whether our trajectory decoding is sensitive to both acoustic cues. We found that the sound trajectory can be decoded for both ILD and ITD stimuli. Moreover, their neural signatures were similar and even allowed successful cross-cue classification. This supports the notion of integrated processing of ILD and ITD at the cortical level. These results are particularly relevant for application in devices such as cognitively controlled hearing aids and for the evaluation of virtual acoustic environments.
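The two-step decoder described above (fit a linear mapping on training EEG, then reconstruct azimuth from held-out EEG) can be sketched as a regularized (ridge) linear regression. The simulation below is a toy forward model with a single time lag and made-up mixing weights and noise levels; real stimulus-reconstruction decoders use many time-lagged features per channel.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ch, n_train, n_test = 16, 2000, 500

# Hypothetical forward model: each EEG channel mixes the sound azimuth
# with additive noise (one lag only, for brevity).
mixing = rng.normal(0, 1, n_ch)
az_train = np.cumsum(rng.normal(0, 1, n_train))   # random-walk trajectory
az_test = np.cumsum(rng.normal(0, 1, n_test))
eeg_train = az_train[:, None] * mixing + rng.normal(0, 5, (n_train, n_ch))
eeg_test = az_test[:, None] * mixing + rng.normal(0, 5, (n_test, n_ch))

# Ridge regression decoder: w = (X'X + lambda*I)^-1 X'y
lam = 1.0
X, y = eeg_train, az_train
w = np.linalg.solve(X.T @ X + lam * np.eye(n_ch), X.T @ y)

# Reconstruction accuracy: correlation between decoded and true azimuth.
decoded = eeg_test @ w
r = np.corrcoef(decoded, az_test)[0, 1]
print(round(r, 2))
```

Chance level in such analyses is usually estimated by permutation (decoding shuffled or mismatched trajectories), which the sketch omits.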
6. Poirier C, Baumann S, Dheerendra P, Joly O, Hunter D, Balezeau F, Sun L, Rees A, Petkov CI, Thiele A, Griffiths TD. Auditory motion-specific mechanisms in the primate brain. PLoS Biol 2017; 15:e2001379. PMID: 28472038; PMCID: PMC5417421; DOI: 10.1371/journal.pbio.2001379.
Abstract
This work examined the mechanisms underlying auditory motion processing in the auditory cortex of awake monkeys using functional magnetic resonance imaging (fMRI). We tested to what extent auditory motion analysis can be explained by the linear combination of static spatial mechanisms, spectrotemporal processes, and their interaction. We found that the posterior auditory cortex, including A1 and the surrounding caudal belt and parabelt, is involved in auditory motion analysis. Static spatial and spectrotemporal processes were able to fully explain motion-induced activation in most parts of the auditory cortex, including A1, but not in circumscribed regions of the posterior belt and parabelt cortex. We show that in these regions motion-specific processes contribute to the activation, providing the first demonstration that auditory motion is not simply deduced from changes in static spatial location. These results demonstrate that parallel mechanisms for motion and static spatial analysis coexist within the auditory dorsal stream.
Affiliations: All authors: Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, Tyne and Wear, United Kingdom.
7. Callan A, Callan D, Ando H. The Importance of Spatiotemporal Information in Biological Motion Perception: White Noise Presented with a Step-like Motion Activates the Biological Motion Area. J Cogn Neurosci 2016; 29:277-285. PMID: 27647281; DOI: 10.1162/jocn_a_01046.
Abstract
Humans can easily recognize the motion of living creatures using only a handful of point-lights that describe the motion of the main joints (biological motion perception). This special ability to perceive the motion of animate objects signifies the importance of the spatiotemporal information in perceiving biological motion. The posterior STS (pSTS) and posterior middle temporal gyrus (pMTG) region have been established by many functional neuroimaging studies as a locus for biological motion perception. Because listening to a walking human also activates the pSTS/pMTG region, the region has been proposed to be supramodal in nature. In this study, we investigated whether the spatiotemporal information from simple auditory stimuli is sufficient to activate this biological motion area. We compared spatially moving white noise, having a running-like tempo that was consistent with biological motion, with stationary white noise. The moving-minus-stationary contrast showed significant differences in activation of the pSTS/pMTG region. Our results suggest that the spatiotemporal information of the auditory stimuli is sufficient to activate the biological motion area.
Affiliations: Akiko Callan, Daniel Callan, and Hiroshi Ando, CiNet, National Institute of Information and Communications Technology, and Osaka University.
8. Velocity Selective Networks in Human Cortex Reveal Two Functionally Distinct Auditory Motion Systems. PLoS One 2016; 11:e0157131. PMID: 27294673; PMCID: PMC4905637; DOI: 10.1371/journal.pone.0157131.
Abstract
The auditory system encounters motion cues through an acoustic object’s movement or rotation of the listener’s head in a stationary sound field, generating a wide range of naturally occurring velocities from a few to several hundred degrees per second. The angular velocity of moving acoustic objects relative to a listener is typically slow and does not exceed tens of degrees per second, whereas head rotations in a stationary acoustic field may generate fast-changing spatial cues in the order of several hundred degrees per second. We hypothesized that these two types of systems (i.e., encoding slow movements of an object or fast head rotations) may engage functionally distinct substrates in processing spatially dynamic auditory cues, with the latter potentially involved in maintaining perceptual constancy in a stationary field during head rotations and therefore possibly involving corollary-discharge mechanisms in premotor cortex. Using fMRI, we examined cortical response patterns to sound sources moving at a wide range of velocities in 3D virtual auditory space. We found a significant categorical difference between fast and slow moving sounds, with stronger activations in response to higher velocities in the posterior superior temporal regions, the planum temporale, and notably the premotor ventral-rostral (PMVr) area implicated in planning neck and head motor functions.
9. Carlile S, Leung J. The Perception of Auditory Motion. Trends Hear 2016; 20:2331216516644254. PMID: 27094029; PMCID: PMC4871213; DOI: 10.1177/2331216516644254.
Abstract
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
Affiliations: Simon Carlile, School of Medical Sciences, University of Sydney, NSW, Australia, and Starkey Hearing Research Center, Berkeley, CA, USA; Johahn Leung, School of Medical Sciences, University of Sydney, NSW, Australia.
10.
Abstract
The auditory system derives locations of sound sources from spatial cues provided by the interaction of sound with the head and external ears. Those cues are analyzed in specific brainstem pathways and then integrated as cortical representation of locations. The principal cues for horizontal localization are interaural time differences (ITDs) and interaural differences in sound level (ILDs). Vertical and front/back localization rely on spectral-shape cues derived from direction-dependent filtering properties of the external ears. The likely first sites of analysis of these cues are the medial superior olive (MSO) for ITDs, lateral superior olive (LSO) for ILDs, and dorsal cochlear nucleus (DCN) for spectral-shape cues. Localization in distance is much less accurate than that in horizontal and vertical dimensions, and interpretation of the basic cues is influenced by additional factors, including acoustics of the surroundings and familiarity of source spectra and levels. Listeners are quite sensitive to sound motion, but it remains unclear whether that reflects specific motion detection mechanisms or simply detection of changes in static location. Intact auditory cortex is essential for normal sound localization. Cortical representation of sound locations is highly distributed, with no evidence for point-to-point topography. Spatial representation is strictly contralateral in laboratory animals that have been studied, whereas humans show a prominent right-hemisphere dominance.
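The two principal horizontal cues described above can be made concrete with textbook formulas: Woodworth's spherical-head approximation for ITD, and the decibel level ratio for ILD. The head radius and speed of sound below are conventional illustrative values, not taken from this chapter.

```python
import numpy as np

# Woodworth's spherical-head approximation for the interaural time
# difference (ITD) of a distant source at azimuth theta:
#   ITD = (a / c) * (theta + sin(theta))
# with head radius a (meters) and speed of sound c (m/s).
def itd_woodworth(theta_deg, a=0.0875, c=343.0):
    theta = np.radians(theta_deg)
    return (a / c) * (theta + np.sin(theta))

# Interaural level difference (ILD): the level ratio between the two
# ears, expressed in decibels of sound pressure.
def ild_db(p_near, p_far):
    return 20 * np.log10(p_near / p_far)

itd_90 = itd_woodworth(90.0)   # source directly to one side
print(round(itd_90 * 1e6))     # ~656 microseconds
print(ild_db(2.0, 1.0))        # ~6 dB for a 2x pressure ratio
```

The formula reproduces the familiar upper bound of roughly 600-700 microseconds for a human-sized head, consistent with the MSO's operating range; ILDs, by contrast, grow strongly with frequency because head shadowing is frequency dependent, which the simple ratio above does not capture.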
Affiliations: John C Middlebrooks, Departments of Otolaryngology, Neurobiology and Behavior, Cognitive Sciences, and Biomedical Engineering, University of California at Irvine, Irvine, CA, USA.
11. Chung Y, Delgutte B, Colburn HS. Modeling binaural responses in the auditory brainstem to electric stimulation of the auditory nerve. J Assoc Res Otolaryngol 2014; 16:135-58. PMID: 25348578; DOI: 10.1007/s10162-014-0492-6.
Abstract
Bilateral cochlear implants (CIs) provide improvements in sound localization and speech perception in noise over unilateral CIs. However, the benefits arise mainly from the perception of interaural level differences, while bilateral CI listeners' sensitivity to interaural time difference (ITD) is poorer than normal. To help understand this limitation, a set of ITD-sensitive neural models was developed to study binaural responses to electric stimulation. Our working hypothesis was that central auditory processing is normal with bilateral CIs so that the abnormality in the response to electric stimulation at the level of the auditory nerve fibers (ANFs) is the source of the limited ITD sensitivity. A descriptive model of ANF response to both acoustic and electric stimulation was implemented and used to drive a simplified biophysical model of neurons in the medial superior olive (MSO). The model's ITD sensitivity was found to depend strongly on the specific configurations of membrane and synaptic parameters for different stimulation rates. Specifically, stronger excitatory synaptic inputs and faster membrane responses were required for the model neurons to be ITD-sensitive at high stimulation rates, whereas weaker excitatory synaptic input and slower membrane responses were necessary at low stimulation rates, for both electric and acoustic stimulation. This finding raises the possibility of frequency-dependent differences in neural mechanisms of binaural processing; limitations in ITD sensitivity with bilateral CIs may be due to a mismatch between stimulation rate and cell parameters in ITD-sensitive neurons.
Affiliations: Yoojin Chung, Biomedical Engineering Department, Hearing Research Center, Boston University, Boston, MA 02215, USA.
12. Shiell MM. Tonotopic organization of V5/MT+ in congenital anophthalmia: implications for auditory motion processing and metamodal cross-modal reorganization. J Neurosci 2014; 34:3807-9. PMID: 24623759; PMCID: PMC6705276; DOI: 10.1523/jneurosci.0150-14.2014.
Affiliations: Martha M Shiell, Integrated Program in Neuroscience, McGill University, Montreal, Quebec H3A 2B4, Canada.
13. Lau C, Zhang JW, Cheng JS, Zhou IY, Cheung MM, Wu EX. Noninvasive fMRI investigation of interaural level difference processing in the rat auditory subcortex. PLoS One 2013; 8:e70706. PMID: 23940631; PMCID: PMC3733930; DOI: 10.1371/journal.pone.0070706.
Abstract
Objective: Interaural level difference (ILD) is the difference in sound pressure level (SPL) between the two ears and is one of the key physical cues used by the auditory system in sound localization. Our current understanding of ILD encoding has come primarily from invasive studies of individual structures, which have implicated subcortical structures such as the cochlear nucleus (CN), superior olivary complex (SOC), lateral lemniscus (LL), and inferior colliculus (IC). Noninvasive brain imaging enables studying ILD processing in multiple structures simultaneously.
Methods: In this study, blood oxygenation level-dependent (BOLD) functional magnetic resonance imaging (fMRI) is used for the first time to measure changes in the hemodynamic responses in the adult Sprague-Dawley rat subcortex during binaural stimulation with different ILDs.
Results and Significance: Consistent responses are observed in the CN, SOC, LL, and IC in both hemispheres. Voxel-by-voxel analysis of the change of the response amplitude with ILD indicates statistically significant ILD dependence in dorsal LL, IC, and a region containing parts of the SOC and LL. For all three regions, the larger amplitude response is located in the hemisphere contralateral to the higher SPL stimulus. These findings are supported by region of interest analysis. fMRI shows that ILD dependence occurs in both hemispheres and at multiple subcortical levels of the auditory system. This study is the first step towards future studies examining subcortical binaural processing and sound localization in animal models of hearing.
Affiliations: Condon Lau, Laboratory of Biomedical Imaging and Signal Processing, The University of Hong Kong, Pokfulam, Hong Kong, China.
14. Ahveninen J, Kopčo N, Jääskeläinen IP. Psychophysics and neuronal bases of sound localization in humans. Hear Res 2013; 307:86-97. PMID: 23886698; DOI: 10.1016/j.heares.2013.07.008.
Abstract
Localization of sound sources is a considerable computational challenge for the human brain. Whereas the visual system can process basic spatial information in parallel, the auditory system lacks a straightforward correspondence between external spatial locations and sensory receptive fields. Consequently, the question of how different acoustic features supporting spatial hearing are represented in the central nervous system is still open. Functional neuroimaging studies in humans have provided evidence for a posterior auditory "where" pathway that encompasses non-primary auditory cortex areas, including the planum temporale (PT) and posterior superior temporal gyrus (STG), which are strongly activated by horizontal sound direction changes, distance changes, and movement. However, these areas are also activated by a wide variety of other stimulus features, posing a challenge for the interpretation that the underlying areas are purely spatial. This review discusses behavioral and neuroimaging studies on sound localization, and some of the competing models of representation of auditory space in humans. This article is part of a Special Issue entitled Human Auditory Neuroimaging.
Affiliations: Jyrki Ahveninen, Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, MA, USA.
15. Lewald J, Getzmann S. Ventral and dorsal visual pathways support auditory motion processing in the blind: evidence from electrical neuroimaging. Eur J Neurosci 2013; 38:3201-9. DOI: 10.1111/ejn.12306.
Affiliations: Jörg Lewald and Stephan Getzmann, Faculty of Psychology, Ruhr University Bochum, D-44780 Bochum, Germany, and Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany.
16. Magezi DA, Buetler KA, Chouiter L, Annoni JM, Spierer L. Electrical neuroimaging during auditory motion aftereffects reveals that auditory motion processing is motion sensitive but not direction selective. J Neurophysiol 2012; 109:321-31. PMID: 23076114; DOI: 10.1152/jn.00625.2012.
Abstract
Following prolonged exposure to adaptor sounds moving in a single direction, participants may perceive stationary-probe sounds as moving in the opposite direction [direction-selective auditory motion aftereffect (aMAE)] and be less sensitive to motion of any probe sounds that are actually moving (motion-sensitive aMAE). The neural mechanisms of aMAEs, notably whether they reflect adaptation of direction-selective motion detectors as found in vision, are presently unknown; resolving this question would provide critical insight into auditory motion processing. We measured human behavioral responses and auditory evoked potentials to probe sounds following four types of moving-adaptor sounds: leftward and rightward unidirectional, bidirectional, and stationary. Behavioral data replicated both direction-selective and motion-sensitive aMAEs. Electrical neuroimaging analyses of auditory evoked potentials to stationary probes revealed no significant difference in either global field power (GFP) or scalp topography between the leftward and rightward conditions, suggesting that aMAEs are not based on adaptation of direction-selective motion detectors. By contrast, the bidirectional and stationary conditions differed significantly in stationary-probe GFP at 200 ms poststimulus onset without concomitant topographic modulation, indicative of a difference in response strength between statistically indistinguishable intracranial generators. The magnitude of this GFP difference was positively correlated with the magnitude of the motion-sensitive aMAE, supporting the functional relevance of the neurophysiological measures. Electrical source estimations revealed that the GFP difference followed from a modulation of activity in predominantly right-hemisphere frontal-temporal-parietal brain regions previously implicated in auditory motion processing. Our collective results suggest that auditory motion processing relies on motion-sensitive but, in contrast to vision, non-direction-selective mechanisms.
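The GFP measure used above has a standard definition: the spatial standard deviation, across electrodes, of the average-referenced scalp potentials at each time point (after Lehmann and Skrandies). A minimal sketch, with an illustrative array shape and function name that are not taken from the study:

```python
import numpy as np

def global_field_power(eeg):
    """GFP per time point: the spatial standard deviation, across
    electrodes, of average-referenced scalp potentials.

    eeg : array of shape (n_electrodes, n_samples), e.g. in microvolts.
    """
    # Re-reference each sample to the average of all electrodes
    avg_ref = eeg - eeg.mean(axis=0, keepdims=True)
    # Root-mean-square across the electrode dimension
    return np.sqrt((avg_ref ** 2).mean(axis=0))

# Example: 3 electrodes, 2 time points
gfp = global_field_power(np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]))
```

A larger GFP at a given latency indicates a stronger overall scalp field, which is why a GFP difference without a topographic change is interpreted as a change in response strength of otherwise similar intracranial generators.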
Affiliation(s)
- David A Magezi
- Neurology Unit, Department of Medicine, Faculty of Sciences, University of Fribourg, Fribourg, Switzerland.
|
17
|
Isenberg AL, Vaden KI, Saberi K, Muftuler LT, Hickok G. Functionally distinct regions for spatial processing and sensory motor integration in the planum temporale. Hum Brain Mapp 2012; 33:2453-63. PMID: 21932266. PMCID: PMC5242090. DOI: 10.1002/hbm.21373.
Abstract
There has been much debate recently over the functional role played by the planum temporale (PT) within the context of the dorsal auditory processing stream. Some studies indicate that regions in the PT support spatial hearing and other auditory functions, whereas others demonstrate sensory-motor response properties. This multifunctionality has led to the claim that the PT performs a common computational pattern-matching operation and then routes the signals (spatial, object, sensory-motor) into an appropriate processing stream. An alternative possibility is that the PT is functionally subdivided, with separate regions supporting various functions. We assess this possibility using a within-subject fMRI block design. DTI data were also collected to examine connectivity. There were four auditory conditions: stationary noise, moving noise, listening to pseudowords, and shadowing pseudowords (covert repetition). Contrasting the shadow and listen conditions should activate regions specific to sensory-motor processes, while contrasting the stationary and moving noise conditions should activate regions involved in spatial hearing. Subjects (N = 16) showed greater activation for shadowing in left posterior PT, area Spt, when the shadow and listen conditions were contrasted. The motion vs. stationary noise contrast revealed greater activation in a more medial and anterior portion of left PT. Seeds from these two contrasts were then used to guide the DTI analysis in an examination of connectivity via streamline tractography, which revealed different patterns of connectivity. Findings support a heterogeneous model of the PT, with functionally distinct regions for sensory-motor integration and processes involved in auditory spatial perception.
Affiliation(s)
- A. Lisette Isenberg
- Department of Cognitive Sciences, University of California, Irvine, California
- Kenneth I. Vaden
- Otolaryngology Department, Medical University of South Carolina, South Carolina
- Kourosh Saberi
- Department of Cognitive Sciences, University of California, Irvine, California
- L. Tugan Muftuler
- Tu and Yuen Center for Functional Onco-Imaging, University of California, Irvine, California
- Gregory Hickok
- Department of Cognitive Sciences, University of California, Irvine, California
|
18
|
Getzmann S, Lewald J. Cortical processing of change in sound location: smooth motion versus discontinuous displacement. Brain Res 2012; 1466:119-27. DOI: 10.1016/j.brainres.2012.05.033.
|
19
|
Abstract
The auditory system codes spatial locations in a way that deviates from the spatial representations found in other modalities. This difference is especially striking in the cortex, where neurons form topographical maps of visual and tactile space but where auditory space is represented through a population rate code. In this hemifield code, sound source location is represented in the activity of two widely tuned opponent populations, one tuned to the right and the other to the left side of auditory space. Scientists are only beginning to uncover how this coding strategy adapts to various spatial processing demands. This review presents the current understanding of auditory spatial processing in the cortex. To this end, the authors consider how various implementations of the hemifield code may exist within the auditory cortex and how these may be modulated by the stimulation and task context. As a result, a coherent set of neural strategies for auditory spatial processing emerges.
|
20
|
Redefining the Functional Organization of the Planum Temporale Region: Space, Objects, and Sensory–Motor Integration. In: The Human Auditory Cortex. 2012. DOI: 10.1007/978-1-4614-2314-0_12.
|
21
|
Wallentin M, Nielsen AH, Vuust P, Dohn A, Roepstorff A, Lund TE. BOLD response to motion verbs in left posterior middle temporal gyrus during story comprehension. Brain Lang 2011; 119:221-225. PMID: 21612817. DOI: 10.1016/j.bandl.2011.04.006.
Abstract
A primary focus within neuroimaging research on language comprehension is the distribution of semantic knowledge in the brain. Studies have shown that the left posterior middle temporal gyrus (LPMT), a region just anterior to area MT/V5, is important for the processing of complex action knowledge. It has also been found that motion verbs cause activation in LPMT. In this experiment we investigated whether this effect could be replicated in a setting resembling real-life language comprehension, i.e., without any overt behavioral task, during passive listening to a story. During fMRI, participants listened to a recording of the story "The Ugly Duckling". We incorporated a nuisance-elimination regression approach to factor out known nuisance variables: physiological noise, sound intensity, linguistic variables, and emotional content. Compared to the remaining text, clauses containing motion verbs were accompanied by a robust activation of LPMT with no other significant effects, consistent with the hypothesis that this brain region is important for processing motion knowledge, even under naturalistic language comprehension conditions.
Affiliation(s)
- Mikkel Wallentin
- Center of Functionally Integrative Neuroscience, Aarhus University Hospital, Nørrebrogade, 8000 Aarhus C, Denmark.
|
22
|
Planetta PJ, Servos P. The postcentral gyrus shows sustained fMRI activation during the tactile motion aftereffect. Exp Brain Res 2011; 216:535-44. PMID: 22120108. DOI: 10.1007/s00221-011-2957-8.
Abstract
The tactile motion aftereffect (tMAE) is a perceptual illusion in which a stationary stimulus feels as though it is moving when presented following adaptation to a unidirectionally moving tactile stimulus. Using functional magnetic resonance imaging (fMRI), we localized the brain areas responsive to tactile motion and then investigated whether these areas underlie the tMAE. Tactile stimulation was delivered to the glabrous surface of the right hand by means of a plastic cylinder with a square-wave patterned surface. In the tactile motion localizer, we contrasted periods in which the cylinder rotated at 15 rpm with periods of rest (stationary contact). Activation was observed in the contralateral (left) thalamus, postcentral gyrus, and parietal operculum. In the tMAE experiment, the cylinder rotated at 15 or 60 rpm for 2 min. The 60-rpm speed induced reliable tMAEs, whereas the 15-rpm speed did not. Of the areas activated by the tactile motion localizer, only the postcentral gyrus showed a sustained fMRI response following the offset of 60-rpm (but not 15-rpm) stimulation, presumably reflecting the illusory perception of motion.
Affiliation(s)
- Peggy J Planetta
- Department of Kinesiology and Nutrition, University of Illinois at Chicago, 1919 West Taylor Street, 650 AHSB (M/C 994), Chicago, IL 60612, USA.
|
23
|
Alink A, Euler F, Kriegeskorte N, Singer W, Kohler A. Auditory motion direction encoding in auditory cortex and high-level visual cortex. Hum Brain Mapp 2011; 33:969-78. PMID: 21692141. DOI: 10.1002/hbm.21263.
Abstract
The aim of this functional magnetic resonance imaging (fMRI) study was to identify human brain areas that are sensitive to the direction of auditory motion. Such directional sensitivity was assessed in a hypothesis-free manner by analyzing fMRI response patterns across the entire brain volume using a spherical-searchlight approach. In addition, we assessed directional sensitivity in three predefined brain areas that have been associated with auditory motion perception in previous neuroimaging studies. These were the primary auditory cortex, the planum temporale and the visual motion complex (hMT/V5+). Our whole-brain analysis revealed that the direction of sound-source movement could be decoded from fMRI response patterns in the right auditory cortex and in a high-level visual area located in the right lateral occipital cortex. Our region-of-interest-based analysis showed that the decoding of the direction of auditory motion was most reliable with activation patterns of the left and right planum temporale. Auditory motion direction could not be decoded from activation patterns in hMT/V5+. These findings provide further evidence for the planum temporale playing a central role in supporting auditory motion perception. In addition, our findings suggest a cross-modal transfer of directional information to high-level visual cortex in healthy humans.
Affiliation(s)
- Arjen Alink
- Department of Neurophysiology, Max Planck Institute for Brain Research, D-60528 Frankfurt am Main, Germany.
|
24
|
Getzmann S, Lewald J. The effect of spatial adaptation on auditory motion processing. Hear Res 2011; 272:21-9. DOI: 10.1016/j.heares.2010.11.005.
|
25
|
Smith KR, Hsieh IH, Saberi K, Hickok G. Auditory spatial and object processing in the human planum temporale: no evidence for selectivity. J Cogn Neurosci 2010; 22:632-9. PMID: 19301992. DOI: 10.1162/jocn.2009.21196.
Abstract
Although it is generally acknowledged that at least two processing streams exist in the primate cortical auditory system, the function of the posterior dorsal stream is a topic of much debate. Recent studies have reported selective activation to auditory spatial change in portions of the human planum temporale (PT) relative to nonspatial stimuli such as pitch changes or complex acoustic patterns. However, previous work has suggested that the PT may be sensitive to another kind of nonspatial variable, namely, the number of auditory objects simultaneously presented in the acoustic signal. The goal of the present fMRI experiment was to assess whether any portion of the PT showed spatial selectivity relative to manipulations of the number of auditory objects presented. Spatially sensitive regions in the PT were defined by comparing activity associated with listening to an auditory object (speech from a single talker) that changed location with one that remained stationary. Activity within these regions was then examined during a nonspatial manipulation: increasing the number of objects (talkers) from one to three. The nonspatial manipulation modulated activity within the "spatial" PT regions. No region within the PT was found to be selective for spatial or object processing. We suggest that previously documented spatial sensitivity in the PT reflects auditory source separation using spatial cues rather than spatial processing per se.
Affiliation(s)
- Kevin R Smith
- University of California, Irvine, Irvine, CA 92697, USA
|
26
|
Getzmann S, Lewald J. Shared Cortical Systems for Processing of Horizontal and Vertical Sound Motion. J Neurophysiol 2010; 103:1896-904. DOI: 10.1152/jn.00333.2009.
Abstract
Cortical processing of horizontal and vertical sound motion in free-field space was investigated using high-density electroencephalography in combination with standardized low-resolution brain electromagnetic tomography (sLORETA). Eighteen subjects heard sound stimuli that, after an initial stationary phase in a central position, started to move centrifugally, either to the left, to the right, upward, or downward. The delayed onset of both horizontal and vertical motion elicited a specific motion-onset response (MOR), resulting in widely distributed activations, with prominent maxima in primary and nonprimary auditory cortices, insula, and parietal lobe. The comparison of MORs to horizontal and vertical motion orientations did not indicate any significant differences in latency or topography. Contrasting the sLORETA solutions for the two motion orientations revealed only marginal activation in postcentral gyrus. These data are consistent with the notion that azimuth and elevation components of dynamic auditory spatial information are processed in common, rather than separate, cortical substrates. Furthermore, the findings support the assumption that the MOR originates at a stage of auditory analysis after the different spatial cues (interaural and monaural spectral cues) have been integrated into a unified space code.
Affiliation(s)
- Stephan Getzmann
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Jörg Lewald
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Department of Cognitive Psychology, Ruhr University Bochum, Bochum, Germany
|
27
|
Chan JS, Simões-Franklin C, Garavan H, Newell FN. Static images of novel, moveable objects learned through touch activate visual area hMT+. Neuroimage 2010; 49:1708-16. DOI: 10.1016/j.neuroimage.2009.09.068.
|
28
|
Getzmann S, Lewald J. Effects of natural versus artificial spatial cues on electrophysiological correlates of auditory motion. Hear Res 2009; 259:44-54. PMID: 19800957. DOI: 10.1016/j.heares.2009.09.021.
Abstract
The effect of the type of auditory motion stimulus on neural correlates of motion processing was investigated using high-density electroencephalography. Sound motion was implemented by (a) gradual shifts in interaural time difference; (b) gradual shifts in interaural level difference; (c) motion of virtual 3D sound sources; or (d) successive activation of 45 loudspeakers along the horizontal plane. In a subset of trials, listeners (N=20) performed a two-alternative forced-choice motion discrimination task. Each trial began with a stationary phase of the acoustic stimulus in a central position, immediately followed by motion of the stimulus. The motion onset elicited a specific cortical response dominated by large negative and positive deflections, the so-called change-N1 and change-P2. The temporal dynamics of these components depended on the auditory motion cues presented: free-field motion and virtual 3D motion were associated with earlier cortical responses and shorter reaction times than shifts in interaural time or level. Free-field motion also elicited much stronger onset responses than simulated motion. These findings suggest that natural-like stimulation, using stimuli presented in the free sound field, allows more reliable conclusions about the neural processing of sound motion, whereas artificial motion stimuli, in particular gradual shifts in interaural time or level, seem less suited to this aim.
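Cue (a), a gradual interaural time shift, is straightforward to simulate in software. The sketch below builds a stereo noise burst whose ITD sweeps linearly, producing an impression of horizontal motion over headphones; the sample rate, ITD range, and fractional-delay-by-interpolation method are illustrative assumptions, not the stimulus generation actually used in the study.

```python
import numpy as np

def moving_itd_noise(dur=1.0, fs=44100, itd_start=-5e-4, itd_end=5e-4, seed=0):
    """Stereo white-noise burst whose interaural time difference sweeps
    linearly from itd_start to itd_end (in seconds), simulating smooth
    horizontal motion when played over headphones."""
    n = int(dur * fs)
    mono = np.random.default_rng(seed).standard_normal(n)
    t = np.arange(n, dtype=float)
    itd = np.linspace(itd_start, itd_end, n)  # instantaneous ITD per sample
    # Delay the right channel by the instantaneous ITD using linear
    # interpolation (a simple fractional-delay approximation).
    right = np.interp(t - itd * fs, t, mono)
    return np.column_stack([mono, right])

sig = moving_itd_noise(dur=0.1)  # 0.1 s burst, ITD sweeping -500 to +500 microseconds
```

Per the abstract's conclusion, such purely interaural stimuli are expected to evoke weaker motion-onset responses than comparable free-field motion.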
Affiliation(s)
- Stephan Getzmann
- Leibniz Research Centre for Working Environment and Human Factors, D-44139 Dortmund, Germany.
|
29
|
Kawabe T. Sequential Stream Segregation Affects Localisation of Diotic Tones among Tones with Time-Varying Interaural Time Difference. Perception 2009; 38:1377-85. DOI: 10.1068/p6369.
Abstract
In this study, I examined how sequential stream segregation contributes to the detection of diotic tones among tones with time-varying interaural time differences (ITDs). Target (T) and distractor (D) tones, and a silent duration (–), formed a sequence (DTD–) that was presented repeatedly. A frequency difference was introduced between target and distractor tones. The distractor tones were also given time-varying ITDs to produce a percept of smooth auditory motion along the interaural axis. In half of the trials, the target tones were not given time-varying ITDs and thus were presented diotically. The listeners' task was to determine whether the repeated DTD– sequences contained target tones without motion. Sensitivity (d′) for the detection of diotic target tones was higher with larger frequency differences, whereas the criterion (c) was lower with larger frequency differences. In another session, I confirmed that the proportion of "two streams" reports correlated positively with d′ and negatively with c. The results indicate that the localisation of a sound image can be influenced by sequential stream segregation in complex sound environments.
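The sensitivity and criterion measures reported here follow the standard signal-detection definitions, d′ = z(hit rate) − z(false-alarm rate) and c = −[z(hit rate) + z(false-alarm rate)]/2, where z is the inverse standard-normal CDF. A minimal sketch (the function name is illustrative, not from the paper):

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Signal-detection sensitivity d' and criterion c from a hit rate
    and a false-alarm rate (both strictly between 0 and 1)."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

d, c = dprime_and_criterion(0.8, 0.2)
```

With symmetric rates such as 0.8 and 0.2, c is zero (no response bias) while d′ is about 1.68, which is why d′ and c can move independently, as in the frequency-difference effects described above.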
Affiliation(s)
- Takahiro Kawabe
- Kyushu University, 6-19-1 Hakozaki, Higashi-ku, Fukuoka 812-8581, Japan
|
30
|
Zvyagintsev M, Nikolaev AR, Thönnessen H, Sachs O, Dammers J, Mathiak K. Spatially congruent visual motion modulates activity of the primary auditory cortex. Exp Brain Res 2009; 198:391-402. PMID: 19449155. DOI: 10.1007/s00221-009-1830-5.
Abstract
We investigated the brain responses to the transitions from the static to moving audiovisual stimuli using magnetoencephalography. The spatially congruent auditory and visual stimuli moved in the same direction whereas the incongruent stimuli moved in the opposite directions. Using dipole modeling we found that the static-to-moving transitions evoked a neural response in the primary auditory cortex bilaterally. The response started about 100 ms after the motion onset from a negative component (mvN1) and lasted during the entire interval of the stimulus motion. The mvN1 component was similar to the classical auditory N1 response to the static sound, but had smaller amplitude and later latency. The coordinates of the mvN1 and N1 dipoles in the primary auditory cortex were also similar. The amplitude of the auditory response to the moving stimuli appears to be sensitive to spatial congruency of the audiovisual motion; it was larger in the incongruent than congruent condition. This is evidence that the moving visual stimuli modulate the early sensory activity in the primary auditory cortex. Such early audiovisual integration may be specific for motion processing.
Affiliation(s)
- Mikhail Zvyagintsev
- Department of Psychiatry and Psychotherapy, University Hospital Aachen, RWTH Aachen University, 52074 Aachen, Germany.
|
31
|
Zimmer U, Macaluso E. Interaural temporal and coherence cues jointly contribute to successful sound movement perception and activation of parietal cortex. Neuroimage 2009; 46:1200-8. PMID: 19303934. DOI: 10.1016/j.neuroimage.2009.03.022.
Abstract
The perception of movement in the auditory modality requires dynamic changes in the input that reaches the two ears (e.g., sequential changes of interaural time differences; dynamic ITDs). However, it is still unclear to what extent these temporal cues interact with other interaural cues to determine successful movement perception, and which brain regions are involved in sound movement processing. Here, we presented trains of white-noise bursts containing either static or dynamic ITDs, and we varied parametrically the level of binaural coherence (BC) of both types of stimuli. Behaviorally, we found that movement discrimination sensitivity decreased with decreasing levels of BC. fMRI analyses highlighted a network of temporal, frontal and parietal regions where activity decreased with decreasing BC. Critically, in the intra-parietal sulcus and the supra-marginal gyrus, brain activity decreased with decreasing BC, but only for dynamic-ITD sounds (BC by ITD interaction). Thus, these regions activated selectively when the sounds contained both dynamic ITDs and high levels of BC, i.e., when subjects perceived sound movement. We conclude that sound movement perception requires both dynamic changes of the auditory input and effective sound-source localization, and that parietal cortex utilizes interaural temporal and coherence cues for the successful perception of sound movement.
Affiliation(s)
- U Zimmer
- NeuroImaging Laboratory, Santa Lucia Foundation, Italy.
|
32
|
Lewald J, Peters S, Corballis MC, Hausmann M. Perception of stationary and moving sound following unilateral cortectomy. Neuropsychologia 2009; 47:962-71. DOI: 10.1016/j.neuropsychologia.2008.10.016.
|
33
|
Deas RW, Roach NW, McGraw PV. Distortions of perceived auditory and visual space following adaptation to motion. Exp Brain Res 2008; 191:473-85. PMID: 18726589. DOI: 10.1007/s00221-008-1543-1.
Abstract
Adaptation to visual motion can induce marked distortions of the perceived spatial location of subsequently viewed stationary objects. These positional shifts are direction specific and exhibit tuning for the speed of the adapting stimulus. In this study, we sought to establish whether comparable motion-induced distortions of space can be induced in the auditory domain. Using individually measured head related transfer functions (HRTFs) we created auditory stimuli that moved either leftward or rightward in the horizontal plane. Participants adapted to unidirectional auditory motion presented at a range of speeds and then judged the spatial location of a brief stationary test stimulus. All participants displayed direction-dependent and speed-tuned shifts in perceived auditory position relative to a 'no adaptation' baseline measure. To permit direct comparison between effects in different sensory domains, measurements of visual motion-induced distortions of perceived position were also made using stimuli equated in positional sensitivity for each participant. Both the overall magnitude of the observed positional shifts, and the nature of their tuning with respect to adaptor speed were similar in each case. A third experiment was carried out where participants adapted to visual motion prior to making auditory position judgements. Similar to the previous experiments, shifts in the direction opposite to that of the adapting motion were observed. These results add to a growing body of evidence suggesting that the neural mechanisms that encode visual and auditory motion are more similar than previously thought.
Affiliation(s)
- Ross W Deas
- Visual Neuroscience Group, School of Psychology, The University of Nottingham, University Park, Nottingham, UK.
|
34
|
Arnott SR, Cant JS, Dutton GN, Goodale MA. Crinkling and crumpling: an auditory fMRI study of material properties. Neuroimage 2008; 43:368-78. PMID: 18718543. DOI: 10.1016/j.neuroimage.2008.07.033.
Abstract
Knowledge of an object's material composition (i.e., what it is made of) alters how we interact with that object. Seeing the bright glint or hearing the metallic crinkle of a foil plate for example, confers information about that object before we have even touched it. Recent research indicates that the medial aspect of the ventral visual pathway is sensitive to the surface properties of objects. In the present functional magnetic resonance imaging (fMRI) study, we investigated whether the ventral pathway is also sensitive to material properties derived from sound alone. Relative to scrambled material sounds and non-verbal human vocalizations, audio recordings of materials being manipulated (i.e., crumpled) in someone's hands elicited greater BOLD activity in the right parahippocampal cortex of neurologically intact listeners, as well as a cortically blind participant. Additional left inferior parietal lobe activity was also observed in the neurologically intact group. Taken together, these results support a ventro-medial pathway that is specialized for processing the material properties of objects, and suggest that there are sub-regions within this pathway that subserve the processing of acoustically-derived information about material composition.
Affiliation(s)
- Stephen R Arnott
- CIHR Group for Action and Perception, Department of Psychology, University of Western Ontario, London, Ontario, Canada.
|