1. Fan ZT, Zhao ZH, Sharma M, Valderrama JT, Fu QJ, Liu JX, Fu X, Li H, Zhao XL, Guo XY, Fu LY, Wang NY, Zhang J. Acoustic Change Complex Evoked by Horizontal Sound Location Change in Young Adults With Normal Hearing. Front Neurosci 2022; 16:908989. PMID: 35733932; PMCID: PMC9207405; DOI: 10.3389/fnins.2022.908989.
Abstract
The acoustic change complex (ACC) is a cortical auditory evoked potential elicited by a change within a continuous sound stimulus. This study aimed to explore (1) whether a change in horizontal sound location can elicit the ACC; (2) the relationship between the size of the location change and ACC amplitude and latency; and (3) the relationship between the behavioral measure of localization, the minimum audible angle (MAA), and the ACC. A total of 36 normal-hearing adults participated. A 180° horizontal arc-shaped bracket with a 1.2 m radius was set up in a sound field, with participants seated at its center. MAA was measured in a two-alternative forced-choice paradigm. Electroencephalographic recordings of the ACC were made with the location changed at four position pairs: ±45°, ±15°, ±5°, and ±2°. The stimulus was a 125–6,000 Hz broadband noise of 1 s duration at 60 ± 2 dB SPL, with a 2 s interstimulus interval. N1′–P2′ amplitudes and N1′ and P2′ latencies were evaluated at the four positions, and the influence of electrode site and direction of location change on the ACC waveform was analyzed with analysis of variance. Results showed that (1) the ACC can be elicited successfully by a change in horizontal sound location, and its elicitation rate increased with larger location changes; (2) N1′–P2′ amplitude increased and N1′ and P2′ latencies decreased as the location change increased, with statistically significant effects of test angle on N1′–P2′ amplitude [F(1.91,238.1) = 97.172, p < 0.001], N1′ latency [F(1.78,221.90) = 96.96, p < 0.001], and P2′ latency [F(1.87,233.11) = 79.97, p < 0.001]; (3) the direction of the location change had no significant effect on any ACC peak amplitude or latency; and (4) the location discrimination threshold from the ACC test (97.0% elicitation rate at ±5°) was higher than the MAA threshold (2.08 ± 0.5°).
Although ACC thresholds are higher than behavioral MAA thresholds, the ACC can be used as an objective method to evaluate sound localization ability. The article discusses implications for clinical practice and for the evaluation of localization skills, especially in children.
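The N1′–P2′ measure reported above is simply the peak-to-peak distance between the negative and positive deflections of the averaged change response. A minimal sketch in Python (the function name and window bounds are illustrative assumptions, not the paper's exact analysis parameters):

```python
import numpy as np

def n1_p2_amplitude(epoch, fs, n1_win=(0.08, 0.16), p2_win=(0.15, 0.30)):
    """Peak-to-peak N1'-P2' amplitude of an averaged ACC epoch.

    epoch -- 1-D averaged EEG trace, time-locked to the location change
    fs    -- sampling rate in Hz
    The window bounds (in seconds) are illustrative defaults, not the
    study's exact analysis windows.
    """
    t = np.arange(epoch.size) / fs
    n1 = epoch[(t >= n1_win[0]) & (t <= n1_win[1])].min()  # negative peak
    p2 = epoch[(t >= p2_win[0]) & (t <= p2_win[1])].max()  # positive peak
    return p2 - n1
```

Larger location changes would then show up as larger values of this difference, in line with the amplitude effects the study reports.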
Affiliation(s)
- Zhi-Tong Fan
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Zi-Hui Zhao
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Mridula Sharma
- Department of Linguistics, Faculty of Human Sciences, Macquarie University, Sydney, NSW, Australia
- Joaquin T. Valderrama
- Department of Linguistics, Faculty of Human Sciences, Macquarie University, Sydney, NSW, Australia
- National Acoustic Laboratories, Sydney, NSW, Australia
- Qian-Jie Fu
- Department of Head and Neck Surgery, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, United States
- Jia-Xing Liu
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Xin Fu
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Huan Li
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Xue-Lei Zhao
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Xin-Yu Guo
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Luo-Yi Fu
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Ning-Yu Wang
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Juan Zhang
- Department of Otolaryngology Head and Neck Surgery, Beijing Chaoyang Hospital, Capital Medical University, Beijing, China
- Correspondence: Juan Zhang
2. Altmann CF, Yamasaki D, Song Y, Bucher B. Processing of self-initiated sound motion in the human brain. Brain Res 2021; 1762:147433. PMID: 33737062; DOI: 10.1016/j.brainres.2021.147433.
Abstract
Interacting with objects in our environment usually produces audible sound. Brain responses to such self-initiated sounds are attenuated, in particular the so-called N1 component measured with electroencephalography (EEG). This attenuation has been attributed to an internal forward model that cancels the sensory consequences of a motor command. In the current study, we asked whether attenuation due to self-initiation of a sound also affects a later event-related potential, the so-called motion-onset response, which arises in response to moving sounds. To this end, volunteers moved their index finger either left or right, which resulted in virtual movement of a sound to the left or to the right. In Experiment 1, sound motion was induced over in-ear headphones by shifting interaural time and intensity differences, thereby shifting the intracranial sound image. We compared motion-onset responses in two conditions: (a) congruent, in which the sound image moved in the direction of the finger movement, and (b) incongruent, in which it moved in the opposite direction. Clear motion-onset responses, with a negative cN1 component peaking at about 160 ms and a positive cP2 component peaking at about 230 ms after motion onset, were obtained in both conditions, but they did not differ significantly in amplitude or latency. In Experiment 2, in which sounds were presented over loudspeakers, we observed attenuation for self-induced versus externally triggered sound motion onset, but again no difference between congruent and incongruent conditions. Together, these two experiments suggest that the motion-onset response measured with EEG can be attenuated for self-generated sounds, but this attenuation did not depend on the congruency of action and sound motion direction.
Affiliation(s)
- Christian F Altmann
- Human Brain Research Center, Graduate School of Medicine, Kyoto University, Kyoto 606-8507, Japan; Parkinson-Klinik Ortenau, 77709 Wolfach, Germany
- Daiki Yamasaki
- Department of Psychology, Graduate School of Letters, Kyoto University, Kyoto 606-8501, Japan; Japan Society for the Promotion of Science, Tokyo 102-0083, Japan
- Yunqing Song
- Human Brain Research Center, Graduate School of Medicine, Kyoto University, Kyoto 606-8507, Japan
- Benoit Bucher
- Department of Psychology, Graduate School of Letters, Kyoto University, Kyoto 606-8501, Japan
3. St George BV, Cone B. Perceptual and Electrophysiological Correlates of Fixed Versus Moving Sound Source Lateralization. J Speech Lang Hear Res 2020; 63:3176-3194. PMID: 32812839; DOI: 10.1044/2020_jslhr-19-00289.
Abstract
Purpose: The aims of the study were (a) to evaluate the effects of systematically varied stimulus duration, interaural level difference (ILD), and direction on perceptual and electrophysiological metrics of lateralization for fixed versus moving targets, and (b) to evaluate the hemispheric activity underlying perception of fixed versus moving auditory targets.
Method: Twelve normal-hearing, young adult listeners were evaluated using perceptual and P300 tests of lateralization. Both tests used stimuli that varied in type (fixed or moving), direction (right or left), duration (100 or 500 ms), and ILD magnitude (9 or 18 dB). Listeners provided laterality judgments and stimulus-type discrimination (fixed vs. moving) judgments for all combinations of acoustic factors. During P300 recordings, listeners discriminated between left- and right-directed targets as the other acoustic parameters were varied.
Results: ILD magnitude and stimulus type had statistically significant effects on laterality ratings, with larger ILDs and fixed targets producing greater lateralization. Discriminability of fixed versus moving targets depended on stimulus duration and ILD magnitude. ILD magnitude was a significant predictor of P300 amplitude, and there was a statistically significant inverse relationship between perceived target velocity and P300 latency. Lateralized targets evoked contralateral hemispheric P300 activity, and a right-hemisphere enhancement was observed for fixed-type lateralized deviant stimuli.
Conclusions: Perceptual and P300 findings indicate that lateralization of auditory movement is highly dependent on temporal integration. Both the behavioral and physiological findings suggest that moving auditory targets with ecologically valid velocities are processed by the central auditory nervous system within a window of temporal integration greater than that for fixed targets. Furthermore, the findings support a left hemispatial perceptual bias and right-hemisphere dominance for spatial listening.
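The ILD manipulation in the Method can be illustrated by imposing a fixed level difference on a mono signal, split symmetrically across the two ears. The helper below is hypothetical (the study's exact stimulus-generation procedure is not given in the abstract):

```python
import numpy as np

def apply_ild(mono, ild_db, toward_right=True):
    """Return a (left, right) channel pair carrying the given interaural
    level difference, with half the ILD applied as gain to one ear and
    half as attenuation to the other. Hypothetical helper, not the
    study's actual stimulus code."""
    g = 10 ** (ild_db / 40)          # half the ILD (in dB) per ear
    left, right = mono / g, mono * g
    if not toward_right:
        left, right = right, left
    return left, right
```

For an 18 dB ILD the favored channel ends up exactly 18 dB above the other, matching the larger of the two ILD magnitudes used in the study.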
Affiliation(s)
- Barbara Cone
- Department of Speech, Language, and Hearing Sciences, The University of Arizona, Tucson
4. Neural tracking of auditory motion is reflected by delta phase and alpha power of EEG. Neuroimage 2018; 181:683-691. PMID: 30053517; DOI: 10.1016/j.neuroimage.2018.07.054.
Abstract
It is of increasing practical interest to be able to decode the spatial characteristics of an auditory scene from electrophysiological signals. However, the cortical representation of auditory space is not well characterized, and it is unclear how cortical activity reflects the time-varying location of a moving sound. Recently, we demonstrated that cortical response measures to discrete noise bursts can be decoded to determine their origin in space. Here we build on these findings to investigate the cortical representation of a continuously moving auditory stimulus using scalp-recorded electroencephalography (EEG). In a first experiment, subjects listened to pink noise, presented over headphones, that was spectro-temporally modified to be perceived as moving randomly along a semi-circular trajectory in the horizontal plane. While subjects listened to the stimuli, we recorded their EEG with a 128-channel acquisition system. The data were analysed by (1) building a linear regression model (decoder) mapping the relationship between the stimulus location and a training set of EEG data, and (2) using the decoder to reconstruct an estimate of the time-varying sound-source azimuth from the EEG data. The results showed that sound trajectory can be decoded with a reconstruction accuracy significantly above chance level. Specifically, we found that the phase of delta (<2 Hz) and the power of alpha (8-12 Hz) EEG track the dynamics of a moving auditory object. In a follow-up experiment, we replaced the noise with pulse-train stimuli containing only interaural level and time differences (ILDs and ITDs, respectively), allowing us to investigate whether trajectory decoding is sensitive to both acoustic cues. We found that the sound trajectory can be decoded for both ILD and ITD stimuli. Moreover, their neural signatures were similar and even allowed successful cross-cue classification, supporting the notion of integrated processing of ILD and ITD at the cortical level.
These results are particularly relevant for applications such as cognitively controlled hearing aids and for the evaluation of virtual acoustic environments.
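The two analysis steps described above, fitting a linear stimulus-reconstruction model and then reconstructing azimuth from EEG, can be sketched as a ridge-regularized least-squares decoder. This is a schematic stand-in under simplifying assumptions; the published analysis also uses time-lagged channel features and cross-validated train/test splits:

```python
import numpy as np

def fit_decoder(eeg, azimuth, lam=1.0):
    """Fit a ridge-regularized linear decoder mapping multichannel EEG
    (n_times x n_channels) to sound-source azimuth (n_times,).
    lam is the ridge penalty; an intercept column is appended."""
    X = np.c_[eeg, np.ones(len(eeg))]
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ azimuth)

def decode(eeg, w):
    """Reconstruct the time-varying azimuth from EEG with fitted weights."""
    return np.c_[eeg, np.ones(len(eeg))] @ w
```

Decoding accuracy would then be assessed, as in the paper, by comparing the reconstructed trajectory against the true one (e.g., by correlation) and testing against a chance-level distribution.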
5. Altmann CF, Ueda R, Bucher B, Furukawa S, Ono K, Kashino M, Mima T, Fukuyama H. Trading of dynamic interaural time and level difference cues and its effect on the auditory motion-onset response measured with electroencephalography. Neuroimage 2017; 159:185-194. DOI: 10.1016/j.neuroimage.2017.07.055.
6. Shestopalova L, Petropavlovskaia E, Vaitulevich S, Nikitin N. Hemispheric asymmetry of ERPs and MMNs evoked by slow, fast and abrupt auditory motion. Neuropsychologia 2016; 91:465-479. DOI: 10.1016/j.neuropsychologia.2016.09.011.
7. Carlile S, Leung J. The Perception of Auditory Motion. Trends Hear 2016; 20:2331216516644254. PMID: 27094029; PMCID: PMC4871213; DOI: 10.1177/2331216516644254.
Abstract
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms for exploring the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves rapidly deconvolving the relative changes in the locations of sound sources produced by rotations and translations of the head in space (self-motion) so as to enable the perception of actual source motion. The fact that we perceive our auditory world as stable despite almost continual head movement demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
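One of the acoustical cues such reviews survey, the interaural time difference (ITD), is commonly approximated with the Woodworth spherical-head model. The sketch below is the standard textbook formula with conventional values for head radius and speed of sound, not material from the review itself:

```python
import math

def woodworth_itd(azimuth_deg, head_radius=0.0875, c=343.0):
    """Woodworth spherical-head approximation of the interaural time
    difference (seconds) for a distant source. 0 deg is straight ahead;
    positive azimuths lean toward the right ear. head_radius in meters,
    c (speed of sound) in m/s."""
    theta = math.radians(azimuth_deg)
    return (head_radius / c) * (theta + math.sin(theta))
```

For a source at 90° azimuth this gives an ITD of roughly 650 µs, the familiar upper bound for an average adult head.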
Affiliation(s)
- Simon Carlile
- School of Medical Sciences, University of Sydney, NSW, Australia; Starkey Hearing Research Center, Berkeley, CA, USA
- Johahn Leung
- School of Medical Sciences, University of Sydney, NSW, Australia
8. Shestopalova L, Petropavlovskaia E, Vaitulevich S, Nikitin N. Contextual effects on preattentive processing of sound motion as revealed by spatial MMN. Int J Psychophysiol 2015; 96:49-56. DOI: 10.1016/j.ijpsycho.2015.02.021.
9. De Pascalis V, Varriale E, Fulco M, Fracasso F. Mental ability and information processing during discrimination of auditory motion patterns: Effects on P300 and mismatch negativity. Intelligence 2014. DOI: 10.1016/j.intell.2014.09.006.
10. Ahveninen J, Kopčo N, Jääskeläinen IP. Psychophysics and neuronal bases of sound localization in humans. Hear Res 2013; 307:86-97. PMID: 23886698; DOI: 10.1016/j.heares.2013.07.008.
Abstract
Localization of sound sources is a considerable computational challenge for the human brain. Whereas the visual system can process basic spatial information in parallel, the auditory system lacks a straightforward correspondence between external spatial locations and sensory receptive fields. Consequently, the question of how the different acoustic features supporting spatial hearing are represented in the central nervous system remains open. Functional neuroimaging studies in humans have provided evidence for a posterior auditory "where" pathway that encompasses non-primary auditory cortex areas, including the planum temporale (PT) and posterior superior temporal gyrus (STG), which are strongly activated by horizontal sound direction changes, distance changes, and movement. However, these areas are also activated by a wide variety of other stimulus features, posing a challenge for the interpretation that they are purely spatial. This review discusses behavioral and neuroimaging studies on sound localization and some of the competing models of the representation of auditory space in humans. This article is part of a Special Issue entitled Human Auditory Neuroimaging.
Affiliation(s)
- Jyrki Ahveninen
- Harvard Medical School - Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, Charlestown, MA, USA.
11. Discrimination of auditory motion patterns: The mismatch negativity study. Neuropsychologia 2012; 50:2720-2729. DOI: 10.1016/j.neuropsychologia.2012.07.043.
12. Ohoyama K, Motomura E, Inui K, Nishihara M, Otsuru N, Oi M, Kakigi R, Okada M. Memory-based pre-attentive auditory N1 elicited by sound movement. Neurosci Res 2012; 73:248-251. DOI: 10.1016/j.neures.2012.04.003.