1. Are auditory cues special? Evidence from cross-modal distractor-induced blindness. Atten Percept Psychophys 2022; 85:889-904. [PMID: 35902451; PMCID: PMC10066119; DOI: 10.3758/s13414-022-02540-0]
Abstract
A target that shares features with preceding distractor stimuli is less likely to be detected due to a distractor-driven activation of a negative attentional set. This transient impairment in perceiving the target (distractor-induced blindness/deafness) can be found within vision and audition. Recently, the phenomenon was observed in a cross-modal setting involving an auditory target and additional task-relevant visual information (cross-modal distractor-induced deafness). In the current study, consisting of three behavioral experiments, a visual target, indicated by an auditory cue, had to be detected despite the presence of visual distractors. Multiple distractors consistently led to reduced target detection if cue and target appeared in close temporal proximity, confirming cross-modal distractor-induced blindness. However, the effect on target detection was reduced compared to the effect of cross-modal distractor-induced deafness previously observed for reversed modalities. The physical features defining cue and target could not account for the diminished distractor effect in the current cross-modal task. Instead, this finding may be attributed to the auditory cue acting as an especially efficient release signal of the distractor-induced inhibition. Additionally, a multisensory enhancement of visual target detection by the concurrent auditory signal might have contributed to the reduced distractor effect.
2. Buss E, Lorenzi C, Cabrera L, Leibold LJ, Grose JH. Amplitude modulation detection and modulation masking in school-age children and adults. J Acoust Soc Am 2019; 145:2565. [PMID: 31046373; PMCID: PMC6909994; DOI: 10.1121/1.5098950]
Abstract
Two experiments were performed to better understand on- and off-frequency modulation masking in normal-hearing school-age children and adults. Experiment 1 estimated thresholds for detecting 16-, 64- or 256-Hz sinusoidal amplitude modulation (AM) imposed on a 4300-Hz pure tone. Thresholds tended to improve with age, with larger developmental effects for 64- and 256-Hz AM than 16-Hz AM. Detection of 16-Hz AM was also measured with a 1000-Hz off-frequency masker tone carrying 16-Hz AM. Off-frequency modulation masking was larger for younger than older children and adults when the masker was gated with the target, but not when the masker was continuous. Experiment 2 measured detection of 16- or 64-Hz sinusoidal AM carried on a bandpass noise with and without additional on-frequency masker AM. Children and adults demonstrated modulation masking with similar tuning to modulation rate. Rate-dependent age effects for AM detection on a pure-tone carrier are consistent with maturation of temporal resolution, an effect that may be obscured by modulation masking for noise carriers. Children were more susceptible than adults to off-frequency modulation masking for gated stimuli, consistent with maturation in the ability to listen selectively in frequency, but the children were not more susceptible to on-frequency modulation masking than adults.
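The sinusoidal amplitude-modulated tone stimuli described above can be sketched as follows. This is a minimal illustration of the stimulus type, not the authors' actual stimulus code; the duration, sampling rate, and modulation depth are assumed values.

```python
import numpy as np

def am_tone(carrier_hz=4300.0, mod_hz=16.0, depth=1.0,
            dur_s=0.5, fs=48000):
    """Sinusoidal AM imposed on a pure-tone carrier.

    depth is the modulation index m in [0, 1]; depth=0 gives an
    unmodulated pure tone.
    """
    t = np.arange(int(dur_s * fs)) / fs
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # (1 + m*sin(2*pi*fm*t)) envelope applied to the carrier
    envelope = 1.0 + depth * np.sin(2 * np.pi * mod_hz * t)
    # Normalize so the peak amplitude stays within [-1, 1]
    return envelope * carrier / (1.0 + depth)

# Example: a 16-Hz AM tone like the target stimulus in Experiment 1
signal = am_tone(carrier_hz=4300.0, mod_hz=16.0)
```

Raising the modulation rate (e.g., to 64 or 256 Hz) changes only the envelope frequency, which is what makes the rate-dependent thresholds in Experiment 1 a probe of temporal resolution.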
Affiliation(s)
- Emily Buss
- Department of Otolaryngology/Head and Neck Surgery, School of Medicine, University of North Carolina, Chapel Hill, North Carolina 27599-7070, USA
- Christian Lorenzi
- Laboratoire des Systèmes Perceptifs, Département d'Études Cognitives, École Normale Supérieure, Université Paris Sciences et Lettres, Centre National de la Recherche Scientifique, Paris, France
- Laurianne Cabrera
- Laboratoire de Psychologie de la Perception, Université Paris Descartes, Centre National de la Recherche Scientifique, Paris, France
- Lori J Leibold
- Center for Hearing Research, Boys Town National Research Hospital, Omaha, Nebraska 68131, USA
- John H Grose
- Department of Otolaryngology/Head and Neck Surgery, School of Medicine, University of North Carolina, Chapel Hill, North Carolina 27599-7070, USA
3. Günel B, Thiel CM, Hildebrandt KJ. Effects of Exogenous Auditory Attention on Temporal and Spectral Resolution. Front Psychol 2018; 9:1984. [PMID: 30405479; PMCID: PMC6206225; DOI: 10.3389/fpsyg.2018.01984]
Abstract
Previous research in the visual domain suggests that exogenous attention in the form of peripheral cueing increases spatial but lowers temporal resolution. It is unclear whether this effect transfers to other sensory modalities. Here, we tested the effects of exogenous attention on temporal and spectral resolution in the auditory domain. Eighteen young, normal-hearing adults were tested in both gap and frequency change detection tasks with exogenous cuing. Benefits of valid cuing were only present in the gap detection task, while costs of invalid cuing were observed in both tasks. Our results suggest that exogenous attention in the auditory system improves temporal resolution without compromising spectral resolution.
Affiliation(s)
- Basak Günel
- Department of Psychology, University of Oldenburg, Oldenburg, Germany
- Christiane M Thiel
- Department of Psychology, University of Oldenburg, Oldenburg, Germany; Cluster of Excellence Hearing4all, University of Oldenburg, Oldenburg, Germany
- K Jannis Hildebrandt
- Cluster of Excellence Hearing4all, University of Oldenburg, Oldenburg, Germany; Department of Neuroscience, University of Oldenburg, Oldenburg, Germany
4.
Abstract
Mixed results have been found for the impact of auditory information presented during high-perceptual-load visual search tasks, with some studies showing large effects and others indicating inattentional deafness, with such stimuli going largely undetected. In three experiments, we demonstrated that task relatedness is a key factor in whether extraneous auditory stimuli impact high-load visual searches. Experiment 1 addressed a methodological concern (e.g., Lavie Trends in Cognitive Sciences, 9, 75-82, 2005) regarding the timing of the relative onsets and offsets of task-related, to-be-ignored auditory stimuli and visual search arrays in experiments that have shown auditory distractor effects. Robust auditory distractor effects were found in each timing condition, and no inattentional deafness for high-load searches. Experiments 2 and 3 demonstrated that the relationship between the auditory stimuli and visual targets determined whether attention was captured and whether the response times to identify targets were impacted. Auditory stimuli that named a response-specific category influenced responses to targets mapped exclusively to one response, but not to the same targets mapped nonexclusively. These compatibility effects were larger if the distractors named an actual target item than if they named the category to which the item belonged. This pattern suggests that to-be-ignored auditory information that closely relates to a visual target search task influences the processing of that task, particularly in a high-perceptual-load search.
5. Murphy S, Spence C, Dalton P. Auditory perceptual load: A review. Hear Res 2017; 352:40-48. [DOI: 10.1016/j.heares.2017.02.005]
6.
Abstract
It is now well established that the visual attention system is shaped by reward learning. When visual features are associated with a reward outcome, they acquire high priority and can automatically capture visual attention. To date, evidence for value-driven attentional capture has been limited entirely to the visual system. In the present study, I demonstrate that previously reward-associated sounds also capture attention, interfering more strongly with the performance of a visual task. This finding suggests that value-driven attention reflects a broad principle of information processing that can be extended to other sensory modalities and that value-driven attention can bias cross-modal stimulus competition.
7. Gamble ML, Woldorff MG. Rapid Context-based Identification of Target Sounds in an Auditory Scene. J Cogn Neurosci 2015; 27:1675-84. [PMID: 25848684; DOI: 10.1162/jocn_a_00814]
Abstract
To make sense of our dynamic and complex auditory environment, we must be able to parse the sensory input into usable parts and pick out relevant sounds from all the potentially distracting auditory information. Although it is unclear exactly how we accomplish this difficult task, Gamble and Woldorff [Gamble, M. L., & Woldorff, M. G. The temporal cascade of neural processes underlying target detection and attentional processing during auditory search. Cerebral Cortex (New York, N.Y.: 1991), 2014] recently reported an ERP study of an auditory target-search task in a temporally and spatially distributed, rapidly presented auditory scene. They reported an early, differential, bilateral activation (beginning at 60 msec) between feature-deviating target stimuli and physically equivalent feature-deviating nontargets, reflecting a rapid target-detection process. This was followed shortly thereafter (at 130 msec) by the lateralized N2ac ERP activation, which reflects the focusing of auditory spatial attention toward the target sound and parallels the attentional-shifting processes widely studied in vision. Here we directly examined the early, bilateral, target-selective effect to better understand its nature and functional role. Participants listened to midline-presented sounds that included target and nontarget stimuli that were randomly either embedded in a brief rapid stream or presented alone. The results indicate that this early bilateral effect derives from a template for the target that exploits its feature deviancy within a stream to enable rapid identification. Moreover, an individual-differences analysis showed that the size of this effect was larger for participants with faster RTs. The findings support the hypothesis that our auditory attentional system can implement and utilize a context-based relational template for a target sound, making use of additional auditory information in the environment when a relevant sound must be detected rapidly.
8. Gamble ML, Woldorff MG. The Temporal Cascade of Neural Processes Underlying Target Detection and Attentional Processing During Auditory Search. Cereb Cortex 2014; 25:2456-65. [PMID: 24711486; DOI: 10.1093/cercor/bhu047]
Abstract
The posterior visual event-related potential (ERP) component, the N2pc, has been widely used to study lateralized shifts of attention within visual arrays. Recently, Gamble and Luck (2011) reported an auditory analog of this activity (the fronto-central "N2ac"), reflecting the lateralized focusing of attention toward a Target sound among 2 simultaneous auditory stimuli. Here, we directed an electrophysiological approach toward understanding auditory Target search within a more complex auditory environment in which rapidly occurring sounds were distributed across both time and space. Trials consisted of ten 40-ms monaural sounds rapidly presented to the 2 ears: 8 medium-pitch tones and 2 deviant sounds (one high and one low). For each block, one deviant type was designated as the Target, which participants needed to identify within each trial to discriminate its tonal quality. The extracted electrophysiological results included a very early enhancement, starting at approximately 50 ms, of a bilateral negative-polarity auditory brain response to the designated Target Deviant (compared with the Nontarget Deviant), followed at approximately 130 ms by the N2ac activity reflecting the lateralized focusing of attention toward that Target. The results delineate the tightly orchestrated sequence of neural processes underlying the detection of, and focusing of attention toward, Target sounds in complex auditory scenes.
Affiliation(s)
- Marissa L Gamble
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
- Marty G Woldorff
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Department of Psychiatry, Duke University, Durham, NC, USA
9. Auditory attentional capture: implicit and explicit approaches. Psychol Res 2014; 78:313-20. [PMID: 24643575; DOI: 10.1007/s00426-014-0557-5]
Abstract
The extent to which distracting items capture attention despite being irrelevant to the task at hand can be measured either implicitly or explicitly (e.g., Simons, Trends Cogn Sci 4:147-155, 2000). Implicit approaches include the standard attentional capture paradigm in which distraction is measured in terms of reaction time and/or accuracy costs within a focal task in the presence (vs. absence) of a task-irrelevant distractor. Explicit measures include the inattention paradigm in which people are asked directly about their noticing of an unexpected task-irrelevant item. Although the processes of attentional capture have been studied extensively using both approaches in the visual domain, there is much less research on similar processes as they may operate within audition, and the research that does exist in the auditory domain has tended to focus exclusively on either an explicit or an implicit approach. This paper provides an overview of recent research on auditory attentional capture, integrating the key conclusions that may be drawn from both methodological approaches.
10. Murphy S, Fraenkel N, Dalton P. Perceptual load does not modulate auditory distractor processing. Cognition 2013; 129:345-55. [DOI: 10.1016/j.cognition.2013.07.014]
11.
Abstract
This paper seeks to reduce the role of the homunculus, the 'little man in the head' that is still prominent in most psychological theories regarding the control of our behaviour. We argue that once a person is engaged in a task (which is a volitional act), visual selection runs off more or less automatically. We argue that the salience map that drives automatic selection is determined not only by the raw physical salience of the objects in the environment but also by the way these objects appear to the person. We provide evidence that priming (feature priming, priming by working memory, and reward priming) sharpens the cortical representation of these objects such that they appear more salient, above and beyond their physical salience. We demonstrate that this type of priming is not under volitional control: it occurs even if observers try to volitionally prepare for something else. In other words, looking at red prepares our brain for things that are red, even if we volitionally try to prepare for green.
Affiliation(s)
- Jan Theeuwes
- Department of Cognitive Psychology, Vrije Universiteit, Van der Boechorststraat 1, 1081 BT Amsterdam, Netherlands.
12. Garrido MI, Dolan RJ, Sahani M. Surprise leads to noisier perceptual decisions. Iperception 2011; 2:112-20. [PMID: 23145228; PMCID: PMC3485781; DOI: 10.1068/i0411]
Abstract
Surprising events in the environment can impair task performance. This might be due to complete distraction, leading to lapses during which performance is reduced to guessing. Alternatively, unpredictability might cause a graded withdrawal of perceptual resources from the task at hand and thereby reduce sensitivity. Here we attempt to distinguish between these two mechanisms. Listeners performed a novel auditory pitch-duration discrimination task in which stimulus loudness changed occasionally and incidentally. Responses were slower and less accurate in the surprising condition, where loudness changed unpredictably, than in the predictable condition, where loudness was held constant. By explicitly modelling both lapses and changes in sensitivity, we found that unpredictable changes diminished sensitivity but did not increase the rate of lapses. These findings suggest that background environmental uncertainty can disrupt goal-directed behaviour. This graded processing strategy might be adaptive in potentially threatening contexts, and may reflect a flexible system for the automatic allocation of perceptual resources.
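The lapse-versus-sensitivity distinction drawn above can be illustrated with a simple signal-detection sketch. This is a hypothetical parameterization, not the authors' actual model: the names `lam` (lapse rate) and `d_prime` (sensitivity), and the two-alternative task structure, are illustrative assumptions.

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_correct(d_prime, lam):
    """Probability correct in a 2AFC task with lapses.

    With probability lam the listener lapses and guesses (p = .5);
    otherwise accuracy follows a standard 2AFC signal-detection
    model with sensitivity d_prime.
    """
    return lam * 0.5 + (1.0 - lam) * phi(d_prime / math.sqrt(2.0))
```

The two mechanisms leave different fingerprints: lowering `d_prime` shifts accuracy at all difficulty levels, whereas raising `lam` caps asymptotic accuracy at 1 - lam/2 even for very easy stimuli. Fitting both parameters is what lets the data attribute the surprise cost to reduced sensitivity rather than increased lapsing.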
Affiliation(s)
- Marta I Garrido
- Wellcome Trust Centre for Neuroimaging, University College London, London WC1N 3BG, England
13. Theeuwes J. Top-down and bottom-up control of visual selection. Acta Psychol (Amst) 2010; 135:77-99. [PMID: 20507828; DOI: 10.1016/j.actpsy.2010.02.006]
Abstract
The present paper argues for the notion that when attention is spread across the visual field in the first sweep of information through the brain, visual selection is completely stimulus-driven. Only later in time, through recurrent feedback processing, does volitional control based on expectancy and goal set bias visual selection in a top-down manner. Here we review behavioral evidence as well as evidence from ERP, fMRI, TMS, and single-cell recording consistent with stimulus-driven selection. Alternative viewpoints that assume a large role for top-down processing are discussed. It is argued that in most cases, evidence supporting top-down control of visual selection in fact demonstrates top-down control of processes occurring later in time, following initial selection. We conclude that top-down knowledge regarding non-spatial features of objects cannot alter the initial selection priority. Only by adjusting the size of the attentional window can the initial sweep of information through the brain be altered in a top-down way.