1. Masters-Waage TC, Kinias Z, Argueta-Rivera J, Stewart D, Ivany R, King E, Hebl M. Social inattentional blindness to idea stealing in meetings. Sci Rep 2024; 14:8060. PMID: 38580682; PMCID: PMC10997580; DOI: 10.1038/s41598-024-56905-6.
Abstract
In a virtual reality social experiment, participants (N = 154) experienced being at the table during a decision-making meeting and identified the best solutions generated. During the meeting, one meeting participant repeated another participant's idea, presenting it as his own. Although this idea stealing was clearly visible and audible, only 30% of participants correctly identified who shared the idea first. Subsequent analyses suggest that the social environment affected this novel form of inattentional blindness. Although there was no experimental effect of team diversity on noticing, there was correlational evidence of an indirect effect of perceived team status on noticing via attentional engagement. In sum, this paper extends the inattentional blindness phenomenon to a realistic professional interaction and demonstrates how features of the social environment can reduce social inattention.
2. Callan DE, Fukada T, Dehais F, Ishii S. The role of brain-localized gamma and alpha oscillations in inattentional deafness: implications for understanding human attention. Front Hum Neurosci 2023; 17:1168108. PMID: 37305364; PMCID: PMC10248426; DOI: 10.3389/fnhum.2023.1168108.
Abstract
Introduction: How the attention system selectively focuses on perceptual and motor aspects of a specific task, while suppressing features of other tasks or objects in the environment, is of considerable interest to cognitive neuroscience. The goal of this experiment was to investigate the neural processes involved in selective attention and performance in multi-task situations. Several studies have suggested that attention-related gamma-band activity facilitates processing in task-specific modalities, while alpha-band activity inhibits processing in non-task-related modalities. However, investigations of inattentional deafness/blindness (the failure to perceive stimuli in a non-dominant task when the primary task is demanding) have yet to observe gamma-band activity.
Methods: This EEG experiment used an engaging whole-body perceptual-motor task combined with a secondary auditory detection task to investigate the neural correlates of inattentional deafness under naturalistic, immersive, high-workload conditions. Differences between hits and misses on the auditory detection task in the gamma (30-50 Hz) and alpha (8-12 Hz) frequency ranges were analyzed at the cortical source level using LORETA.
Results: Auditory task performance correlated with greater gamma-band activity for hits than misses, pre- and post-stimulus, in left auditory processing regions. Alpha-band activity was greater for misses than hits in right auditory processing regions, pre- and post-stimulus onset. These results are consistent with the facilitatory/inhibitory roles of gamma- and alpha-band activity in neural processing. Additional gamma- and alpha-band activity was found in frontal and parietal regions, thought to reflect attentional monitoring, selection, and switching processes.
Discussion: These results help to elucidate the roles of the gamma and alpha frequency bands in frontal and modality-specific regions involved in selective attention during multi-task immersive situations.
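The gamma (30-50 Hz) vs. alpha (8-12 Hz) band-power contrast described in this abstract can be illustrated in a few lines. This is only a sketch of the general band-power idea, not the authors' LORETA source-level pipeline; the sampling rate and the synthetic single-channel data are assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, fmin, fmax):
    """Mean Welch PSD of a single channel within [fmin, fmax] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)  # 1-second windows
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()

fs = 250                              # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(10 * fs)    # 10 s of synthetic single-channel "EEG"
alpha = band_power(eeg, fs, 8, 12)    # alpha band, as in the study
gamma = band_power(eeg, fs, 30, 50)   # gamma band, as in the study
```

In a real analysis these scalars would be computed per trial and per source region, then compared between hits and misses.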
Affiliation(s)
- Daniel E. Callan
  - Brain Information Communication Research Laboratory, Advanced Telecommunications Research Institute International, Kyoto, Japan
  - Institut Supérieur de l'Aéronautique et de l'Espace, University of Toulouse, Toulouse, France
- Takashi Fukada
  - Brain Information Communication Research Laboratory, Advanced Telecommunications Research Institute International, Kyoto, Japan
  - Graduate School of Informatics, Kyoto University, Kyoto, Japan
- Frédéric Dehais
  - Institut Supérieur de l'Aéronautique et de l'Espace, University of Toulouse, Toulouse, France
- Shin Ishii
  - Brain Information Communication Research Laboratory, Advanced Telecommunications Research Institute International, Kyoto, Japan
  - Graduate School of Informatics, Kyoto University, Kyoto, Japan
3. The unnoticed zoo: Inattentional deafness to animal sounds in music. Atten Percept Psychophys 2022; 85:1238-1252. PMID: 36008746; PMCID: PMC10167135; DOI: 10.3758/s13414-022-02553-9.
Abstract
Inattentional unawareness can potentially occur in several sensory domains but has mainly been described in visual paradigms ("inattentional blindness"; e.g., Simons & Chabris, 1999, Perception, 28, 1059-1074). Dalton and Fraenkel (2012, Cognition, 124, 367-372) introduced "inattentional deafness" by showing that 70% of participants missed a voice repeatedly saying "I'm a gorilla" while focusing on a primary conversation. The present study extends this finding within the acoustic domain in several ways. First, we broadened the validity perspective by using 10 acoustic samples, specifically excerpts of popular musical pieces from different genres. Second, we used animal sounds as the secondary acoustic signal; these originate from a completely different acoustic domain and are therefore highly distinct from the primary sound. Participants' task was to count particular musical features. Results (N = 37) showed that the frequency of missed animal sounds was higher in participants with higher attentional focus and motivation. Additionally, attentional focus, perceptual load, and feature similarity/saliency were analyzed and did not influence detecting or missing the animal sounds. We demonstrated that in 31.2% of the musical excerpts, people did not recognize highly salient animal sounds (with regard to both the type of acoustic source and the frequency spectra) while executing the primary (counting) task. This significant effect supports the idea that inattentional deafness occurs even when the unattended acoustic stimuli are highly salient.
4. Hölle D, Blum S, Kissner S, Debener S, Bleichner MG. Real-Time Audio Processing of Real-Life Soundscapes for EEG Analysis: ERPs Based on Natural Sound Onsets. Front Neuroergon 2022; 3:793061. PMID: 38235458; PMCID: PMC10790832; DOI: 10.3389/fnrgo.2022.793061.
Abstract
With smartphone-based mobile electroencephalography (EEG), we can investigate sound perception beyond the lab. To understand sound perception in the real world, we need to relate naturally occurring sounds to EEG data. For this, EEG and audio information must be synchronized precisely; only then is it possible to capture fast, transient evoked neural responses and relate them to individual sounds. We have developed Android applications (AFEx and Record-a) that allow the concurrent acquisition of EEG data and audio features, i.e., sound onsets, average signal power (RMS), and power spectral density (PSD), on a smartphone. In this paper, we evaluate these apps by computing event-related potentials (ERPs) evoked by everyday sounds. One participant listened to piano notes (played live by a pianist) and to a home-office soundscape. Timing tests showed a stable lag and a small jitter (< 3 ms), indicating high temporal precision of the system. We calculated ERPs to sound onsets and observed the typical P1-N1-P2 complex of auditory processing. Furthermore, we show how to relate information on loudness (RMS) and spectra (PSD) to brain activity. In future studies, this system can be used to study sound processing in everyday life.
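The two audio features named in this abstract, RMS for loudness and PSD for spectra, can be sketched offline in a few lines. This is an illustrative reimplementation, not the AFEx/Record-a code; the sampling rate, frame length, and synthetic 440 Hz tone are assumptions.

```python
import numpy as np
from scipy.signal import welch

def audio_features(frame, fs):
    """Loudness (RMS) and spectrum (Welch PSD) of one audio buffer."""
    rms = np.sqrt(np.mean(frame ** 2))
    freqs, psd = welch(frame, fs=fs, nperseg=min(len(frame), 1024))
    return rms, freqs, psd

fs = 16000                                # assumed audio sampling rate (Hz)
t = np.arange(fs) / fs                    # 1 s of samples
tone = 0.5 * np.sin(2 * np.pi * 440 * t)  # synthetic stand-in for a piano note
rms, freqs, psd = audio_features(tone, fs)
peak_hz = freqs[np.argmax(psd)]           # PSD peak should land near 440 Hz
```

Computed frame by frame and timestamped against the EEG stream, features like these are what allow sound events to be related to ERPs.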
Affiliation(s)
- Daniel Hölle
  - Neurophysiology of Everyday Life Group, Department of Psychology, University of Oldenburg, Oldenburg, Germany
- Sarah Blum
  - Neuropsychology Lab, Department of Psychology, University of Oldenburg, Oldenburg, Germany
  - Cluster of Excellence Hearing4all, Oldenburg, Germany
- Sven Kissner
  - Institute for Hearing Technology and Audiology, Jade University of Applied Sciences, Oldenburg, Germany
- Stefan Debener
  - Neuropsychology Lab, Department of Psychology, University of Oldenburg, Oldenburg, Germany
- Martin G. Bleichner
  - Neurophysiology of Everyday Life Group, Department of Psychology, University of Oldenburg, Oldenburg, Germany
5. Somon B, Giebeler Y, Darmet L, Dehais F. Benchmarking cEEGrid and Solid Gel-Based Electrodes to Classify Inattentional Deafness in a Flight Simulator. Front Neuroergon 2022; 2:802486. PMID: 38235232; PMCID: PMC10790867; DOI: 10.3389/fnrgo.2021.802486.
Abstract
Transfer from laboratory experiments to real-life tasks is challenging, notably because standardized lab conditions cannot reproduce the complexity of dynamic, multitasking everyday situations, and because bulky, invasive recording systems prevent participants from moving freely and interacting naturally with the environment. In this study, we used a motion flight simulator to induce inattentional deafness to auditory alarms, a cognitive failure that arises in complex environments. We also assessed the ability of two low-density EEG systems, a solid gel-based Enobio (Neuroelectrics, Barcelona, Spain) and a gel-based cEEGrid (TMSi, Oldenzaal, Netherlands), to record and classify brain activity associated with inattentional deafness (misses vs. hits for odd sounds) in a small pool of expert participants. In addition to inducing inattentional deafness (missed auditory alarms) at much higher rates than typical lab tasks (34.7% compared to the usual 5%), we observed typical inattentional-deafness-related activity in the time domain as well as in the frequency and time-frequency domains with both systems. Finally, a classifier based on Riemannian geometry principles achieved more than 70% single-trial classification accuracy for both mobile EEG systems, and up to 71.5% for the cEEGrid. These results open promising avenues toward detecting cognitive failures in real-life situations, such as actual flight.
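The Riemannian-geometry classification idea mentioned above is, at its core, minimum distance to mean (MDM) over trial covariance matrices. Below is a minimal self-contained sketch on synthetic two-channel "trials", using a log-Euclidean surrogate for the Riemannian metric; the authors' actual pipeline, metric, and data differ.

```python
import numpy as np
from scipy.linalg import logm

def covariances(X):
    """Sample covariance matrix per trial; X has shape (trials, channels, samples)."""
    return np.array([np.cov(x) for x in X])

def mdm_predict(train_covs, train_y, test_covs):
    """Minimum distance to mean: assign each test covariance to the nearest
    class mean, with means and distances taken in log (tangent) space."""
    means = {c: np.mean([logm(C) for C in train_covs[train_y == c]], axis=0)
             for c in np.unique(train_y)}
    preds = []
    for C in test_covs:
        L = logm(C)
        preds.append(min(means, key=lambda c: np.linalg.norm(L - means[c])))
    return np.array(preds)

# Synthetic two-class data: classes differ in per-channel variance.
rng = np.random.default_rng(1)
def make_trials(scale, n):
    return np.array([np.diag(scale) @ rng.standard_normal((2, 200)) for _ in range(n)])

X = np.concatenate([make_trials([3.0, 1.0], 20), make_trials([1.0, 3.0], 20)])
y = np.array([0] * 20 + [1] * 20)
covs = covariances(X)
preds = mdm_predict(covs[::2], y[::2], covs[1::2])  # train on even, test on odd trials
acc = (preds == y[1::2]).mean()
```

Because covariance structure (rather than raw amplitude) carries the class information, this family of classifiers tends to be robust to the electrode montage, which is what makes it attractive for benchmarking low-density mobile EEG systems.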
Affiliation(s)
- Bertille Somon
  - Artificial and Natural Intelligence Toulouse Institute, Université de Toulouse, Toulouse, France
  - Department for Aerospace Vehicles Design and Control, ISAE-SUPAERO, Université de Toulouse, Toulouse, France
- Yasmina Giebeler
  - Department for Aerospace Vehicles Design and Control, ISAE-SUPAERO, Université de Toulouse, Toulouse, France
  - Department of Psychology and Ergonomics, Technische Universität Berlin, Berlin, Germany
- Ludovic Darmet
  - Department for Aerospace Vehicles Design and Control, ISAE-SUPAERO, Université de Toulouse, Toulouse, France
- Frédéric Dehais
  - Artificial and Natural Intelligence Toulouse Institute, Université de Toulouse, Toulouse, France
  - Department for Aerospace Vehicles Design and Control, ISAE-SUPAERO, Université de Toulouse, Toulouse, France
  - School of Biomedical Engineering, Science and Health Systems, Drexel University, Philadelphia, PA, United States
6. De Cassai A, Negro S, Geraldini F, Boscolo A, Sella N, Munari M, Navalesi P. Inattentional blindness in anesthesiology: A gorilla is worth one thousand words. PLoS One 2021; 16:e0257508. PMID: 34555092; PMCID: PMC8459955; DOI: 10.1371/journal.pone.0257508.
Abstract
Introduction: People are poor at anticipating unexpected events. Inattentional blindness has been demonstrated not only in naïve observers engaged in an unfamiliar task but also in field experts with years of training. Anesthesia is a discipline that demands a high level of attention, and our aim was to evaluate whether inattentional blindness can affect anesthesiologists during their daily activities.
Materials and methods: An online survey was distributed on Facebook between May 1, 2021 and May 31, 2021. The survey consisted of five simulated cases with questions investigating the anesthetic management of day-case surgeries. Each case included an introduction, a chest radiograph, an electrocardiogram, and preoperative blood tests; the last case had a gorilla embedded in the chest radiograph.
Results: In total, 699 respondents from 17 countries were included in the analysis. The main outcome was the incidence of inattentional blindness. Only 34 respondents (4.9%) spotted the gorilla. No differences were found between anesthesiologists and residents, private and public hospitals, or physicians with different levels of experience.
Discussion: Our findings indicate that inattentional blindness is common in anesthesia, and greater vigilance is needed to improve patient safety. To achieve this, several strategies should be adopted: increased use of standardized protocols, automation-based strategies to reduce human error in repetitive tasks, and discouraging the evaluation of multiple consecutive patients in the same work shift, independently of their complexity.
Affiliation(s)
- Federico Geraldini
  - Anesthesia and Intensive Care Unit, University Hospital of Padua, Padua, Italy
- Annalisa Boscolo
  - Anesthesia and Intensive Care Unit, University Hospital of Padua, Padua, Italy
- Nicolò Sella
  - Department of Medicine, University of Padua, Padua, Italy
- Marina Munari
  - Anesthesia and Intensive Care Unit, University Hospital of Padua, Padua, Italy
- Paolo Navalesi
  - Anesthesia and Intensive Care Unit, University Hospital of Padua, Padua, Italy
  - Department of Medicine, University of Padua, Padua, Italy
7. Schlossmacher I, Dellert T, Bruchmann M, Straube T. Dissociating neural correlates of consciousness and task relevance during auditory processing. Neuroimage 2020; 228:117712. PMID: 33387630; DOI: 10.1016/j.neuroimage.2020.117712.
Abstract
In recent years, several ERP components have been identified as potential neural correlates of consciousness (NCC), including early negativities and late positivities. Based on experiments in the visual modality, it has recently been shown that awareness is often confounded with the act of reporting it, possibly leading to overestimation of the NCC. It is unknown whether similar constraints also exist in the auditory modality. To address this gap, we presented spoken words in a sustained inattentional deafness paradigm. Electrophysiological responses were obtained in three physically identical experimental conditions that differed only in the participants' instructions. Participants were either left uninformed or informed about the presence of spoken words while confronted with an auditory distractor task (U/I condition), informed about the words while exposed to the same task as before (I condition), or requested to respond to the now task-relevant speech stimuli (TR condition). After completion of the U/I condition, only informed participants reported awareness of the words. In the ERPs, awareness of words in the U/I and I conditions was accompanied by an anterior auditory awareness negativity (AAN). Late positivities emerged only when stimuli were task-relevant, i.e., during the TR condition. Taken together, these results indicate that early negativities, but not late positivities, index awareness across sensory modalities, providing evidence for a recurrent processing framework that highlights the importance of early sensory processing in conscious perception.
Affiliation(s)
- Insa Schlossmacher
  - Institute of Medical Psychology and Systems Neuroscience, University of Münster, Von-Esmarch-Str. 52, 48149 Münster, Germany
  - Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, 48149 Münster, Germany
- Torge Dellert
  - Institute of Medical Psychology and Systems Neuroscience, University of Münster, Von-Esmarch-Str. 52, 48149 Münster, Germany
  - Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, 48149 Münster, Germany
- Maximilian Bruchmann
  - Institute of Medical Psychology and Systems Neuroscience, University of Münster, Von-Esmarch-Str. 52, 48149 Münster, Germany
  - Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, 48149 Münster, Germany
- Thomas Straube
  - Institute of Medical Psychology and Systems Neuroscience, University of Münster, Von-Esmarch-Str. 52, 48149 Münster, Germany
  - Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, 48149 Münster, Germany
8. Khan RA, Husain FT. Tinnitus and cognition: Can load theory help us refine our understanding? Laryngoscope Investig Otolaryngol 2020; 5:1197-1204. PMID: 33364412; PMCID: PMC7752071; DOI: 10.1002/lio2.501.
Abstract
Objective: Tinnitus has been shown to be associated with specific cognitive deficits. Contemporary models of tinnitus, based primarily on human behavior, emphasize the influence of the cognitive response to tinnitus on tinnitus manifestation and the level of associated annoyance. The models and hypotheses proposed thus far have (a) focused on the cognitive response to the onset of tinnitus rather than the cognitive consequences of established chronic tinnitus, and (b) failed to dissociate the contributions of cognitive and perceptual load. Load theory states that we have a limited capacity of neural resources for processing internal and external stimuli. The theory distinguishes perceptual load, the neural resources engaged in processing sensory stimuli in our environment, from cognitive load, the occupation of a more central resource involved in higher-level processing such as stimulus discrimination, decision making, and working memory.
Methods: A focused review was conducted of behavioral and brain-imaging studies examining cognitive deficits in tinnitus, in an attempt to reexamine the findings within a load theory framework.
Results: Findings of these studies are discussed in the context of load theory, and a novel model for understanding them is proposed.
Conclusion: We believe the incorporation of load theory into models of tinnitus may advance understanding of the cognitive impact of tinnitus and lead to better management of the condition.
Affiliation(s)
- Rafay A. Khan
  - Neuroscience Program, University of Illinois at Urbana-Champaign, Champaign, Illinois, USA
  - Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Champaign, Illinois, USA
- Fatima T. Husain
  - Neuroscience Program, University of Illinois at Urbana-Champaign, Champaign, Illinois, USA
  - Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Champaign, Illinois, USA
  - Department of Speech and Hearing Science, University of Illinois at Urbana-Champaign, Champaign, Illinois, USA
9. Conci A, Bilalić M, Gaschler R. Can You See What I Hear? Exp Psychol 2020; 67:186-193. PMID: 32900295; DOI: 10.1027/1618-3169/a000487.
Abstract
Previous research on inattentional blindness (IB) has focused almost entirely on the visual modality. This study extends the paradigm by pairing visual with auditory stimuli. New visual and auditory stimuli were created to investigate the phenomenon of inattention in the visual, auditory, and paired modalities. The goal of the study was to assess the extent to which pairing the visual and auditory modalities fosters the detection of change. Participants watched a video sequence and counted predetermined words in a spoken text. IB and inattentional deafness occurred in about 40% of participants when attention was engaged by this difficult (auditory) counting task. Most importantly, participants detected the changes considerably more often (88%) when the change occurred in both modalities rather than just one. One possible reason for the drastic reduction of IB or deafness in a multimodal context is that the discrepancy between the expected and the encountered course of events increases proportionally with the number of sensory modalities involved.
Affiliation(s)
- Anna Conci
  - FernUniversität in Hagen, Hagen, Germany
  - Alpen-Adria-Universität Klagenfurt, Klagenfurt, Austria
- Merim Bilalić
  - Northumbria University, Newcastle-upon-Tyne, United Kingdom
10. Morgan P, Macken B, Toet A, Bompas A, Bray M, Rushton S, Jones D. Distraction for the eye and ear. Theor Issues Ergon Sci 2020. DOI: 10.1080/1463922x.2020.1712493.
Affiliation(s)
- Philip Morgan
  - HuFEx, School of Psychology, Cardiff University, Cardiff, UK
- Bill Macken
  - HuFEx, School of Psychology, Cardiff University, Cardiff, UK
- Alexander Toet
  - The Netherlands Organization for Applied Scientific Research
- Aline Bompas
  - HuFEx, School of Psychology, Cardiff University, Cardiff, UK
- Mark Bray
  - BAE Systems-Applied Intelligence Laboratories, London, UK
- Simon Rushton
  - HuFEx, School of Psychology, Cardiff University, Cardiff, UK
- Dylan Jones
  - HuFEx, School of Psychology, Cardiff University, Cardiff, UK
11. Inattentional deafness to auditory alarms: Inter-individual differences, electrophysiological signature and single trial classification. Behav Brain Res 2018; 360:51-59. PMID: 30508609; DOI: 10.1016/j.bbr.2018.11.045.
Abstract
Inattentional deafness can have deleterious consequences in complex real-life situations (e.g., healthcare, aviation), causing critical auditory signals to be missed. Such failures of auditory attention are thought to rely on top-down biasing mechanisms at the central executive level. A complementary account is the existence of visual dominance over hearing, which could be implemented via direct visual-to-auditory pathways. To investigate this phenomenon, thirteen aircraft pilots, equipped with a 32-channel EEG system, performed low- and high-workload scenarios along with an auditory oddball task in a motion flight simulator. Prior to the flying task, the pilots were screened to assess their working memory span and susceptibility to visual dominance. The behavioral results showed that the volunteers missed 57.7% of the auditory alarms in the difficult condition. Among all evaluated capabilities, only the visual dominance index predicted the miss rate in the difficult scenario. These findings provide behavioral evidence that early cross-modal competitive processes, beyond top-down modulation, could account for inattentional deafness. The electrophysiological analyses showed that missed alarms, relative to hits, led to a significant amplitude reduction of early perceptual (N100) and late attentional (P3a and P3b) event-related potential components. Finally, we implemented an EEG-based processing pipeline to perform single-trial classification of inattentional deafness. The results indicate that this processing chain could be used in an ecological setting, as it achieved 72.2% mean accuracy in discriminating missed from hit auditory alarms.
12. Edworthy J, Reid S, Peel K, Lock S, Williams J, Newbury C, Foster J, Farrington M. The impact of workload on the ability to localize audible alarms. Appl Ergon 2018; 72:88-93. PMID: 29885730; DOI: 10.1016/j.apergo.2018.05.006.
Abstract
Very little is known about people's ability to localize sound under varying workload conditions, though increasing workload would be expected to degrade performance. A set of eight auditory clinical alarms already known to have relatively high localizability (the ease with which their location is identified) when tested alone were tested in six conditions in which workload was varied. Participants were required to indicate the location of a series of alarms emanating at random from one of eight speaker locations. Additionally, they were asked to read, carry out mental arithmetic tasks, listen in typical ICU noise, or carry out either the reading or the mental arithmetic task in ICU noise. Performance in the localization task was best in the control condition (no secondary task) and worst in the conditions that involved both a secondary task and noise. The data therefore demonstrate the typical pattern of increasing workload degrading a primary task in an area where there is little data. In addition, the data show that the control condition yields a missed alarm on one in ten occurrences, whereas the heaviest workload conditions yield a missed alarm on every fourth occurrence. This finding has implications for understanding both 'inattentional deafness' and 'alarm fatigue' in clinical environments.
Affiliation(s)
- Judy Edworthy
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Scott Reid
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Katie Peel
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Samantha Lock
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Jessica Williams
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Chloe Newbury
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Joseph Foster
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
- Martin Farrington
  - Cognition Institute, Plymouth University, Plymouth, Devon PL4 8AA, UK
13. Murphy S, Dalton P. Inattentional numbness and the influence of task difficulty. Cognition 2018; 178:1-6. PMID: 29753983; DOI: 10.1016/j.cognition.2018.05.001.
Abstract
Research suggests that clearly detectable stimuli can be missed when attention is focused elsewhere, particularly when the observer is engaged in a complex task. Although this phenomenon has been demonstrated in vision and audition, much less is known about a similar phenomenon within touch. Across two experiments, we investigated reported awareness of an unexpected tactile event as a function of the difficulty of a concurrent tactile task. Participants were presented with sequences of tactile stimuli to one hand and performed either an easy or a difficult counting task. On the final trial, an additional tactile stimulus was concurrently presented to the unattended hand. Retrospective reports revealed that more participants in the difficult (vs. easy) condition remained unaware of this unexpected stimulus, even though it was clearly detectable under full attention. These experiments are the first to demonstrate inattentional numbness modulated by concurrent tactile task difficulty.
Affiliation(s)
- Sandra Murphy
  - Department of Psychology, Royal Holloway, University of London, United Kingdom
- Polly Dalton
  - Department of Psychology, Royal Holloway, University of London, United Kingdom
14. Chérif L, Wood V, Marois A, Labonté K, Vachon F. Multitasking in the military: Cognitive consequences and potential solutions. Appl Cogn Psychol 2018. DOI: 10.1002/acp.3415.
Affiliation(s)
- Lobna Chérif
  - Royal Military College of Canada, Kingston, Canada
- Valerie Wood
  - Royal Military College of Canada, Kingston, Canada
- François Vachon
  - École de psychologie, Université Laval, Québec, Canada
  - Department of Building, Energy and Environmental Engineering, University of Gävle, Gävle, Sweden
15. Neuhoff JG, Bochtler KS. Change deafness, dual-task performance, and domain-specific expertise. Q J Exp Psychol (Hove) 2018; 71:1100-1111. PMID: 28326947; DOI: 10.1080/17470218.2017.1310266.
Abstract
In a change deafness manipulation using radio broadcasts of sporting events, we show that change deafness to a switch in talker increases when listeners are asked to monitor both lexical and indexical information for change. Holding semantic content constant, we demonstrated a change deafness rate of 85% when participants listened to the home team broadcast of a hockey game that switched midway to the away team broadcast with a different announcer. In Study 2, participants were asked to monitor either the indexical characteristics (listen for a change in announcer) or both the indexical and semantic components (listen for a change in announcer or a goal scored). Monitoring both components led to significantly greater change deafness, even though both groups were alerted to the possibility of a change in announcer. In Study 3, we changed both the indexical and semantic components by switching the broadcast from a hockey game to a basketball game, and found a negative correlation between sports expertise and change deafness. The results are discussed in terms of the nature of perceptual representation and the influence of expertise and evolution on attention allocation.
Affiliation(s)
- John G Neuhoff
- Department of Psychology, The College of Wooster, Wooster, OH, USA
|
16
|
Shinn-Cunningham B. Cortical and Sensory Causes of Individual Differences in Selective Attention Ability Among Listeners With Normal Hearing Thresholds. JOURNAL OF SPEECH, LANGUAGE, AND HEARING RESEARCH : JSLHR 2017; 60:2976-2988. [PMID: 29049598 PMCID: PMC5945067 DOI: 10.1044/2017_jslhr-h-17-0080] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/27/2017] [Revised: 06/23/2017] [Accepted: 07/05/2017] [Indexed: 05/28/2023]
Abstract
PURPOSE This review provides clinicians with an overview of recent findings relevant to understanding why listeners with normal hearing thresholds (NHTs) sometimes suffer from communication difficulties in noisy settings. METHOD The results from neuroscience and psychoacoustics are reviewed. RESULTS In noisy settings, listeners focus their attention by engaging cortical brain networks to suppress unimportant sounds; they then can analyze and understand an important sound, such as speech, amidst competing sounds. Differences in the efficacy of top-down control of attention can affect communication abilities. In addition, subclinical deficits in sensory fidelity can disrupt the ability to perceptually segregate sound sources, interfering with selective attention, even in listeners with NHTs. Studies of variability in control of attention and in sensory coding fidelity may help to isolate and identify some of the causes of communication disorders in individuals presenting at the clinic with "normal hearing." CONCLUSIONS How well an individual with NHTs can understand speech amidst competing sounds depends not only on the sound being audible but also on the integrity of cortical control networks and the fidelity of the representation of suprathreshold sound. Understanding the root cause of difficulties experienced by listeners with NHTs ultimately can lead to new, targeted interventions that address specific deficits affecting communication in noise. PRESENTATION VIDEO http://cred.pubs.asha.org/article.aspx?articleid=2601617.
Affiliation(s)
- Barbara Shinn-Cunningham
- Center for Research in Sensory Communication and Emerging Neural Technology, Boston University, MA
|
17
|
Remington A, Fairnie J. A sound advantage: Increased auditory capacity in autism. Cognition 2017; 166:459-465. [DOI: 10.1016/j.cognition.2017.04.002] [Citation(s) in RCA: 46] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2016] [Revised: 03/29/2017] [Accepted: 04/05/2017] [Indexed: 11/27/2022]
|
18
|
Murphy S, Spence C, Dalton P. Auditory perceptual load: A review. Hear Res 2017; 352:40-48. [DOI: 10.1016/j.heares.2017.02.005] [Citation(s) in RCA: 46] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 09/23/2016] [Revised: 12/21/2016] [Accepted: 02/05/2017] [Indexed: 11/26/2022]
|
19
|
Wolfe JM, Alaoui Soce A, Schill HM. How did I miss that? Developing mixed hybrid visual search as a 'model system' for incidental finding errors in radiology. COGNITIVE RESEARCH-PRINCIPLES AND IMPLICATIONS 2017; 2:35. [PMID: 28890920 PMCID: PMC5569644 DOI: 10.1186/s41235-017-0072-5] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 04/01/2017] [Accepted: 07/10/2017] [Indexed: 12/21/2022]
Abstract
In a real world search, it can be important to keep ‘an eye out’ for items of interest that are not the primary subject of the search. For instance, you might look for the exit sign on the freeway, but you should also respond to the armadillo crossing the road. In medicine, these items are known as “incidental findings,” findings of possible clinical significance that were not the main object of search. These errors (e.g., missing a broken rib while looking for pneumonia) have medical consequences for the patient and potential legal consequences for the physician. Here we report three experiments intended to develop a ‘model system’ for incidental findings – a paradigm that could be used in the lab to develop strategies to reduce incidental finding errors in the clinic. All the experiments involve ‘hybrid’ visual search for any of several targets held in memory. In this ‘mixed hybrid search task,’ observers search for any of three specific targets (e.g., this rabbit, this truck, and this spoon) and three categorical targets (e.g., masks, furniture, and plants). The hypothesis is that the specific items are like the specific goals of a real world search and the categorical targets are like the less well-defined incidental findings that might be present and that should be reported. In all these experiments, varying target prevalence, number of targets, etc., the categorical targets are missed at a much higher rate than the specific targets. This paradigm shows promise as a model of the incidental finding problem.
Affiliation(s)
- Jeremy M Wolfe
- Ophthalmology and Radiology Departments, Harvard Medical School, 64 Sidney St. Suite 170, Cambridge, MA 02139 USA; Visual Attention Lab, Brigham and Women's Hospital, 64 Sidney St. Suite 170, Cambridge, MA 02139 USA
- Abla Alaoui Soce
- Visual Attention Lab, Brigham and Women's Hospital, 64 Sidney St. Suite 170, Cambridge, MA 02139 USA
- Hayden M Schill
- Visual Attention Lab, Brigham and Women's Hospital, 64 Sidney St. Suite 170, Cambridge, MA 02139 USA
|
20
|
Dykstra AR, Cariani PA, Gutschalk A. A roadmap for the study of conscious audition and its neural basis. Philos Trans R Soc Lond B Biol Sci 2017; 372:20160103. [PMID: 28044014 PMCID: PMC5206271 DOI: 10.1098/rstb.2016.0103] [Citation(s) in RCA: 37] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/03/2016] [Indexed: 12/16/2022] Open
Abstract
How and which aspects of neural activity give rise to subjective perceptual experience (i.e., conscious perception) is a fundamental question of neuroscience. To date, the vast majority of work concerning this question has come from vision, raising the issue of generalizability of prominent resulting theories. However, recent work has begun to shed light on the neural processes subserving conscious perception in other modalities, particularly audition. Here, we outline a roadmap for the future study of conscious auditory perception and its neural basis, paying particular attention to how conscious perception emerges (and of which elements or groups of elements) in complex auditory scenes. We begin by discussing the functional role of the auditory system, particularly as it pertains to conscious perception. Next, we ask: what are the phenomena that need to be explained by a theory of conscious auditory perception? After surveying the available literature for candidate neural correlates, we end by considering the implications that such results have for a general theory of conscious perception as well as prominent outstanding questions and what approaches/techniques can best be used to address them. This article is part of the themed issue 'Auditory and visual scene analysis'.
Affiliation(s)
- Andrew R Dykstra
- Department of Neurology, Ruprecht-Karls-Universität Heidelberg, Heidelberg, Germany
- Alexander Gutschalk
- Department of Neurology, Ruprecht-Karls-Universität Heidelberg, Heidelberg, Germany
|
21
|
Shinn-Cunningham B, Best V, Lee AKC. Auditory Object Formation and Selection. SPRINGER HANDBOOK OF AUDITORY RESEARCH 2017. [DOI: 10.1007/978-3-319-51662-2_2] [Citation(s) in RCA: 31] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/03/2022]
|
22
|
Hyman IE. Unaware Observers: The Impact of Inattentional Blindness on Walkers, Drivers, and Eyewitnesses. JOURNAL OF APPLIED RESEARCH IN MEMORY AND COGNITION 2016. [DOI: 10.1016/j.jarmac.2016.06.011] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
23
|
Lange K, Nowak M, Lauer W. A human factors perspective on medical device alarms: problems with operating alarming devices and responding to device alarms. BIOMED ENG-BIOMED TE 2016; 61:147-64. [PMID: 25427057 DOI: 10.1515/bmt-2014-0068] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2014] [Accepted: 10/24/2014] [Indexed: 11/15/2022]
Abstract
Medical devices emit alarms when a problem with the device or with the patient needs to be addressed by healthcare personnel. At present, problems with device alarms are frequently discussed in the literature, the main message being that patient safety is compromised because device alarms are not as effective and safe as they should - and could - be. There is a general consensus that alarm-related hazards result, to a considerable degree, from the interactions of human users with the device. The present paper addresses key aspects of human perception and cognition that may relate to both operating alarming devices and responding to device alarms. Recent publications suggested solutions to alarm-related hazards associated with usage errors based on assumptions on the causal relations between, for example, alarm management and human perception, cognition, and responding. However, although there is face validity in many of these assumptions, future research should provide objective empirical evidence in order to deepen our understanding of the actual causal relationships, and hence improve and expand the possibilities for taking appropriate action.
|
24
|
Murphy S, Dalton P. Out of touch? Visual load induces inattentional numbness. J Exp Psychol Hum Percept Perform 2016; 42:761-5. [PMID: 26974412 PMCID: PMC4873046 DOI: 10.1037/xhp0000218] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
It is now well known that the absence of attention can leave people unaware of both visual and auditory stimuli (e.g., Dalton & Fraenkel, 2012; Mack & Rock, 1998). However, the possibility of similar effects within the tactile domain has received much less research. Here, we introduce a new tactile inattention paradigm and use it to test whether tactile awareness depends on the level of perceptual load in a concurrent visual task. Participants performed a visual search task of either low or high perceptual load, as well as responding to the presence or absence of a brief vibration delivered simultaneously to either the left or the right hand (50% of trials). Detection sensitivity to the clearly noticeable tactile stimulus was reduced under high (vs. low) visual perceptual load. These findings provide the first robust demonstration of “inattentional numbness,” as well as demonstrating that this phenomenon can be induced by concurrent visual perceptual load.
Affiliation(s)
- Sandra Murphy
- Department of Psychology, Royal Holloway, University of London
- Polly Dalton
- Department of Psychology, Royal Holloway, University of London
|
25
|
Abstract
Behavioral and neural studies of selective attention have consistently demonstrated that explicit attentional cues to particular perceptual features profoundly alter perception and performance. The statistics of the sensory environment can also provide cues about what perceptual features to expect, but the extent to which these more implicit contextual cues impact perception and performance, as well as their relationship to explicit attentional cues, is not well understood. In this study, the explicit cues, or attentional prior probabilities, and the implicit cues, or contextual prior probabilities, associated with different acoustic frequencies in a detection task were simultaneously manipulated. Both attentional and contextual priors had similarly large but independent impacts on sound detectability, with evidence that listeners tracked and used contextual priors for a variety of sound classes (pure tones, harmonic complexes, and vowels). Further analyses showed that listeners updated their contextual priors rapidly and optimally, given the changing acoustic frequency statistics inherent in the paradigm. A Bayesian Observer model accounted for both attentional and contextual adaptations found with listeners. These results bolster the interpretation of perception as Bayesian inference, and suggest that some effects attributed to selective attention may be a special case of contextual prior integration along a feature axis.
|
26
|
Does working memory capacity predict cross-modally induced failures of awareness? Conscious Cogn 2016; 39:18-27. [DOI: 10.1016/j.concog.2015.11.010] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2015] [Revised: 11/16/2015] [Accepted: 11/24/2015] [Indexed: 11/19/2022]
|
27
|
Loth S, Jettka K, Giuliani M, de Ruiter JP. Ghost-in-the-Machine reveals human social signals for human-robot interaction. Front Psychol 2015; 6:1641. [PMID: 26582998 PMCID: PMC4631814 DOI: 10.3389/fpsyg.2015.01641] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2015] [Accepted: 10/12/2015] [Indexed: 11/13/2022] Open
Abstract
We used a new method called "Ghost-in-the-Machine" (GiM) to investigate social interactions with a robotic bartender taking orders for drinks and serving them. Using the GiM paradigm allowed us to identify how human participants recognize the intentions of customers on the basis of the output of the robotic recognizers. Specifically, we measured which recognizer modalities (e.g., speech, the distance to the bar) were relevant at different stages of the interaction. This provided insights into human social behavior necessary for the development of socially competent robots. When initiating the drink-order interaction, the most important recognizers were those based on computer vision. When drink orders were being placed, however, the most important information source was the speech recognition. Interestingly, the participants used only a subset of the available information, focussing only on a few relevant recognizers while ignoring others. This reduced the risk of acting on erroneous sensor data and enabled them to complete service interactions more swiftly than a robot using all available sensor data. We also investigated socially appropriate response strategies. In their responses, the participants preferred to use the same modality as the customer's requests, e.g., they tended to respond verbally to verbal requests. Also, they added redundancy to their responses, for instance by using echo questions. We argue that incorporating the social strategies discovered with the GiM paradigm in multimodal grammars of human-robot interactions improves the robustness and the ease-of-use of these interactions, and therefore provides a smoother user experience.
Affiliation(s)
- Sebastian Loth
- Psycholinguistics, Faculty of Linguistics and Literary Studies, Bielefeld University, Bielefeld, Germany
- Katharina Jettka
- Psycholinguistics, Faculty of Linguistics and Literary Studies, Bielefeld University, Bielefeld, Germany
- Manuel Giuliani
- Center for Human-Computer Interaction, Department of Computer Sciences, University of Salzburg, Salzburg, Austria
- Jan P de Ruiter
- Psycholinguistics, Faculty of Linguistics and Literary Studies, Bielefeld University, Bielefeld, Germany
|
28
|
Tardieu J, Misdariis N, Langlois S, Gaillard P, Lemercier C. Sonification of in-vehicle interface reduces gaze movements under dual-task condition. APPLIED ERGONOMICS 2015; 50:41-49. [PMID: 25959316 DOI: 10.1016/j.apergo.2015.02.004] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/21/2014] [Revised: 01/14/2015] [Accepted: 02/18/2015] [Indexed: 06/04/2023]
Abstract
In-car infotainment systems (ICIS) often degrade driving performance since they divert the driver's gaze from the driving scene. Sonification of hierarchical menus (such as those found in most ICIS) is examined in this paper as one possible solution to reduce gaze movements towards the visual display. In a dual-task experiment in the laboratory, 46 participants were requested to prioritize a primary task (a continuous target detection task) and to simultaneously navigate in a realistic mock-up of an ICIS, either sonified or not. Results indicated that sonification significantly increased the time spent looking at the primary task, and significantly decreased the number and the duration of gaze saccades towards the ICIS. In other words, the sonified ICIS could be used nearly exclusively by ear. On the other hand, reaction times in the primary task increased in both the silent and sonified conditions. This study suggests that sonification of secondary tasks while driving could improve the driver's visual attention to the driving scene.
Affiliation(s)
- Julien Tardieu
- MSHS-T USR3414, University of Toulouse and CNRS, Toulouse, France
- Sabine Langlois
- Renault - Cognitive Ergonomics & HMI, 1 avenue du Golf, 78084 Guyancourt, France
- Pascal Gaillard
- CLLE UMR5263, University of Toulouse and CNRS, Toulouse, France
|
29
|
Dehais F, Causse M, Vachon F, Régis N, Menant E, Tremblay S. Failure to detect critical auditory alerts in the cockpit: evidence for inattentional deafness. HUMAN FACTORS 2014; 56:631-644. [PMID: 25029890 DOI: 10.1177/0018720813510735] [Citation(s) in RCA: 53] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
OBJECTIVE The aim of this study was to test whether inattentional deafness to critical alarms would be observed in a simulated cockpit. BACKGROUND The inability of pilots to detect unexpected changes in their auditory environment (e.g., alarms) is a major safety problem in aeronautics. In aviation, the lack of response to alarms is usually not attributed to attentional limitations, but rather to pilots choosing to ignore such warnings due to decision biases, hearing issues, or conscious risk taking. METHOD Twenty-eight general aviation pilots performed two landings in a flight simulator. In one scenario an auditory alert was triggered alone, whereas in the other the auditory alert occurred while the pilots dealt with a critical windshear. RESULTS In the windshear scenario, 11 pilots (39.3%) did not report or react appropriately to the alarm whereas all the pilots perceived the auditory warning in the no-windshear scenario. Also, of those pilots who were first exposed to the no-windshear scenario and detected the alarm, only three suffered from inattentional deafness in the subsequent windshear scenario. CONCLUSION These findings establish inattentional deafness as a cognitive phenomenon that is critical for air safety. Pre-exposure to a critical event triggering an auditory alarm can enhance alarm detection when a similar event is encountered subsequently. APPLICATION Case-based learning is a solution to mitigate auditory alarm misperception.
|
30
|
Koreimann S, Gula B, Vitouch O. Inattentional deafness in music. PSYCHOLOGICAL RESEARCH 2014; 78:304-12. [DOI: 10.1007/s00426-014-0552-x] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2013] [Accepted: 02/19/2014] [Indexed: 10/25/2022]
|
31
|
Auditory attentional capture: implicit and explicit approaches. PSYCHOLOGICAL RESEARCH 2014; 78:313-20. [PMID: 24643575 DOI: 10.1007/s00426-014-0557-5] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2013] [Accepted: 02/24/2014] [Indexed: 10/25/2022]
Abstract
The extent to which distracting items capture attention despite being irrelevant to the task at hand can be measured either implicitly or explicitly (e.g., Simons, Trends Cogn Sci 4:147-155, 2000). Implicit approaches include the standard attentional capture paradigm in which distraction is measured in terms of reaction time and/or accuracy costs within a focal task in the presence (vs. absence) of a task-irrelevant distractor. Explicit measures include the inattention paradigm in which people are asked directly about their noticing of an unexpected task-irrelevant item. Although the processes of attentional capture have been studied extensively using both approaches in the visual domain, there is much less research on similar processes as they may operate within audition, and the research that does exist in the auditory domain has tended to focus exclusively on either an explicit or an implicit approach. This paper provides an overview of recent research on auditory attentional capture, integrating the key conclusions that may be drawn from both methodological approaches.
|
32
|
Murphy S, Fraenkel N, Dalton P. Perceptual load does not modulate auditory distractor processing. Cognition 2013; 129:345-55. [DOI: 10.1016/j.cognition.2013.07.014] [Citation(s) in RCA: 40] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2013] [Revised: 07/11/2013] [Accepted: 07/21/2013] [Indexed: 11/25/2022]
|
33
|
Puschmann S, Sandmann P, Ahrens J, Thorne J, Weerda R, Klump G, Debener S, Thiel CM. Electrophysiological correlates of auditory change detection and change deafness in complex auditory scenes. Neuroimage 2013; 75:155-164. [DOI: 10.1016/j.neuroimage.2013.02.037] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2012] [Revised: 02/18/2013] [Accepted: 02/20/2013] [Indexed: 10/27/2022] Open
|