1
Gao J, Jia L, Wen S, Jia Y, Li G, Liu H. The influence of emotional feedback material type on attentional capture at different presentation times. PLoS One 2024; 19:e0310022. [PMID: 39283871] [PMCID: PMC11404810] [DOI: 10.1371/journal.pone.0310022]
Abstract
OBJECTIVE: This study aimed to explore the influence of emotional feedback materials on attentional capture at different presentation times and to investigate the mechanisms of positive and negative attentional biases.
METHODS: Two experiments were conducted: Experiment 1 recruited 47 participants and Experiment 2 recruited 46. Emotional facial images and emotional words served as feedback materials, and a learning-testing paradigm was used to examine the effect of emotional feedback materials on attentional capture at two presentation times (1000 ms and 100 ms).
RESULTS: We compared accuracy and reaction times under emotional and neutral conditions at both presentation times. Experiment 1 revealed that participants exhibited a stable positive attentional bias towards emotional facial images; moreover, emotional interference with judgment-task accuracy was greater under the 100 ms feedback condition than under the 1000 ms condition. Experiment 2 found that emotional interference with reaction time was likewise greater under the 100 ms feedback condition. Comparing the two experiments showed that processing times were longer for emotional facial images than for emotional words.
CONCLUSIONS: (1) Emotional facial images are more effective than emotional words at capturing attention. (2) When positive and negative information of equal arousal alternates over a period of time, individuals exhibit a stable positive attentional bias. (3) When competition for attention and cognitive resources is intense, emotional information is prioritized for processing.
Affiliation(s)
- Jiacheng Gao: Xinjiang Key Laboratory of Mental Development and Learning Science, School of Psychology, Xinjiang Normal University, Urumqi, Xinjiang, China; Department of Psychology, Fudan University, Shanghai, China
- Lin Jia: Xinjiang Key Laboratory of Mental Development and Learning Science, School of Psychology, Xinjiang Normal University, Urumqi, Xinjiang, China; CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Suxia Wen: Xinjiang Key Laboratory of Mental Development and Learning Science, School of Psychology, Xinjiang Normal University, Urumqi, Xinjiang, China
- Yadi Jia: Xinjiang Key Laboratory of Mental Development and Learning Science, School of Psychology, Xinjiang Normal University, Urumqi, Xinjiang, China
- Guangxin Li: School of Educational Science, Xinjiang Normal University, Urumqi, Xinjiang, China
- Hongli Liu: Xinjiang Key Laboratory of Mental Development and Learning Science, School of Psychology, Xinjiang Normal University, Urumqi, Xinjiang, China
2
Sun M, Huang Y, Ying H. Repulsion bias is insensitive to spatial attention, yet expands during active working memory maintenance. Atten Percept Psychophys 2024:10.3758/s13414-024-02910-w. [PMID: 38862765] [DOI: 10.3758/s13414-024-02910-w]
Abstract
Our brain sometimes represents visual information in a biased manner. Multiple visual features presented simultaneously or sequentially may interact with each other when we perceive them or maintain them in visual working memory (WM), giving rise to report bias. How goal-directed attention influences target representation is not fully understood, especially concerning whether attention towards distractors modulates report bias for the target. Our study investigated WM biases for a target presented concurrently with (1) one attended distractor only, (2) one unattended distractor only, or (3) both kinds of distractors during perception. We found that the remembered target is reported as repelled away from concurrent distractors, whether attended or unattended, suggesting attention is not necessary for repulsion bias to arise during perception. Furthermore, goal-directed attention towards the distractors modulates the strength of interitem interaction: the repulsion bias was stronger when attention was directed toward the distractor than when it was not. However, the exaggerated repulsion associated with the attended distractor is likely due to its increased relevance to the memory task and/or WM load rather than spatial attention. In contrast, spatial attention towards the distractor increases the chances of misreporting the distractor as the target.
Affiliation(s)
- Mengdan Sun: Department of Psychology, Soochow University, Suzhou, China
- Yaxin Huang: Department of Psychology, Soochow University, Suzhou, China
- Haojiang Ying: Department of Psychology, Soochow University, Suzhou, China
3
Narhi-Martinez W, Dube B, Chen J, Leber AB, Golomb JD. Suppression of a salient distractor protects the processing of target features. Psychon Bull Rev 2024; 31:223-233. [PMID: 37528277] [PMCID: PMC11163954] [DOI: 10.3758/s13423-023-02339-6]
Abstract
We are often bombarded with salient stimuli that capture our attention and distract us from our current goals. Decades of research have shown the robust detrimental impacts of salient distractors on search performance and, of late, in leading to altered feature perception. These feature errors can be quite extreme, and thus, undesirable. In search tasks, salient distractors can be suppressed if they appear more frequently in one location, and this learned spatial suppression can lead to reductions in the cost of distraction as measured by reaction time slowing. Can learned spatial suppression also protect against visual feature errors? To investigate this question, participants were cued to report one of four briefly presented colored squares on a color wheel. On two-thirds of trials, a salient distractor appeared around one of the nontarget squares, appearing more frequently in one location over the course of the experiment. Participants' responses were fit to a model estimating performance parameters and compared across conditions. Our results showed that general performance (guessing and precision) improved when the salient distractor appeared in a likely location relative to elsewhere. Critically, feature swap errors (probability of misreporting the color at the salient distractor's location) were also significantly reduced when the distractor appeared in a likely location, suggesting that learned spatial suppression of a salient distractor helps protect the processing of target features. This study provides evidence that, in addition to helping us avoid salient distractors, suppression likely plays a larger role in helping to prevent distracting information from being encoded.
Affiliation(s)
- William Narhi-Martinez: Department of Psychology, The Ohio State University, 1835 Neil Ave, Columbus, OH, 43210, USA
- Blaire Dube: Department of Psychology, The Ohio State University, 1835 Neil Ave, Columbus, OH, 43210, USA
- Jiageng Chen: Department of Psychology, The Ohio State University, 1835 Neil Ave, Columbus, OH, 43210, USA
- Andrew B Leber: Department of Psychology, The Ohio State University, 1835 Neil Ave, Columbus, OH, 43210, USA
- Julie D Golomb: Department of Psychology, The Ohio State University, 1835 Neil Ave, Columbus, OH, 43210, USA
4
Chen J, Golomb JD. Dynamic neural reconstructions of attended object location and features using EEG. J Neurophysiol 2023; 130:139-154. [PMID: 37283457] [PMCID: PMC10393364] [DOI: 10.1152/jn.00180.2022]
Abstract
Attention allows us to select relevant and ignore irrelevant information from our complex environments. What happens when attention shifts from one item to another? To answer this question, it is critical to have tools that accurately recover neural representations of both feature and location information with high temporal resolution. In the present study, we used human electroencephalography (EEG) and machine learning to explore how neural representations of object features and locations update across dynamic shifts of attention. We demonstrate that EEG can be used to create simultaneous time courses of neural representations of attended features (time point-by-time point inverted encoding model reconstructions) and attended location (time point-by-time point decoding) during both stable periods and across dynamic shifts of attention. Each trial presented two oriented gratings that flickered at the same frequency but had different orientations; participants were cued to attend one of them and on half of trials received a shift cue midtrial. We trained models on a stable period from Hold attention trials and then reconstructed/decoded the attended orientation/location at each time point on Shift attention trials. Our results showed that both feature reconstruction and location decoding dynamically track the shift of attention and that there may be time points during the shifting of attention when 1) feature and location representations become uncoupled and 2) both the previously attended and currently attended orientations are represented with roughly equal strength. The results offer insight into our understanding of attentional shifts, and the noninvasive techniques developed in the present study lend themselves well to a wide variety of future applications.

NEW & NOTEWORTHY We used human EEG and machine learning to reconstruct neural response profiles during dynamic shifts of attention. Specifically, we demonstrated that we could simultaneously read out both location and feature information from an attended item in a multistimulus display. Moreover, we examined how that readout evolves over time during the dynamic process of attentional shifts. These results provide insight into our understanding of attention, and this technique carries substantial potential for versatile extensions and applications.
Affiliation(s)
- Jiageng Chen: Department of Psychology, The Ohio State University, Columbus, Ohio, United States
- Julie D Golomb: Department of Psychology, The Ohio State University, Columbus, Ohio, United States
5
Narhi-Martinez W, Chen J, Golomb JD. Probabilistic visual attentional guidance triggers "feature avoidance" response errors. J Exp Psychol Hum Percept Perform 2023; 49:802-820. [PMID: 37141038] [PMCID: PMC10320923] [DOI: 10.1037/xhp0001095]
Abstract
Spatial attention affects not only where we look, but also what we perceive and remember in attended and unattended locations. Previous work has shown that manipulating attention via top-down cues or bottom-up capture leads to characteristic patterns of feature errors. Here we investigated whether experience-driven attentional guidance (and probabilistic attentional guidance more generally) leads to similar feature errors. We conducted a series of pre-registered experiments employing a learned spatial probability or a probabilistic pre-cue; all experiments involved reporting the color of one of four simultaneously presented stimuli using a continuous response modality. When the probabilistic cues guided attention to an invalid (nontarget) location, participants were less likely to report the target color, as expected. But strikingly, their errors tended to cluster around a nontarget color opposite the color of the invalidly cued nontarget. This "feature avoidance" was found for both experience-driven and top-down probabilistic cues, and appears to be the product of a strategic, though possibly subconscious, behavior that occurs when information about the features and/or feature-location bindings outside the focus of attention is limited. The findings emphasize the importance of considering how different types of attentional guidance can exert different effects on feature perception and memory reports.
6
Chapman AF, Chunharas C, Störmer VS. Feature-based attention warps the perception of visual features. Sci Rep 2023; 13:6487. [PMID: 37081047] [PMCID: PMC10119379] [DOI: 10.1038/s41598-023-33488-2]
Abstract
Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts apparent stimulus contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased towards "off-tuned" features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments we rule out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.
Affiliation(s)
- Angus F Chapman: Department of Psychology, UC San Diego, La Jolla, CA, 92092, USA; Department of Psychological and Brain Sciences, Boston University, 64 Cummington Mall, Boston, MA, 02215, USA
- Chaipat Chunharas: Cognitive Clinical and Computational Neuroscience Lab, KCMH Chula Neuroscience Center, Thai Red Cross Society, Department of Internal Medicine, Chulalongkorn University, Bangkok, 10330, Thailand
- Viola S Störmer: Department of Brain and Psychological Sciences, Dartmouth College, Hanover, NH, USA
7
Perceptual comparisons modulate memory biases induced by new visual inputs. Psychon Bull Rev 2023; 30:291-302. [PMID: 36068372] [DOI: 10.3758/s13423-022-02133-w]
Abstract
It is well-established that stimulus-specific information in visual working memory (VWM) can be systematically biased by new perceptual inputs. These memory biases are commonly attributed to interference that arises when perceptual inputs are physically similar to VWM contents. However, recent work has suggested that explicitly comparing the similarity between VWM contents and new perceptual inputs modulates the size of memory biases above and beyond stimulus-driven effects. Here, we sought to directly investigate this modulation hypothesis by comparing the size of memory biases following explicit comparisons to those induced when new perceptual inputs are ignored (Experiment 1) or maintained in VWM alongside target information (Experiment 2). We found that VWM reports showed larger attraction biases following explicit perceptual comparisons than when new perceptual inputs were ignored or maintained in VWM. An analysis of participants' perceptual comparisons revealed that memory biases were amplified after perceptual inputs were endorsed as similar-but not dissimilar-to one's VWM representation. These patterns were found to persist even after accounting for variability in the physical similarity between the target and perceptual stimuli across trials, as well as the baseline memory precision between the distinct task demands. Together, these findings illustrate a causal role of perceptual comparisons in modulating naturally-occurring memory biases.
8
Narhi-Martinez W, Dube B, Golomb JD. Attention as a multi-level system of weights and balances. Wiley Interdiscip Rev Cogn Sci 2023; 14:e1633. [PMID: 36317275] [PMCID: PMC9840663] [DOI: 10.1002/wcs.1633]
Abstract
This opinion piece is part of a collection on the topic: "What is attention?" Despite the word's place in the common vernacular, a satisfying definition for "attention" remains elusive. Part of the challenge is that there exist many different types of attention, which may or may not share common mechanisms. Here we review this literature and offer an intuitive definition that draws from aspects of prior theories and models of attention but is broad enough to recognize the various types of attention and modalities it acts upon: attention as a multi-level system of weights and balances. While the specific mechanism(s) governing the weighting/balancing may vary across levels, the fundamental role of attention is to dynamically weigh and balance all signals, both externally-generated and internally-generated, such that the highest weighted signals are selected and enhanced. Top-down, bottom-up, and experience-driven factors dynamically impact this balancing, and competition occurs both within and across multiple levels of processing. This idea of a multi-level system of weights and balances is intended to incorporate both external and internal attention and capture their myriad of constantly interacting processes. We review key findings and open questions related to external attention guidance, internal attention and working memory, and broader attentional control (e.g., ongoing competition between external stimuli and internal thoughts) within the framework of this analogy. We also speculate about the implications of failures of attention in terms of weights and balances, ranging from momentary one-off errors to clinical disorders, as well as attentional development and degradation across the lifespan. This article is categorized under: Psychology > Attention; Neuroscience > Cognition.
Affiliation(s)
- Blaire Dube: Department of Psychology, The Ohio State University
- Julie D. Golomb: Department of Psychology, The Ohio State University, Columbus, OH, 43210 (corresponding author)
9
Yi L, Sekuler R. Audiovisual interaction with rate-varying signals. Iperception 2022; 13:20416695221116653. [PMID: 36467124] [PMCID: PMC9716610] [DOI: 10.1177/20416695221116653]
Abstract
A task-irrelevant, amplitude-modulating sound influences perception of a size-modulating visual stimulus. To probe the limits of this audiovisual interaction we vary the second temporal derivative of object size and of sound amplitude. In the study's first phase, subjects see a visual stimulus size-modulating with f″(x) > 0, f″(x) = 0, or f″(x) < 0, and judge each one's rate as increasing, constant, or decreasing. Visual stimuli are accompanied by a steady, non-modulated auditory stimulus. The novel combination of multiple stimuli and multi-alternative responses allows subjects' similarity space to be estimated from the stimulus-response confusion matrix. In the study's second phase, rate-varying visual stimuli are presented in concert with auditory stimuli whose second derivative also varied. Subjects identified each visual stimulus as one of the three types, while trying to ignore the accompanying sound. Unlike some previous results with f″(x) fixed at 0, performance benefits relatively little when visual and auditory stimuli share the same directional change in modulation. However, performance does drop when visual and auditory stimuli differ in their directions of rate change. Our task's computational demands may make it particularly vulnerable to the effects of a dynamic task-irrelevant stimulus.
Affiliation(s)
- Long Yi: Volen Center for Complex Systems, Brandeis University, Waltham, MA, USA
- Robert Sekuler: Volen Center for Complex Systems, Brandeis University, Waltham, MA, USA
10
Dumbalska T, Rudzka K, Smithson HE, Summerfield C. How do (perceptual) distracters distract? PLoS Comput Biol 2022; 18:e1010609. [PMID: 36228038] [PMCID: PMC9595561] [DOI: 10.1371/journal.pcbi.1010609]
Abstract
When a target stimulus occurs in the presence of distracters, decisions are less accurate. But how exactly do distracters affect choices? Here, we explored this question using measurement of human behaviour, psychophysical reverse correlation and computational modelling. We contrasted two models: one in which targets and distracters had independent influence on choices (independent model) and one in which distracters modulated choices in a way that depended on their similarity to the target (interaction model). Across three experiments, participants were asked to make fine orientation judgments about the tilt of a target grating presented adjacent to an irrelevant distracter. We found strong evidence for the interaction model, in that decisions were more sensitive when target and distracter were consistent relative to when they were inconsistent. This consistency bias occurred in the frame of reference of the decision, that is, it operated on decision values rather than on sensory signals, and surprisingly, it was independent of spatial attention. A normalization framework, where target features are normalized by the expectation and variability of the local context, successfully captures the observed pattern of results.

In the real world, visual scenes usually contain many objects. As a consequence, we often have to make perceptual judgments about a specific ‘target’ stimulus in the presence of irrelevant ‘distracter’ stimuli. For instance, when hanging a picture frame, we want to discern whether it is hanging straight, ignoring the surrounding, potentially tilted frames. Laboratory experiments have shown that the presence of distracter stimuli (i.e. other picture frames) makes this type of perceptual judgment less accurate. However, the specific effect distracters have on judgments is controversial. Here, we conducted a series of experiments to compare two alternative theories of distracter influence: one in which distracters compete with targets to determine choices (independent model) and one in which distracters wield a more indirect influence on choices (interaction model). We found evidence for the latter account. Our results suggest distracters affect perceptual decisions by adjusting how sensitive decisions are to the target stimulus.
Affiliation(s)
- Tsvetomira Dumbalska: Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom
- Katarzyna Rudzka: Division of Biosciences, University College London, London, United Kingdom; Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- Hannah E. Smithson: Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom
11
Drascher ML, Kuhl BA. Long-term memory interference is resolved via repulsion and precision along diagnostic memory dimensions. Psychon Bull Rev 2022; 29:1898-1912. [PMID: 35380409] [PMCID: PMC9568473] [DOI: 10.3758/s13423-022-02082-4]
Abstract
When memories share similar features, this can lead to interference, and ultimately forgetting. With experience, however, interference can be resolved. This raises the important question of how memories change, with experience, to minimize interference. Intuitively, interference might be minimized by increasing the precision and accuracy of memories. However, recent evidence suggests a potentially adaptive role for memory distortions. Namely, similarity can trigger exaggerations of subtle differences between memories (repulsion). Here, we tested whether repulsion specifically occurs on feature dimensions along which memories compete and whether repulsion is predictive of reduced memory interference. To test these ideas, we developed synthetic faces in a two-dimensional face space (affect and gender). This allowed us to precisely manipulate similarity between faces and the feature dimension along which faces differed. In three experiments, participants learned to associate faces with unique cue words. Associative memory tests confirmed that when faces were similar (face pairmates), this produced interference. Using a continuous face reconstruction task, we found two changes in face memory that preferentially occurred along the feature dimension that was "diagnostic" of the difference between face pairmates: (1) there was a bias to remember pairmates with exaggerated differences (repulsion) and (2) there was an increase in the precision of feature memory. Critically, repulsion and precision were each associated with reduced associative memory interference, but these were statistically dissociable contributions. Collectively, our findings reveal that similarity between memories triggers dissociable, experience-dependent changes that serve an adaptive role in reducing interference.
Affiliation(s)
- Brice A Kuhl: Department of Psychology, University of Oregon, Eugene, OR, USA; Institute of Neuroscience, University of Oregon, Eugene, OR, USA
12
Hansmann-Roth S, Þorsteinsdóttir S, Geng JJ, Kristjánsson Á. Temporal integration of feature probability distributions. Psychological Research 2022; 86:2030-2044. [PMID: 34997327] [DOI: 10.1007/s00426-021-01621-3]
Abstract
Humans are surprisingly good at learning the statistical characteristics of their visual environment. Recent studies have revealed that the visual system can learn not only repeated features of visual search distractors but also their actual probability distributions: search times were determined by the frequency of distractor features over consecutive search trials. The search displays used in these studies involved many distractor exemplars on each trial, and while there is clear evidence that feature distributions can be learned from large distractor sets, it is less clear whether distributions are learned as well for single targets presented on each trial. Here, we investigated potential learning of probability distributions of single targets during visual search. Over blocks of trials, observers searched for an oddly colored target that was drawn from either a Gaussian or a uniform distribution. Search times for the different target colors were clearly influenced by the probability of that feature within trial blocks. The same search targets, coming from the extremes of the two distributions, were found significantly more slowly during blocks where targets were drawn from a Gaussian distribution than from a uniform distribution, indicating that observers were sensitive to the target probability determined by the distribution shape. In Experiment 2, we replicated the effect using binned distributions and revealed the limitations of encoding complex target distributions. Our results demonstrate detailed internal representations of target feature distributions and show that the visual system integrates probability distributions of target colors over surprisingly long trial sequences.
Affiliation(s)
- Sabrina Hansmann-Roth: Icelandic Vision Lab, School of Health Sciences, University of Iceland, Reykjavík, Iceland; Université de Lille, CNRS, UMR 9193-SCALab-Sciences Cognitives et Sciences Affectives, 59000, Lille, France
- Sóley Þorsteinsdóttir: Icelandic Vision Lab, School of Health Sciences, University of Iceland, Reykjavík, Iceland
- Joy J Geng: Center for Mind and Brain, University of California Davis, Davis, CA, USA; Department of Psychology, University of California Davis, Davis, CA, USA
- Árni Kristjánsson: Icelandic Vision Lab, School of Health Sciences, University of Iceland, Reykjavík, Iceland; School of Psychology, National Research University Higher School of Economics, Moscow, Russia
13
Dube B, Pidaparthi L, Golomb JD. Visual Distraction Disrupts Category-tuned Attentional Filters in Ventral Visual Cortex. J Cogn Neurosci 2022; 34:1521-1533. [PMID: 35579979] [DOI: 10.1162/jocn_a_01870]
Abstract
Our behavioral goals shape how we process information via attentional filters that prioritize goal-relevant information, dictating both where we attend and what we attend to. When something unexpected or salient appears in the environment, it captures our spatial attention. Extensive research has focused on the spatiotemporal aspects of attentional capture, but what happens to concurrent nonspatial filters during visual distraction? Here, we demonstrate a novel, broader consequence of distraction: widespread disruption to filters that regulate category-specific object processing. We recorded fMRI while participants viewed arrays of face/house hybrid images. On distractor-absent trials, we found robust evidence for the standard signature of category-tuned attentional filtering: greater BOLD activation in fusiform face area during attend-faces blocks and in parahippocampal place area during attend-houses blocks. However, on trials where a salient distractor (white rectangle) flashed abruptly around a nontarget location, not only was spatial attention captured, but the concurrent category-tuned attentional filter was disrupted, revealing a boost in activation for the to-be-ignored category. This disruption was robust, resulting in errant processing-and early on, prioritization-of goal-inconsistent information. These findings provide a direct test of the filter disruption theory: that in addition to disrupting spatial attention, distraction also disrupts nonspatial attentional filters tuned to goal-relevant information. Moreover, these results reveal that, under certain circumstances, the filter disruption may be so profound as to induce a full reversal of the attentional control settings, which carries novel implications for both theory and real-world perception.
|
14
|
Abstract
Many models of attention assume that attentional selection takes place at a specific moment in time that demarcates the critical transition from pre-attentive to attentive processing of sensory input. We argue that this intuitively appealing standard account of attentional selectivity is not only inaccurate, but has led to substantial conceptual confusion. As an alternative, we offer a 'diachronic' framework that describes attentional selectivity as a process that unfolds over time. Key to this view is the concept of attentional episodes, brief periods of intense attentional amplification of sensory representations that regulate access to working memory and response-related processes. We describe how attentional episodes are linked to earlier attentional mechanisms and to recurrent processing at the neural level. We review studies that establish the existence of attentional episodes, delineate the factors that determine if and when they are triggered, and discuss the costs associated with processing multiple events within a single episode. Finally, we argue that this framework offers new solutions to old problems in attention research that have never been resolved. It can provide a unified and conceptually coherent account of the network of cognitive and neural processes that produce the goal-directed selectivity in perceptual processing that is commonly referred to as 'attention'.
|
15
|
Perceptual distraction causes visual memory encoding intrusions. Psychon Bull Rev 2021; 28:1592-1600. [PMID: 34027621 DOI: 10.3758/s13423-021-01937-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 04/13/2021] [Indexed: 11/08/2022]
Abstract
Given the complexity of our visual environments, a number of mechanisms help us prioritize goal-consistent visual information. When searching for a friend in a crowd, for instance, visual working memory (VWM) maintains a representation of your target (i.e., your friend's shirt) so that attention can be subsequently guided toward target-matching features. In turn, attentional filters gate access to VWM to ensure that only the most relevant information is encoded and used to guide behavior. Distracting (i.e., unexpected/salient) information, however, can also capture your attention, disrupting search. In the current study we ask: does distraction also disrupt control over the VWM filter? Although the effect of distraction on search behavior is heavily studied, we know little about its consequences for VWM. Participants performed two consecutive visual search tasks on each trial. Stimulus color was irrelevant for both search tasks, but on trials where a salient distractor appeared on Search 1, we found evidence that the color associated with this distractor was incidentally encoded into VWM, resulting in memory-driven capture on Search 2. In two different experiments we observed slower responses on Search 2 when a non-target item matched the color of the salient distractor from Search 1; this effect was specific to the color associated with salient distraction and not induced by other non-target colors from the Search 1 display. We propose a novel Filter Disruption Theory: distraction disrupts the attentional filter that controls access to VWM, resulting in the encoding of irrelevant inputs at the time of capture.
|
16
|
Scotti PS, Hong Y, Leber AB, Golomb JD. Visual working memory items drift apart due to active, not passive, maintenance. J Exp Psychol Gen 2021; 150:2506-2524. [PMID: 34014755 DOI: 10.1037/xge0000890] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
Abstract
How are humans capable of maintaining detailed representations of visual items in memory? When required to make fine discriminations, we sometimes implicitly differentiate memory representations away from each other to reduce interitem confusion. However, this separation of representations can inadvertently lead memories to be recalled as biased away from other memory items, a phenomenon termed repulsion bias. Using a nonretinotopically specific working memory paradigm, we found stronger repulsion bias with longer working memory delays, but only when items were actively maintained. These results suggest that (a) repulsion bias can reflect a mnemonic phenomenon, distinct from perceptually driven observations of repulsion bias; and (b) mnemonic repulsion bias is ongoing during maintenance and dependent on attention to internally maintained memory items. These results support theories of working memory where items are represented interdependently and further reveal contexts where stronger attention to working memory items during maintenance increases repulsion bias between them. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
|
17
|
Statistical learning as a reference point for memory distortions: Swap and shift errors. Atten Percept Psychophys 2021; 83:1652-1672. [PMID: 33462770 DOI: 10.3758/s13414-020-02236-3] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 12/23/2020] [Indexed: 11/08/2022]
Abstract
Humans use regularities in the environment to facilitate learning, often without awareness or intent. How might such regularities distort long-term memory? Here, participants studied and reported the colors of objects in a long-term memory paradigm, uninformed that certain colors were sampled more frequently overall. When participants misreported an object's color, these errors were often centered around the average studied color (i.e., "Rich" color), demonstrating swap errors in long-term memory due to imposed statistical regularities. We observed such swap errors regardless of memory load, explicit knowledge, or the distance in color space between the correct color of the tested object and the Rich color. An explicit guessing strategy where participants intentionally made swap errors when uncertain could not fully account for our results. We discuss other potential sources of observed swap errors such as false memory and implicit biased guessing. Although less robust than swap errors, evidence was also observed for subtle shift errors towards or away from the Rich color dependent on the color distance between the correct color and the Rich color. Together, these findings of swap and shift errors provide converging evidence for memory distortion mechanisms induced by a reference point, bridging a gap in the literature between how attention to regularities similarly influences visual working memory and visual long-term memory.
|
18
|
Dowd EW, Nag S, Golomb JD. Working memory-driven attention towards a distractor does not interfere with target feature perception. VISUAL COGNITION 2019; 27:714-731. [PMID: 33013176 DOI: 10.1080/13506285.2019.1659895] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
The contents of working memory (WM) can influence where we attend, but can they also interfere with what we see? Active maintenance of visual items in WM biases attention towards WM-matching objects, and also enhances early perceptual processing of WM-matching items (e.g., more accurate perceptual discrimination). Here, we asked whether a WM-matching distractor interferes with perceptual processing of a target's features. In a dual-task paradigm, participants maintained a shape in WM across an intervening visual search task, during which they had to reproduce the colour of a designated target item using a continuous-report technique. Importantly, the WM shape could match the target item, a distractor item, or no item in the search array. When the WM shape matched a distractor, we found no evidence of systematic perceptual interference (i.e., swapping or mixing with the distractor colour), but observed only general disruptions in target processing (i.e., decreased target accuracy). These results suggest that when visual attention is inadvertently drawn to a WM-matching distractor, any resultant automatic perceptual processing may be too transient or weak to significantly interfere with perceptual processing of the target's features.
Affiliation(s)
- Emma Wu Dowd, Department of Psychology, The Ohio State University
- Samoni Nag, Department of Psychology, The Ohio State University
|