26. Horstmann G, Becker SI. More efficient visual search for happy faces may not indicate guidance, but rather faster distractor rejection: Evidence from eye movements and fixations. Emotion 2019; 20:206-216. PMID: 30730168. DOI: 10.1037/emo0000536.
Abstract
The visual search paradigm has been used in emotion research to examine the relation between facial expressions of emotion and attention. Better performance in a search for one facial expression category (e.g., a happy face) compared to a second category (e.g., an angry face) has often been interpreted as indicating better guidance of attention. Better guidance of attention in turn implies that some aspect of the facial expression can be used preattentively, that is, while focused attention is directed elsewhere in the visual field. This view has been criticized because better performance may also reflect better distractor rejection, independently of guidance. The present study uses eye tracking to disentangle the two variables. The results show better search performance with a happy than with an angry face as the target. Facial emotion also influenced the time the eyes fixated a stimulus (dwelling), but not guidance-related variables of search performance. A linear regression moreover showed that dwelling accounted for a large amount of variance in the overall search times. Overall, the results present clear-cut evidence that differential search performance need not indicate differential guidance, but may instead be explained by postselective factors that influence dwelling on stimuli. The broader implication of this demonstration is that results from the visual search paradigm have to be interpreted with caution, and that better search performance cannot be directly interpreted as an indicator of preattentive guidance of attention.
27. Cornish L, Hill A, Horswill MS, Becker SI, Watson MO. Eye-tracking reveals how observation chart design features affect the detection of patient deterioration: An experimental study. Appl Ergon 2019; 75:230-242. PMID: 30509531. DOI: 10.1016/j.apergo.2018.10.005.
Abstract
Particular design features intended to improve usability - including graphically displayed observations and integrated colour-based scoring systems - have been shown to increase the speed and accuracy with which users of hospital observation charts detect abnormal patient observations. We used eye tracking to evaluate two potential cognitive mechanisms underlying these effects. Novice chart-users completed a series of experimental trials in which they viewed patient data presented on one of three observation chart designs (varied within-subjects) and indicated which observation was abnormal (or that none were). A chart that incorporated both graphically displayed observations and an integrated colour-based scoring system yielded faster, more accurate responses and fewer, shorter fixations than a graphical chart without a colour-based scoring system. The latter, in turn, yielded the same advantages over a tabular chart (which incorporated neither design feature). These results suggest that both colour-based scoring systems and graphically displayed observations improve search efficiency and reduce the cognitive resources required to process vital sign data.
28. Martin A, Becker SI. How feature relationships influence attention and awareness: Evidence from eye movements and EEG. J Exp Psychol Hum Percept Perform 2018; 44:1865-1883. PMID: 30211593. DOI: 10.1037/xhp0000574.
Abstract
Many everyday tasks require selecting relevant objects in the visual field while ignoring irrelevant information. A widely held belief is that attention is tuned to the exact feature value(s) of a sought-after target object (e.g., its color or shape). In contrast, subsequent studies have shown that attentional orienting (capture) is often determined by the features that the target has relative to the irrelevant items surrounding it (e.g., redder, larger). However, it is unknown whether conscious awareness is also determined by relative features. Alternatively, awareness could be more strongly determined by exact feature values, which seem to determine dwelling on objects. The present study examined eye movements in a color search task with different types of irrelevant distractors to test (a) whether dwelling is more strongly influenced by exact feature matches than by relative matches, and (b) which of the two processes (capture vs. dwelling) is more important for conscious awareness of the distractor. A second experiment used an electrophysiological marker of attention (the N2pc in the electroencephalogram of participants) to test whether the results generalize to covert attention shifts. As expected, the results revealed that the initial capture of attention was strongest for distractors matching the relative color of the target, whereas similarity to the target was the most important determinant of dwelling. Awareness was more strongly determined by the initial capture of attention than by dwelling. These results provide important insights into the interplay of attention and awareness and highlight the importance of considering relative, context-dependent features in theories of awareness.
29. Becker SI. Reply to Theeuwes: Fast feature-based top-down effects, but saliency may be slow. J Cogn 2018; 1:28. PMID: 31517201. PMCID: PMC6634462. DOI: 10.5334/joc.23.
30. Enns JT, Becker SI, Brockmole J, Castelhano M, Creem-Regehr S, Gray R, Hecht H, Juhasz B, Philbeck J, Woodman G. Linking contemporary research to the classics: Celebrating 125 years at APA. J Exp Psychol Hum Percept Perform 2018; 43:1695-1700. PMID: 28967778. DOI: 10.1037/xhp0000473.
Abstract
APA is celebrating 125 years this year, and at the journal we are commemorating this milestone with a special issue. The inspiration came from our editorial team, who wished to acknowledge the links between game-changing articles that have influenced our research community in the past (we call them classics for short) and contemporary works. The main idea was to feature the work of nine contemporary research teams, while at the same time drawing readers' attention to their links with the classics. In this introduction, we have organized the articles according to several broad themes: active perception, perception for action, action alters perception, perception of our bodies in action, and acting on selective perceptions. As all who have read and contributed to the journal over the past few years have come to realize, it is no longer possible to study perception without considering its role in action. Nor is it possible to study action (formerly called performance, as reflected in the journal title) without understanding the perceptual contributions to action. These nine articles each exemplify, in their own way, how these dynamic interactions play out in contemporary research in our field.
31. Becker SI, Harris AM, York A, Choi J. Conjunction search is relational: Behavioral and electrophysiological evidence. J Exp Psychol Hum Percept Perform 2018; 43:1828-1842. PMID: 28967786. DOI: 10.1037/xhp0000371.
Abstract
Attention selects behaviorally relevant stimuli for further capacity-limited processing and gates their access to awareness. Given the importance of attention for conscious perception, it is important to determine the factors and mechanisms that drive attention. A widespread view is that attention is biased to the specific feature values of a conjunction target (e.g., vertical, red, medium). By contrast, the results of the present study show that attention is tuned to the two relative features that distinguish a conjunction target from the irrelevant nontargets (e.g., larger and bluer). Moreover, an irrelevant conjunction cue that is briefly presented prior to the target can automatically attract attention, even in the absence of any feature contrasts. Importantly, automatic orienting to the conjunction cue was completely independent of the physical similarity between cue and target, and depended only on whether the conjunction cue matched the relative features of the target. These results demonstrate that attentional orienting is determined by a mechanism that can rapidly extract information about feature relationships and guide attention to the stimulus that best matches the relative attributes of the target. These results are difficult to reconcile with extant feature-specific or object-based accounts of attention and argue for a relational account of conjunction search.
32. Becker SI, Dutt N, Vromen JMG, Horstmann G. The capture of attention and gaze in the search for emotional photographic faces. Visual Cognition 2017. DOI: 10.1080/13506285.2017.1333182.
33. Schönhammer JG, Becker SI, Kerzel D. Which kind of attention is captured by cues with the relative target colour? Visual Cognition 2017. DOI: 10.1080/13506285.2017.1323811.
34. Horstmann G, Herwig A, Becker SI. Distractor dwelling, skipping, and revisiting determine target absent performance in difficult visual search. Front Psychol 2016; 7:1152. PMID: 27574510. PMCID: PMC4983613. DOI: 10.3389/fpsyg.2016.01152.
Abstract
Some targets in visual search are more difficult to find than others. In particular, a target that is similar to the distractors is more difficult to find than a target that is dissimilar to the distractors. Efficiency differences between easy and difficult searches are manifest not only in target-present trials but also in target-absent trials. In fact, even physically identical displays are searched through with different efficiency depending on the searched-for target. Here, we monitored eye movements in search for a target similar to the distractors (difficult search) versus a target dissimilar to the distractors (easy search). We aimed to examine three hypotheses concerning the causes of differential search efficiency in target-absent trials: (a) distractor dwelling, (b) distractor skipping, and (c) distractor revisiting. Reaction times increased with target-distractor similarity, which is consistent with existing theories and replicates earlier results. Eye movement data indicated guidance in target-present trials, even though search was very slow. Dwelling, skipping, and revisiting all contributed to low search efficiency in difficult search, with dwelling being the strongest factor. It is argued that differences in dwell time account for a large part of the differences in total search time.
35. Retell JD, Becker SI, Remington RW. An effective attentional set for a specific colour does not prevent capture by infrequently presented motion distractors. Q J Exp Psychol (Hove) 2016; 69:1340-1365. DOI: 10.1080/17470218.2015.1080738.
Abstract
An organism's survival depends on the ability to rapidly orient attention to unanticipated events in the world. Yet, the conditions needed to elicit such involuntary capture remain in doubt. Especially puzzling are spatial cueing experiments, which have consistently shown that involuntary shifts of attention to highly salient distractors are not determined by stimulus properties, but instead are contingent on attentional control settings induced by task demands. Do we always need to be set for an event to be captured by it, or is there a class of events that draws attention involuntarily even when unconnected to task goals? Recent results show that a task-irrelevant event captures attention on its first presentation, suggesting that salient stimuli that violate contextual expectations might automatically capture attention. Here, we investigated the role of contextual expectation by examining whether an irrelevant motion cue that was presented only rarely (∼3–6% of trials) would capture attention when observers had an active set for a specific target colour. The motion cue had no effect when presented frequently, but when rare it produced a pattern of interference consistent with attentional capture. The critical dependence on the frequency with which the irrelevant motion singleton was presented is consistent with early theories of involuntary orienting to novel stimuli. We suggest that attention will be captured by salient stimuli that violate expectations, whereas top-down goals appear to modulate capture by stimuli that broadly conform to contextual expectations.
36. Schönhammer JG, Grubert A, Kerzel D, Becker SI. Attentional guidance by relative features: Behavioral and electrophysiological evidence. Psychophysiology 2016; 53:1074-1083. PMID: 26990008. DOI: 10.1111/psyp.12645.
Abstract
Our ability to select task-relevant information from cluttered visual environments is widely believed to reflect our ability to tune attention to the particular elementary feature values of a sought-after target (e.g., red, orange, yellow). By contrast, recent findings have shown that attention is often tuned to feature relationships, that is, to the features that the target has relative to irrelevant features in the context (e.g., redder, yellower). However, the evidence for such a relational account has so far been exclusively based on behavioral measures, which do not allow safe inferences about early perceptual processes. The present study provides a critical test of the relational account by measuring an electrophysiological marker of attention in the EEG of participants (the N2pc) in response to briefly presented distractors (cues) that could match either the physical features of the target or its relative features. In a first experiment, the target color and nontarget color were kept constant across trials. In line with a relational account, we found that only cues with the same relative color as the target were attended, regardless of whether the cues had the same physical color as the target. In a second experiment, we demonstrate that attention is biased to the exact target feature value when the target is embedded in a randomly varying context. Taken together, these results provide the first electrophysiological evidence that attention can modulate early perceptual processes in either a context-dependent or a context-independent manner, resulting in marked differences in the range of colors that can attract attention.
37. Savage RA, Becker SI, Lipp OV. Visual search for emotional expressions: Effect of stimulus set on anger and happiness superiority. Cogn Emot 2015; 30:713-730. PMID: 25861807. DOI: 10.1080/02699931.2015.1027663.
Abstract
Prior reports of preferential detection of emotional expressions in visual search have yielded inconsistent results, even for face stimuli that avoid obvious expression-related perceptual confounds. The current study investigated inconsistent reports of anger and happiness superiority effects using face stimuli drawn from the same database. Experiment 1 excluded procedural differences as a potential factor, replicating a happiness superiority effect in a procedure that had previously yielded an anger superiority effect. Experiments 2a and 2b confirmed that neither image colour nor poser gender accounted for the prior inconsistent findings. Experiments 3a and 3b identified the stimulus set as the critical variable, revealing happiness or anger superiority effects for two partially overlapping sets of face stimuli. The current results highlight the critical role of stimulus selection in the observation of happiness or anger superiority effects in visual search, even for face stimuli that avoid obvious expression-related perceptual confounds and are drawn from a single database.
38. Becker SI, Lewis AJ. Oculomotor capture by irrelevant onsets with and without color contrast. Ann N Y Acad Sci 2015; 1339:60-71. PMID: 25708201. DOI: 10.1111/nyas.12685.
Abstract
It is widely known that irrelevant onsets (i.e., items appearing in previously empty locations) can automatically capture attention and attract our gaze. Some studies have shown that onset capture is stronger when the onset distractor matches the target feature, indicating that onset capture can be modulated by feature-based (top-down) tuning to the target. However, it is less clear whether and to what extent the perceptual saliency of the distractor can further modulate this effect. This study examined the effects of target similarity, competition between target and distractor, and bottom-up color contrast on the ability of an onset distractor to capture the gaze, by varying the color (contrast) and stimulus-onset asynchrony of the onset distractor. The results clearly show that competition and feature-based attention modulate capture by the irrelevant onset to a large extent, whereas bottom-up color contrast does not modulate onset capture. These results indicate the need to revise current accounts of gaze control.
39. Becker SI, Grubert A, Dux PE. Distinct neural networks for target feature versus dimension changes in visual search, as revealed by EEG and fMRI. Neuroimage 2014; 102 Pt 2:798-808. DOI: 10.1016/j.neuroimage.2014.08.058.
40. Schneider D, Slaughter VP, Becker SI, Dux PE. Implicit false-belief processing in the human brain. Neuroimage 2014; 101:268-275. PMID: 25042446. DOI: 10.1016/j.neuroimage.2014.07.014.
41. Venini D, Remington RW, Horstmann G, Becker SI. Centre-of-gravity fixations in visual search: When looking at nothing helps to find something. J Ophthalmol 2014; 2014:237812. PMID: 25002972. PMCID: PMC4065739. DOI: 10.1155/2014/237812.
Abstract
In visual search, some fixations are made between stimuli, on empty regions; these are commonly referred to as "centre-of-gravity" fixations (henceforth: COG fixations). Previous studies have shown that observers with task expertise make more COG fixations than novices. This led to the view that COG fixations reflect simultaneous encoding of multiple stimuli, allowing more efficient processing of task-related items. The present study tested whether COG fixations also aid performance in visual search tasks with unfamiliar and abstract stimuli. Moreover, to provide evidence for the multiple-item processing view, we analysed the effects of COG fixations on the number and dwell times of stimulus fixations. The results showed that (1) search efficiency increased with increasing COG fixations, even in search for unfamiliar stimuli and in the absence of special higher-order skills, (2) COG fixations reliably reduced the number of stimulus fixations and their dwell times, indicating processing of multiple distractors, and (3) the proportion of COG fixations was dynamically adapted to the potential information gain of COG locations. A second experiment showed that COG fixations are diminished when stimulus positions vary unpredictably across trials. Together, the results support the multiple-item processing view, which has important implications for current theories of visual search.
42. Craig BM, Becker SI, Lipp OV. Different faces in the crowd: A happiness superiority effect for schematic faces in heterogeneous backgrounds. Emotion 2014; 14:794-803. PMID: 24821397. DOI: 10.1037/a0036043.
Abstract
Recently, D. V. Becker, Anderson, Mortensen, Neufeld, and Neel (2011) proposed recommendations to avoid methodological confounds in visual search studies using emotional photographic faces. These confounds were argued to cause the frequently observed anger superiority effect (ASE), the faster detection of angry than happy expressions, and to conceal a true happiness superiority effect (HSE). In Experiment 1, we applied these recommendations (for the first time) to visual search among schematic faces, which had previously and consistently yielded a robust ASE. Contrary to the prevailing literature, but consistent with D. V. Becker et al. (2011), we observed an HSE with schematic faces. The HSE with schematic faces was replicated in Experiments 2 and 3 using a similar method in discrimination tasks rather than fixed-target searches. Experiment 4 isolated background heterogeneity as the key determinant leading to the HSE.
43. Becker SI, Valuch C, Ansorge U. Color priming in pop-out search depends on the relative color of the target. Front Psychol 2014; 5:289. PMID: 24782795. PMCID: PMC3986547. DOI: 10.3389/fpsyg.2014.00289.
Abstract
In visual search for pop-out targets, search times are shorter when the target and non-target colors from the previous trial are repeated than when they change. This priming effect was originally attributed to a feature weighting mechanism that biases attention toward the target features, and away from the non-target features. However, more recent studies have shown that visual selection is strongly context-dependent: according to a relational account of feature priming, the target color is always encoded relative to the non-target color (e.g., as redder or greener). The present study provides a critical test of this hypothesis, by varying the colors of the search items such that either the relative color or the absolute color of the target always remained constant (or both). The results clearly show that color priming depends on the relative color of a target with respect to the non-targets but not on its absolute color value. Moreover, the observed priming effects did not change over the course of the experiment, suggesting that the visual system encodes colors in a relative manner from the start of the experiment. Taken together, these results strongly support a relational account of feature priming in visual search, and are inconsistent with the dominant feature-based views.
44. Becker SI, Harris AM, Venini D, Retell JD. Visual search for color and shape: When is the gaze guided by feature relationships, when by feature values? J Exp Psychol Hum Percept Perform 2014; 40:264-291. DOI: 10.1037/a0033489.
45. Barutchu A, Becker SI, Carter O, Hester R, Levy NL. The role of task-related learned representations in explaining asymmetries in task switching. PLoS One 2013; 8:e61729. PMID: 23613919. PMCID: PMC3628671. DOI: 10.1371/journal.pone.0061729.
Abstract
Task switch costs often show an asymmetry, with switch costs being larger when switching from a difficult task to an easier task. This asymmetry has been explained by difficult tasks being represented more strongly and consequently requiring more inhibition prior to switching to the easier task. The present study shows that switch cost asymmetries observed in arithmetic tasks (addition vs. subtraction) do not depend on task difficulty: Switch costs of similar magnitudes were obtained when participants were presented with unsolvable pseudo-equations that did not differ in task difficulty. Further experiments showed that neither task switch costs nor switch cost asymmetries were due to perceptual factors (e.g., perceptual priming effects). These findings suggest that asymmetrical switch costs can be brought about by the association of some tasks with greater difficulty than others. Moreover, the finding that asymmetrical switch costs were observed (1) in the absence of a task switch proper and (2) without differences in task difficulty, suggests that present theories of task switch costs and switch cost asymmetries are in important ways incomplete and need to be modified.
46. Becker SI, Folk CL, Remington RW. Attentional capture does not depend on feature similarity, but on target-nontarget relations. Psychol Sci 2013; 24:634-647. PMID: 23558547. DOI: 10.1177/0956797612458528.
Abstract
What factors determine which stimuli of a scene will be visually selected and become available for conscious perception? The currently prevalent view is that attention operates on specific feature values, so attention will be drawn to stimuli that have features similar to those of the sought-after target. Here, we show that, instead, attentional capture depends on whether a distractor's feature relationships match the target-nontarget relations (e.g., redder). In three spatial-cuing experiments, we found that (a) a cue with the target color (e.g., orange) can fail to capture attention when the relations between the cue and the cue context do not match the target-nontarget relations (e.g., redder target vs. yellower cue), whereas (b) a cue with the nontarget color can capture attention when its relations match the target-nontarget relations (e.g., both are redder). These results support a relational account in which attention is biased toward feature relationships instead of particular feature values, and show that attentional capture by an irrelevant distractor does not depend on feature similarity, but rather on whether the distractor matches or mismatches the target's relative attributes (e.g., its relative color).
47. Becker SI, Ansorge U. Higher set sizes in pop-out search displays do not eliminate priming or enhance target selection. Vision Res 2013; 81:18-28. DOI: 10.1016/j.visres.2013.01.009.
48. Savage RA, Lipp OV, Craig BM, Becker SI, Horstmann G. In search of the emotional face: Anger versus happiness superiority in visual search. Emotion 2013; 13:758-768. PMID: 23527503. DOI: 10.1037/a0031970.
Abstract
Previous research has provided inconsistent results regarding visual search for emotional faces, yielding evidence for either anger superiority (i.e., more efficient search for angry faces) or happiness superiority effects (i.e., more efficient search for happy faces), suggesting that these results may reflect not emotional expression itself, but emotion-unrelated low-level perceptual features. The present study investigated possible factors mediating anger/happiness superiority effects: search strategy (fixed vs. variable target search; Experiment 1), stimulus choice (Nimstim database vs. Ekman and Friesen database; Experiments 1 and 2), and emotional intensity (Experiments 3 and 3a). Angry faces were found faster than happy faces regardless of search strategy when using faces from the Nimstim database (Experiment 1). By contrast, a happiness superiority effect was evident in Experiment 2 when using faces from the Ekman and Friesen database. Experiment 3 employed angry, happy, and exuberant expressions (Nimstim database) and yielded anger and happiness superiority effects, respectively, highlighting the importance of the choice of stimulus materials. Ratings of the stimulus materials collected in Experiment 3a indicate that differences in perceived emotional intensity, pleasantness, or arousal do not account for the differences in search efficiency. Across three studies, the current investigation indicates that prior reports of anger or happiness superiority effects in visual search are likely to reflect low-level visual features associated with the stimulus materials used, rather than emotion.
49. Valuch C, Becker SI, Ansorge U. Priming of fixations during recognition of natural scenes. J Vis 2013; 13(3):3. PMID: 23444392. DOI: 10.1167/13.3.3.
Abstract
Eye fixations allow the human viewer to perceive scene content with high acuity. If fixations drive visual memory for scenes, a viewer might repeat his/her previous fixation pattern during recognition of a familiar scene. However, visual salience alone could account for similarities between two successive fixation patterns by attracting the eyes in a stimulus-driven, task-independent manner. In the present study, we tested whether the viewer's aim to recognize a scene fosters fixations on scene content that repeats from learning to recognition, over and above the influence of visual salience alone. In Experiment 1, we compared gaze behavior in a recognition task to that in a free-viewing task. By showing the same stimuli in both tasks, the task-independent influence of salience was held constant. We found that during a recognition task, but not during (repeated) free viewing, viewers showed a pronounced preference for previously fixated scene content. In Experiment 2, we tested whether participants remembered visual input that they had fixated during learning better than salient but nonfixated visual input. To that end, we presented participants with smaller cutouts from learned and new scenes. We found that cutouts featuring scene content fixated during encoding were recognized better and faster than cutouts featuring nonfixated but highly salient scene content from learned scenes. Both experiments support the hypothesis that fixations during encoding, and perhaps during recognition, serve visual memory over and above a stimulus-driven influence of visual salience.
50. Schneider D, Bayliss AP, Becker SI, Dux PE. Eye movements reveal sustained implicit processing of others' mental states. J Exp Psychol Gen 2012; 141:433-438. DOI: 10.1037/a0025458.