1
McManus R, Thomas LE. Action does not drive visual biases in peri-tool space. Atten Percept Psychophys 2024; 86:525-535. [PMID: 38127254] [DOI: 10.3758/s13414-023-02826-x]
Abstract
Observers experience visual biases in the area around handheld tools. These biases may occur when active use leads an observer to incorporate a tool into the body schema. However, the visual salience of a handheld tool may instead create an attentional prioritization that is not reliant on body-based representations. We investigated these competing explanations of near-tool visual biases in two experiments during which participants performed a target detection task. Targets could appear near or far from a tool positioned next to a display. In Experiment 1, participants showed facilitation in detecting targets that appeared near a simple handheld rake tool regardless of whether they first used the rake to retrieve objects, but participants who only viewed the tool without holding it were no faster to detect targets appearing near the rake than targets that appeared on the opposite side of the display. In a second experiment, participants who held a novel magnetic tool again showed a near-tool bias even when they refrained from using the tool. Taken together, these results suggest active use is unnecessary, but visual salience is not sufficient, to introduce visual biases in peri-tool space.
Affiliation(s)
- Robert McManus
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Laura E Thomas
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
2
Nakayama K, Moher J, Song JH. Rethinking Vision and Action. Annu Rev Psychol 2023; 74:59-86. [PMID: 36652303] [DOI: 10.1146/annurev-psych-021422-043229]
Abstract
Action is an important arbiter of whether an individual or a species will survive. Yet action has not been well integrated into the study of psychology; action, or motor behavior, remains a field apart, a reflection of traditional science and its need for specialization. The sequence in a typical laboratory experiment of see → decide → act provides the rationale for these broad disciplinary categorizations. With renewed interest in action itself, surprising and exciting anomalous findings at odds with this simplified caricature have emerged. They reveal a much more intimate coupling of vision and action, which we describe. In turn, this prompts us to identify and dwell on three pertinent theories deserving of greater notice.
Affiliation(s)
- Ken Nakayama
- Department of Psychology, University of California, Berkeley, California, USA
- Jeff Moher
- Department of Psychology, Connecticut College, New London, Connecticut, USA
- Joo-Hyun Song
- Department of Cognitive, Linguistic, and Psychological Sciences, Brown University, Providence, Rhode Island, USA
3
Abstract
Previous studies have shown that after actively using a handheld tool for a period of time, participants show visual biases toward stimuli presented near the end of the tool. Research suggests this is driven by an incorporation of the tool into the observer's body schema, extending peripersonal space to surround the tool. This study aims to investigate whether the same visual biases might be seen near remotely operated tools. Participants used one of four tools (a handheld rake in Experiment 1, a remote-controlled drone in Experiment 2, a remote-controlled excavator in Experiment 3, or a handheld excavator in Experiment 4) to rake sand for several minutes, then performed a target-detection task in which they made speeded responses to targets appearing near or far from the tool. In Experiment 1, participants detected targets appearing near the rake significantly faster than targets appearing far from the rake, replicating previous findings. We failed to find strong evidence of improved target detection near remotely operated tools in Experiments 2 and 3, but found clear evidence of near-tool facilitation in Experiment 4 when participants physically picked up the excavator and used it as a handheld tool. These results suggest that observers may not incorporate remotely operated tools into the body schema in the same manner as handheld tools. We discuss potential mechanisms that may drive these differences in embodiment between handheld and remote-controlled tools.
4
Rodin has it! The role of hands in improving the selectivity of attention. Acta Psychol (Amst) 2020; 210:103160. [PMID: 32823058] [DOI: 10.1016/j.actpsy.2020.103160]
Abstract
We report a new discovery on the role of hands in guiding attention, using the classic Stroop effect as our assay. We show that the Stroop effect diminishes, and hence selective attention improves, when observers hold their chin, emulating Rodin's famous sculpture, "The Thinker." In two experiments we show that the Rodin posture improves the selectivity of attention as efficiently as holding the hands near the visual stimulus (the near-hands effect). Because spatial proximity to the displayed stimulus is neither present nor intended, the presence of the Rodin effect implies that attentional prioritization by the hands is not limited to the space between the hands.
5
Agauas SJ, Jacoby M, Thomas LE. Near-hand effects are robust: Three OSF pre-registered replications of visual biases in perihand space. Visual Cognition 2020. [DOI: 10.1080/13506285.2020.1751763]
Affiliation(s)
- Stephen J. Agauas
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Morgan Jacoby
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Laura E. Thomas
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
6
Developing a Checklist for Cognitive Characteristics of Driving Scenarios in Dual-Task Studies: The Case of Cell Phone Use While Driving. Health Scope 2019. [DOI: 10.5812/jhealthscope.86836]
7
Abstract
Perception and action interact in nearly every moment of daily life. Previous studies have demonstrated not only that perceptual input shapes action but also that various factors associated with action-including individual abilities and biomechanical costs-influence perceptual decisions. However, it is unknown how action fluency affects the sensitivity of early-stage visual perception, such as orientation. To address this question, we used a dual-task paradigm: Participants prepared an action (e.g., grasping), while concurrently performing an orientation-change-detection task. We demonstrated that as actions became more fluent (e.g., as grasping errors decreased), perceptual-discrimination performance also improved. Importantly, we found that grasping training prior to discrimination enhanced subsequent perceptual sensitivity, supporting the notion of a reciprocal relation between perception and action.
Affiliation(s)
- Jianfei Guo
- Department of Cognitive, Linguistic & Psychological Sciences, Brown University
- Joo-Hyun Song
- Department of Cognitive, Linguistic & Psychological Sciences, Brown University
- Carney Institute for Brain Science, Brown University
8
Abstract
Recent evidence has demonstrated that observers experience visual-processing biases in perihand space that may be tied to the hands' relevance for grasping actions. Our previous work suggested that when the hands are positioned to afford a power-grasp action, observers show increased temporal sensitivity that could aid with fast and forceful action, whereas when the hands are instead at the ready to perform a precision-grasp action, observers show enhanced spatial sensitivity that benefits delicate and detail-oriented actions. In the present investigation we seek to extend these previous findings by examining how object affordances may interact with hand positioning to shape visual biases in perihand space. Across three experiments, we examined how long participants took to perform a change detection task on photos of real objects, while we manipulated hand position (near/far from display), grasp posture (power/precision), and change type (orientation/identity). Participants viewed objects that afforded either a power grasp or a precision grasp, or were ungraspable. Although we were unable to uncover evidence of altered vision in perihand space in our first experiment, mirroring previous findings, in Experiments 2 and 3 our participants showed grasp-dependent biases near the hands when detecting changes to target objects that afforded a power grasp. Interestingly, ungraspable target objects were not subject to the same perihand space biases. Taken together, our results suggest that the influence of hand position on change detection performance is mediated not only by the hands' grasp posture, but also by a target object's affordances for grasping.
9
Memory for retinotopic locations is more accurate than memory for spatiotopic locations, even for visually guided reaching. Psychon Bull Rev 2019; 25:1388-1398. [PMID: 29159799] [DOI: 10.3758/s13423-017-1401-x]
Abstract
To interact successfully with objects, we must maintain stable representations of their locations in the world. However, their images on the retina may be displaced several times per second by large, rapid eye movements. A number of studies have demonstrated that visual processing is heavily influenced by gaze-centered (retinotopic) information, including a recent finding that memory for an object's location is more accurate and precise in gaze-centered (retinotopic) than world-centered (spatiotopic) coordinates (Golomb & Kanwisher, 2012b). This effect is somewhat surprising, given our intuition that behavior is successfully guided by spatiotopic representations. In the present experiment, we asked whether the visual system may rely on a more spatiotopic memory store depending on the mode of responding. Specifically, we tested whether reaching toward and tapping directly on an object's location could improve memory for its spatiotopic location. Participants performed a spatial working memory task under four conditions: retinotopic vs. spatiotopic task, and computer mouse click vs. touchscreen reaching response. When participants responded by clicking with a mouse on the screen, we replicated Golomb & Kanwisher's original results, finding that memory was more accurate in retinotopic than spatiotopic coordinates and that the accuracy of spatiotopic memory deteriorated substantially more than retinotopic memory with additional eye movements during the memory delay. Critically, we found the same pattern of results when participants responded by using their finger to reach and tap the remembered location on the monitor. These results further support the hypothesis that spatial memory is natively retinotopic; we found no evidence that engaging the motor system improves spatiotopic memory across saccades.
10
Abstract
The science of mental life and behavior has paid scant attention to the means by which mental life is translated into physical behavior. Why this is so was the topic of a 2005 American Psychologist article whose main title was "The Cinderella of Psychology." In the present article, we briefly review some of the reasons why motor control was relegated to the sidelines of psychology. Then we point to work showing that experimental psychologists have much to contribute to research on action generation. We focus on studies showing that actions are generated in a way that, at least by default, minimizes changes between successive actions. This method is computationally as well as physically economical, but it also requires consideration of costs of different kinds. How such costs are compared is discussed in the next section. The final section offers comments about the future of psychologically focused action research. Two additional themes of the review concern methods for studying action generation. First, much can be learned through naturalistic observation. Second, subsequent experiments, designed to check naturalistic observations, can use very simple equipment and procedures. This can make the study of action generation easy to pursue in the psychology laboratory.
11
Hosang TJ, Fischer R, Pomp J, Liepelt R. Dual-Tasking in the Near-Hand Space: Effects of Stimulus-Hand Proximity on Between-Task Shifts in the Psychological Refractory Period Paradigm. Front Psychol 2018; 9:1942. [PMID: 30459670] [PMCID: PMC6232416] [DOI: 10.3389/fpsyg.2018.01942]
Abstract
Two decades of research indicate that visual processing is typically enhanced for items in the space near the hands (near-hand space). Enhanced attention and cognitive control, among other mechanisms, have been thought to be responsible for the observed effects. Because accumulating experimental evidence and recent theories of dual-tasking suggest an involvement of cognitive control and attentional processes during dual-tasking, dual-task performance may be modulated in the near-hand space. We therefore performed a series of three experiments that tested whether near-hand space affects the shift between task-component processing in two visual-manual tasks. We applied a psychological refractory period (PRP) paradigm with varying stimulus-onset asynchrony (SOA) and manipulated stimulus-hand proximity by placing the hands either on the sides of a computer screen (near-hand condition) or on the lap (far-hand condition). In Experiment 1, Task 1 was a number categorization task (odd vs. even) and Task 2 was a letter categorization task (vowel vs. consonant). Stimulus presentation was spatially segregated: Stimulus 1 appeared first on the right side of the screen, and Stimulus 2 appeared second on the left side. In Experiment 2, we replaced Task 2 with a color categorization task (orange vs. blue). In Experiment 3, Stimulus 1 and Stimulus 2 were centrally presented as a single bivalent stimulus. The classic PRP effect appeared in all three experiments: Task 2 performance declined at short SOAs while Task 1 performance remained relatively unaffected by task overlap. In none of the three experiments did stimulus-hand proximity affect the size of the PRP effect. Our results indicate that the switching operation between two tasks in the PRP paradigm is neither optimized nor disturbed by being processed in near-hand space.
Affiliation(s)
- Thomas J Hosang
- Department of Performance Psychology, Institute of Psychology, German Sport University Cologne, Cologne, Germany; Experimental Psychology Unit, Department of Psychology, Helmut Schmidt University, Hamburg, Germany
- Rico Fischer
- Department of Psychology, University of Greifswald, Greifswald, Germany; Department of Psychology, Dresden University of Technology, Dresden, Germany
- Jennifer Pomp
- Institute for Psychology, University of Münster, Münster, Germany
- Roman Liepelt
- Department of Performance Psychology, Institute of Psychology, German Sport University Cologne, Cologne, Germany; Institute for Psychology, University of Münster, Münster, Germany
12
Abstract
Recent literature has demonstrated that hand position can affect visual processing, a set of phenomena termed Near Hand Effects (NHEs). Across four studies we looked for single-hand NHEs on a large screen when participants were asked to discriminate stimuli based on size, colour, and orientation (Study 1), to detect stimuli after a manipulation of hand shaping (Study 2), to detect stimuli after the introduction of a peripheral cue (Study 3), and finally to detect stimuli after a manipulation of screen orientation (Study 4). Each study failed to find an NHE. Further examination of the pooled data using a Bayesian analysis also failed to reveal positive evidence for faster responses or larger cueing effects near a hand. These findings suggest that at least some NHEs may be surprisingly fragile, which dovetails with the recent proposition that NHEs may not form a unitary set of phenomena (Gozli & Deng, 2018). The implication is that visual processing may be less sensitive to hand position across measurement techniques than previously thought, pointing to a need for well-powered, methodologically rigorous studies on this topic in the future.
Affiliation(s)
- Jill A. Dosso
- Department of Psychology, University of British Columbia, Vancouver, BC, CA
- Alan Kingstone
- Department of Psychology, University of British Columbia, Vancouver, BC, CA
13
Immobilization does not disrupt near-hand attentional biases. Conscious Cogn 2018; 64:50-60. [PMID: 29773511] [DOI: 10.1016/j.concog.2018.05.001]
Abstract
Observers show biases in attention when viewing objects within versus outside of their hands' grasping space. While the hands' proximity to stimuli plays a key role in these effects, recent evidence suggests an observer's affordances for grasping actions also shape visual processing near the hands. The current study examined the relative contributions of proximity and affordances in introducing attentional biases in peripersonal space. Participants placed a single hand on a visual display and detected targets appearing near or far from the hand. Across conditions, the hand was either free, creating an affordance for a grasping action, or immobilized using an orthosis, interfering with the potential to grasp. Replicating previous findings, participants detected targets appearing near the hand more quickly than targets appearing far from the hand. Immobilizing the hands did not disrupt this effect, suggesting that proximity alone is sufficient to facilitate target detection in peripersonal space.
14
Andringa R, Boot WR, Roque NA, Ponnaluri S. Hand proximity effects are fragile: a useful null result. Cognitive Research: Principles and Implications 2018; 3:7. [PMID: 29607404] [PMCID: PMC5871631] [DOI: 10.1186/s41235-018-0094-7]
Abstract
Placing one's hands near an object has been reported to enhance visual processing in a number of ways. We explored whether hand proximity confers an advantage when applied to complex visual search. In one experiment, participants indicated the presence or absence of a target item in a baggage x-ray image by pressing response boxes located at the edge of a tablet computer screen, requiring them to grip the display between their hands. Alternatively, they responded using a mouse held in their lap. Contrary to expectations, hand position did not influence search performance. In a second experiment, participants used their finger to trace along the x-ray image while searching. In addition to any effect of hand proximity, this strategy was predicted to encourage more systematic search. Participants inspected bags longer using this strategy, but this did not translate into improved target detection. A third experiment attempted to replicate the near-hands advantage in a change detection paradigm featuring simple stimuli (Tseng and Bridgeman, Experimental Brain Research 209:257–269, 2011), using the same equipment and hand positions as Experiment 1, but was unable to do so. One possibility is that the grip posture associated with holding a tablet is not conducive to producing a near-hands advantage. A final experiment tested this hypothesis with a direct replication of Tseng and Bridgeman, in which participants responded to stimuli presented on a CRT monitor using keys attached to the side of the monitor. Still, no near-hands advantage was observed. Our results suggest that the near-hands advantage may be sensitive to small differences in procedure, a finding that has important implications for harnessing this advantage to produce better performance in applied contexts.
Affiliation(s)
- Ronald Andringa
- Department of Psychology, Florida State University, 1107 W. Call Street, Tallahassee, FL 32306 USA
- Walter R Boot
- Department of Psychology, Florida State University, 1107 W. Call Street, Tallahassee, FL 32306 USA
- Nelson A Roque
- Department of Psychology, Florida State University, 1107 W. Call Street, Tallahassee, FL 32306 USA
15
Engagement of the motor system in position monitoring: reduced distractor suppression and effects of internal representation quality on motor kinematics. Exp Brain Res 2018; 236:1445-1460. [PMID: 29546652] [PMCID: PMC5937884] [DOI: 10.1007/s00221-018-5234-2]
Abstract
The position monitoring task is a measure of divided spatial attention in which participants track the changing positions of one or more objects, attempting to represent those positions with as much precision as possible. Typically, the precision of representations declines with each target object added to participants' attentional load. Since the motor system requires precise representations of changing target positions, we investigated whether position monitoring would be facilitated by increasing engagement of the motor system. Using motion capture, we recorded the positions of participants' index finger during pointing responses. Participants attempted to monitor the changing positions of between one and four target discs as they moved randomly around a large projected display. After a period of disc motion, all discs disappeared and participants were prompted to report the final position of one of the targets, either by mouse click or by pointing to the final perceived position on the screen. For mouse click responses, precision declined with attentional load. For pointing responses, precision declined only up to three targets and remained at the same level for four targets, suggesting obligatory attention to all four objects for loads above two targets. Kinematic profiles for pointing responses at the highest and lowest loads showed greater motor adjustments during pointing, demonstrating that, like external environmental task demands, the quality of internal representations affects motor kinematics. Specifically, these adjustments reflect the difficulty both of pointing to very precisely represented locations and of keeping representations distinct from one another.