1. Maurya A, Shukla A, Thomas T. Temporal mechanisms underlying visual processing bias in peri-hand space. Atten Percept Psychophys 2024; 86:2659-2671. PMID: 39627397. DOI: 10.3758/s13414-024-02980-w.
Abstract
The immediate space surrounding the hands is often termed the peri-hand space (PHS) and is characterized by shorter reaction times (RTs), better detection, and enhanced accuracy for stimuli presented in this space relative to stimuli presented beyond it. Such behavioral changes have been explained in terms of a biased allocation of cognitive resources such as perception, attention, and memory for the efficient processing of information presented in the PHS. However, in two experiments, the current study shows that these cognitive biases appear to have an underlying temporal basis. In the first experiment, participants performed a temporal bisection task; in the second, they performed a verbal estimation task, with stimuli presented either near the hands or relatively far from them. Results from both experiments provide evidence for a slowing of temporal mechanisms in the PHS, reflected as temporal dilation for stimuli presented in the PHS relative to those presented farther away. This slowing of time in the PHS seems crucial in giving sufficient temporal allowance for the allocation of cognitive resources to prioritize the processing of information in the PHS. The findings are in line with the early anticipatory mechanisms associated with the PHS and seem to be driven by the switch/gate mechanism, rather than the pacemaker component, of the attentional gate model of time perception. Thus, the current study integrates theories of time perception with the peripersonal space literature.
Affiliation(s)
- Ankit Maurya: Department of Humanities and Social Sciences, Indian Institute of Technology Roorkee, Room No. 513, Roorkee, Uttarakhand 247667, India
- Anuj Shukla: Thapar School of Liberal Arts and Sciences, Thapar Institute of Engineering and Technology, Bhadson Road, Patiala, Punjab, India
- Tony Thomas: Department of Humanities and Social Sciences, Indian Institute of Technology Roorkee, Room No. 513, Roorkee, Uttarakhand 247667, India
2. Schroer SE, Yu C. Word learning is hands-on: Insights from studying natural behavior. Adv Child Dev Behav 2024; 66:55-79. PMID: 39074925. DOI: 10.1016/bs.acdb.2024.04.002.
Abstract
Infants' interactions with social partners are richly multimodal. Dyads respond to and coordinate their visual attention, gestures, vocalizations, speech, manual actions, and manipulations of objects. Although infants are typically described as active learners, previous experimental research has often focused on how infants learn from stimuli that are well-crafted by researchers. Recent research studying naturalistic, free-flowing interactions has explored the meaningful patterns in dyadic behavior that relate to language learning. Infants' manual engagement and exploration of objects supports their visual attention, creates salient and diverse views of objects, and elicits labeling utterances from parents. In this chapter, we discuss how the cascade of behaviors created by infant multimodal attention plays a fundamental role in shaping their learning environment, supporting real-time word learning and predicting later vocabulary size. We draw from recent at-home and cross-cultural research to test the validity of our mechanistic pathway and discuss why hands matter so much for learning. Our goal is to convey the critical need for developmental scientists to study natural behavior and move beyond our "tried-and-true" paradigms, like screen-based tasks. By studying natural behavior, the role of infants' hands in early language learning was revealed, though it was a behavior that was often uncoded, undiscussed, or not even allowed in decades of previous research. When we study infants in their natural environment, they can show us how they learn about and explore their world. Word learning is hands-on.
Affiliation(s)
- Sara E Schroer: The Center for Perceptual Systems, The University of Texas at Austin; Department of Psychology, The University of Texas at Austin
- Chen Yu: The Center for Perceptual Systems, The University of Texas at Austin; Department of Psychology, The University of Texas at Austin
3. Schroer SE, Yu C. Looking is not enough: Multimodal attention supports the real-time learning of new words. Dev Sci 2023; 26:e13290. PMID: 35617054. DOI: 10.1111/desc.13290.
Abstract
Most research on early language learning focuses on the objects that infants see and the words they hear in their daily lives, although growing evidence suggests that motor development is also closely tied to language development. To study the real-time behaviors required for learning new words during free-flowing toy play, we measured infants' visual attention and manual actions on to-be-learned toys. Parents and 12-to-26-month-old infants wore wireless head-mounted eye trackers, allowing them to move freely around a home-like lab environment. After the play session, infants were tested on their knowledge of object-label mappings. We found that how often parents named objects during play did not predict learning; instead, it was infants' attention during and around a labeling utterance that predicted whether an object-label mapping was learned. More specifically, we found that infant visual attention alone did not predict word learning. Instead, coordinated, multimodal attention (when infants' hands and eyes were attending to the same object) predicted word learning. Our results implicate a causal pathway through which infants' bodily actions play a critical role in early word learning.
Affiliation(s)
- Sara E Schroer: Department of Psychology, The University of Texas at Austin, Austin, Texas, USA
- Chen Yu: Department of Psychology, The University of Texas at Austin, Austin, Texas, USA
4. Coello Y, Cartaud A. The Interrelation Between Peripersonal Action Space and Interpersonal Social Space: Psychophysiological Evidence and Clinical Implications. Front Hum Neurosci 2021; 15:636124. PMID: 33732124. PMCID: PMC7959827. DOI: 10.3389/fnhum.2021.636124.
Abstract
The peripersonal space is an adaptive and flexible interface between the body and the environment that fulfills a dual motor function: preparing the body for voluntary object-oriented actions to interact with incentive stimuli, and preparing the body for defensive responses when facing potentially harmful stimuli. In this position article, we provide arguments for the sensorimotor rooting of the peripersonal space representation and highlight the variables that contribute to its flexible and adaptive characteristics. We also demonstrate that peripersonal space represents a mediation zone between the body and the environment, contributing not only to the control of goal-directed actions but also to the organization of social life. Taken together, the data presented and discussed lead us to propose a new theoretical framework linking peripersonal action space and interpersonal social space, and we highlight how this framework can account for social behaviors in populations with socio-emotional deficits.
Affiliation(s)
- Yann Coello: Univ. Lille, CNRS, UMR 9193 SCALab, Sciences Cognitives et Sciences Affectives, Lille, France
- Alice Cartaud: Univ. Lille, CNRS, UMR 9193 SCALab, Sciences Cognitives et Sciences Affectives, Lille, France
5. An auditory hand-proximity effect: The auditory Simon effect is enhanced near the hands. Psychon Bull Rev 2021; 28:853-861. PMID: 33469849. DOI: 10.3758/s13423-020-01860-2.
Abstract
Visual processing of stimuli near the hands is altered compared with stimuli far from the hands. Here, we aimed to test whether this alteration extends to auditory processing. Participants were required to perform an auditory Simon task either with their hands close to the loudspeakers or far from them. Two experiments consistently showed that the auditory Simon effect was enhanced when the hands were close to the speakers compared with far from them, consistent with previous findings of an enhanced visual Simon effect near the hands. Furthermore, the hand-proximity effects in the auditory and visual Simon tasks (an enhanced Simon effect near the hands compared with far from them) were comparable, indicating that the hand-proximity effect is reliable across visual and auditory modalities. Thus, the present study extends the hand-proximity effect from vision to audition.
6. Peck TC, Tutar A. The Impact of a Self-Avatar, Hand Collocation, and Hand Proximity on Embodiment and Stroop Interference. IEEE Trans Vis Comput Graph 2020; 26:1964-1971. PMID: 32070969. DOI: 10.1109/tvcg.2020.2973061.
Abstract
Understanding the effects of hand proximity to objects and tasks is critical for hand-held and near-hand objects. Even though self-avatars have been shown to be beneficial for various tasks in virtual environments, little research has investigated the effect of avatar hand proximity on working memory. This paper presents a between-participants user study investigating the effects of self-avatars and physical hand proximity on a common working memory task, the Stroop interference task. Results show that participants felt embodied when a self-avatar was in the scene, and that the subjective level of embodiment decreased when a participant's hands were not collocated with the avatar's hands. Furthermore, a participant's physical hand placement was significantly related to Stroop interference: proximal hands produced a significant increase in accuracy compared to non-proximal hands. Surprisingly, Stroop interference was not mediated by the existence of a self-avatar or level of embodiment.
7. Agauas SJ, Jacoby M, Thomas LE. Near-hand effects are robust: Three OSF pre-registered replications of visual biases in perihand space. Vis Cogn 2020. DOI: 10.1080/13506285.2020.1751763.
Affiliation(s)
- Stephen J. Agauas: Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Morgan Jacoby: Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Laura E. Thomas: Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
8. Bamford LE, Klassen NR, Karl JM. Faster recognition of graspable targets defined by orientation in a visual search task. Exp Brain Res 2020; 238:905-916. PMID: 32170332. DOI: 10.1007/s00221-020-05769-z.
Abstract
Peri-hand space is the area surrounding the hand. Objects within this space may be subject to increased visuospatial perception, increased attentional prioritization, and slower attentional disengagement compared to more distal objects. This may result from kinesthetic and visual feedback about the location of the hand that projects from the reach and grasp networks of the dorsal visual stream back to occipital visual areas, which in turn, refines cortical visual processing that can subsequently guide skilled motor actions. Thus, we hypothesized that visual stimuli that afford action, which are known to potentiate activity in the dorsal visual stream, would be associated with greater alterations in visual processing when presented near the hand. To test this, participants held their right hand near or far from a touchscreen that presented a visual array containing a single target object that differed from 11 distractor objects by orientation only. The target objects and their accompanying distractors either strongly afforded grasping or did not. Participants identified the target among the distractors by reaching out and touching it with their left index finger while eye-tracking was used to measure visual search times, target recognition times, and search accuracy. The results failed to support the theory of enhanced visual processing of graspable objects near the hand as participants were faster at recognizing graspable compared to non-graspable targets, regardless of the position of the right hand. The results are discussed in relation to the idea that, in addition to potentiating appropriate motor responses, object affordances may also potentiate early visual processes necessary for object recognition.
Affiliation(s)
- Lindsay E Bamford: Department of Psychology, Thompson Rivers University, Kamloops, BC, Canada
- Nikola R Klassen: Department of Psychology, Thompson Rivers University, Kamloops, BC, Canada
- Jenni M Karl: Department of Psychology, Thompson Rivers University, Kamloops, BC, Canada
9.
Abstract
Recent evidence has demonstrated that observers experience visual-processing biases in perihand space that may be tied to the hands' relevance for grasping actions. Our previous work suggested that when the hands are positioned to afford a power-grasp action, observers show increased temporal sensitivity that could aid with fast and forceful action, whereas when the hands are instead at the ready to perform a precision-grasp action, observers show enhanced spatial sensitivity that benefits delicate and detail-oriented actions. In the present investigation we seek to extend these previous findings by examining how object affordances may interact with hand positioning to shape visual biases in perihand space. Across three experiments, we examined how long participants took to perform a change detection task on photos of real objects, while we manipulated hand position (near/far from display), grasp posture (power/precision), and change type (orientation/identity). Participants viewed objects that afforded either a power grasp or a precision grasp, or were ungraspable. Although we were unable to uncover evidence of altered vision in perihand space in our first experiment, mirroring previous findings, in Experiments 2 and 3 our participants showed grasp-dependent biases near the hands when detecting changes to target objects that afforded a power grasp. Interestingly, ungraspable target objects were not subject to the same perihand space biases. Taken together, our results suggest that the influence of hand position on change detection performance is mediated not only by the hands' grasp posture, but also by a target object's affordances for grasping.
10.
Abstract
Recent studies have demonstrated altered visual processing of stimuli in the region proximal to the hand. It has been challenging to characterize the range and nature of these processing differences. In our attempt to deconstruct the factors giving rise to Hand-Proximity Effects (HPEs), we manipulated the organization of items in a visual search display. In two experiments, we observed no HPE. Specifically, in Experiment 1, we presented the search display in only one half of the monitor (split diagonally), which could be either near or far from the hand placed on the corner of the monitor. A Bayesian analysis showed that search efficiency did not differ significantly in either the 'near' or the 'far' condition when compared with the baseline condition, in which the hand rested on the lap. In Experiment 2, the search display was arranged horizontally across the monitor. A Bayesian analysis showed that RTs did not vary with the proximity of the target to the hand, nor relative to the baseline (lap) condition. The present results characterize features of the HPE that have not been reported previously and are in line with recent reports of failures to replicate the HPE under various circumstances.
Affiliation(s)
- Tony Thomas: Centre for Cognitive Science, Indian Institute of Technology Gandhinagar, Gandhinagar, India
- Meera Mary Sunny: Centre for Cognitive Science, Indian Institute of Technology Gandhinagar, Gandhinagar, India
11. Gozli DG, Deng WS. Building Blocks of Psychology: On Remaking the Unkept Promises of Early Schools. Integr Psychol Behav Sci 2018; 52:1-24. PMID: 29063441. DOI: 10.1007/s12124-017-9405-7.
Abstract
The appeal and popularity of "building blocks", i.e., simple and dissociable elements of behavior and experience, persists in psychological research. We begin our assessment of this research strategy with a historical review of structuralism (as espoused by E. B. Titchener) and behaviorism (as espoused by J. B. Watson and B. F. Skinner), two movements that held this assumption in their attempts to provide a systematic and unified discipline. We point out the ways in which the elementism of the two schools selected, framed, and excluded topics of study. After the historical review, we turn to the contemporary literature and highlight the persistence of research into building blocks, and the associated framing and exclusions, in psychological research. The assumption that complex categories of human psychology can be understood in terms of their elementary components and simplest forms seems indefensible. In specific cases, therefore, reliance on the assumption requires justification. Finally, we review alternative strategies that bypass the commitment to building blocks.
Affiliation(s)
- Davood G Gozli: Department of Psychology, University of Macau, Macau SAR, China
- Wei Sophia Deng: Department of Psychology, University of Macau, Macau SAR, China
12.
Abstract
Recent literature has demonstrated that hand position can affect visual processing, a set of phenomena termed Near Hand Effects (NHEs). Across four studies we looked for single-hand NHEs on a large screen when participants were asked to discriminate stimuli based on size, colour, and orientation (Study 1), to detect stimuli after a manipulation of hand shaping (Study 2), to detect stimuli after the introduction of a peripheral cue (Study 3), and finally to detect stimuli after a manipulation of screen orientation (Study 4). Each study failed to find a NHE. Further examination of the pooled data using a Bayesian analysis also failed to reveal positive evidence for faster responses or larger cueing effects near a hand. These findings suggest that at least some NHEs may be surprisingly fragile, which dovetails with the recent proposition that NHEs may not form a unitary set of phenomena (Gozli & Deng, 2018). The implication is that visual processing may be less sensitive to hand position across measurement techniques than previously thought, and points to a need for well-powered, methodologically rigorous studies on this topic in the future.
Affiliation(s)
- Jill A. Dosso: Department of Psychology, University of British Columbia, Vancouver, BC, Canada
- Alan Kingstone: Department of Psychology, University of British Columbia, Vancouver, BC, Canada
13. Testing the generality of the zoom-lens model: Evidence for visual-pathway specific effects of attended-region size on perception. Atten Percept Psychophys 2017; 79:1147-1164. DOI: 10.3758/s13414-017-1306-9.
14. Wiemers M, Fischer MH. Effects of Hand Proximity and Movement Direction in Spatial and Temporal Gap Discrimination. Front Psychol 2016; 7:1930. PMID: 28018268. PMCID: PMC5145868. DOI: 10.3389/fpsyg.2016.01930.
Abstract
Previous research on the interplay between static manual postures and visual attention revealed enhanced visual selection near the hands (the near-hand effect). During active movements, visual performance is also superior when moving toward, compared to away from, the stimulus (the direction effect). The "modulated visual pathways" hypothesis argues that differential involvement of the magno- and parvocellular visual processing streams causes the near-hand effect. The key finding supporting this hypothesis is an increase in temporal and a reduction in spatial processing in near-hand space (Gozli et al., 2012). Since this hypothesis has so far only been tested with static hand postures, we provide a conceptual replication of Gozli et al.'s (2012) result with moving hands, thus also probing the generality of the direction effect. Participants performed temporal or spatial gap discriminations while their right hand was moving below the display. In contrast to Gozli et al. (2012), temporal gap discrimination was superior at intermediate, not near, hand proximity. In spatial gap discrimination, a direction effect without a hand-proximity effect suggests that pragmatic attentional maps overshadowed temporal/spatial processing biases for far/near-hand space.
Affiliation(s)
- Michael Wiemers: Division of Cognitive Science, University of Potsdam, Potsdam, Germany; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Martin H Fischer: Division of Cognitive Science, University of Potsdam, Potsdam, Germany
15.
Abstract
Observers experience affordance-specific biases in visual processing for objects within the hands' grasping space, but the mechanism that tunes visual cognition to facilitate action remains unknown. I investigated the hypothesis that altered vision near the hands is a result of experience-driven plasticity. Participants performed motion-detection and form-perception tasks (while their hands were either near the display, in atypical grasping postures, or positioned in their laps) both before and after learning novel grasp affordances. Participants showed enhanced temporal sensitivity for stimuli viewed near the backs of the hands after training to execute a power grasp using the backs of their hands (Experiment 1), but showed enhanced spatial sensitivity for stimuli viewed near the tips of their little fingers after training to use their little fingers to execute a precision grasp (Experiment 2). These results show that visual biases near the hands are plastic, facilitating processing of information relevant to learned grasp affordances.
Affiliation(s)
- Laura E Thomas: Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University
16.
17. Constable MD, Pratt J, Gozli DG, Welsh TN. Do you see what I see? Co-actor posture modulates visual processing in joint tasks. Vis Cogn 2015. DOI: 10.1080/13506285.2015.1078426.
18. Davoli CC, Tseng P. Editorial: Taking a hands-on approach: current perspectives on the effect of hand position on vision. Front Psychol 2015; 6:1231. PMID: 26347693. PMCID: PMC4540014. DOI: 10.3389/fpsyg.2015.01231.
Affiliation(s)
- Philip Tseng: Graduate Institute of Humanities in Medicine, Taipei Medical University, Taipei, Taiwan; Brain and Consciousness Research Center, Shuang Ho Hospital, New Taipei City, Taiwan
19. Hand position influences perceptual grouping. Exp Brain Res 2015; 233:2627-2634. PMID: 26026809. DOI: 10.1007/s00221-015-4332-7.
Abstract
Over the past decade, evidence has accumulated that performance in attention, perception, and memory-related tasks is influenced by the distance between the hands and the stimuli (i.e., placing the observer's hands near or far from the stimuli). To account for existing findings, it has recently been proposed that processing of stimuli near the hands is dominated by the magnocellular visual pathway. The present study tests an implication of this hypothesis: whether perceptual grouping is reduced in hands-proximal space. Consistent with previous work on the object-based capture of attention, a benefit for the visual object was observed in the hands-distal condition. Interestingly, the object-based benefit did not emerge in the hands-proximal condition, suggesting that perceptual grouping is impaired near the hands. This change in perceptual grouping processes provides further support for the hypothesis that visual processing near the hands is subject to increased magnocellular processing.
20. Taylor JET, Pratt J, Witt JK. Joint attention for stimuli on the hands: ownership matters. Front Psychol 2015; 6:543. PMID: 25983713. PMCID: PMC4416455. DOI: 10.3389/fpsyg.2015.00543.
Abstract
The visual system treats the space near the hands with unique, action-related priorities. For example, attention orients slowly to stimuli on the hands (Taylor and Witt, 2014). In this article, we asked whether jointly attended hands are attended in the same way. Specifically, we examined whether ownership over the hand mattered: do we attend to our hands and the hands of others in the same way? Pairs of participants performed a spatial cueing task with stimuli that could be projected onto one partner’s hands or on a control surface. Results show delayed orienting of attention to targets appearing on the hands, but only for the owner of the hands. For an observer, others’ hands are like any other surface. This result emphasizes the importance of ownership for hand-based effects on vision, and in doing so, is inconsistent with some expectations of the joint action literature.
Affiliation(s)
- J E T Taylor: Department of Psychological Sciences, Purdue University, West Lafayette, IN, USA; Department of Psychology, University of Toronto, Toronto, ON, Canada
- Jay Pratt: Department of Psychology, University of Toronto, Toronto, ON, Canada
- Jessica K Witt: Department of Psychology, Colorado State University, Fort Collins, CO, USA
21. Thomas LE. Grasp posture alters visual processing biases near the hands. Psychol Sci 2015; 26:625-632. PMID: 25862545. DOI: 10.1177/0956797615571418.
Abstract
Observers experience biases in visual processing for objects within easy reach of their hands; these biases may assist them in evaluating items that are candidates for action. I investigated the hypothesis that hand postures that afford different types of actions differentially bias vision. Across three experiments, participants performed global-motion-detection and global-form-perception tasks while their hands were positioned (a) near the display in a posture affording a power grasp, (b) near the display in a posture affording a precision grasp, or (c) in their laps. Although the power-grasp posture facilitated performance on the motion-detection task, the precision-grasp posture instead facilitated performance on the form-perception task. These results suggest that the visual system weights processing on the basis of an observer's current affordances for specific actions: Fast and forceful power grasps enhance temporal sensitivity, whereas detail-oriented precision grasps enhance spatial sensitivity.
22. Gozli DG, Moskowitz JB, Pratt J. Visual attention to features by associative learning. Cognition 2014; 133:488-501. DOI: 10.1016/j.cognition.2014.07.014.