1
Kim T, Zhou R, Gassass S, Soberano T, Liu L, Philip BA. Healthy adults favor stable left/right hand choices over performance at an unconstrained reach-to-grasp task. Exp Brain Res 2024;242:1349-1359. PMID: 38563977. DOI: 10.1007/s00221-024-06828-5.
Abstract
Reach-to-grasp actions are fundamental to the daily activities of human life, but few methods exist to assess individuals' reaching and grasping actions in unconstrained environments. The Block Building Task (BBT) provides an opportunity to directly observe and quantify these actions, including left/right hand choices. Here we sought to investigate the motor and non-motor causes of left/right hand choices, and to optimize the design of the BBT, by manipulating motor and non-motor difficulty in the BBT's unconstrained reach-to-grasp task. We hypothesized that greater motor and non-motor (e.g. cognitive/perceptual) difficulty would drive increased usage of the dominant hand. To test this hypothesis, we modulated block size (large vs. small) to influence motor difficulty, and model complexity (10 vs. 5 blocks per model) to influence non-motor difficulty, in healthy adults (n = 57). Our data revealed that increased motor and non-motor difficulty led to lower task performance (slower task speed), but participants increased use of their dominant hand only under the most difficult combination of conditions: in other words, participants allowed their performance to degrade before changing hand choices, even though they were instructed only to optimize performance. These results demonstrate that hand choices during reach-to-grasp actions are more stable than motor performance in healthy right-handed adults, but tasks with multifaceted difficulties can drive individuals to rely more on their dominant hand.
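The 2 × 2 design described above (block size × model complexity) implies a simple per-condition outcome measure: the proportion of reaches made with the dominant hand. A minimal sketch with hypothetical trial records (not the authors' data or code; right hand assumed dominant):

```python
from collections import defaultdict

# Hypothetical trial records: (block_size, blocks_per_model, hand_used)
trials = [
    ("large", 5, "right"), ("large", 5, "left"),
    ("large", 10, "right"), ("large", 10, "left"),
    ("small", 5, "right"),
    ("small", 10, "right"), ("small", 10, "right"), ("small", 10, "left"),
]

counts = defaultdict(lambda: [0, 0])   # condition -> [dominant-hand trials, total]
for size, blocks, hand in trials:
    cond = (size, blocks)
    counts[cond][0] += hand == "right"  # assuming right-handed participants
    counts[cond][1] += 1

# Dominant-hand use rate per condition of the 2x2 design
rates = {cond: dom / total for cond, (dom, total) in counts.items()}
```

Comparing such rates across the four cells is the kind of analysis that would reveal increased dominant-hand use only in the hardest condition (small blocks, 10-block models).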
Affiliation(s)
- Taewon Kim
- Program in Occupational Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Department of Kinesiology, The Pennsylvania State University, University Park, PA, USA
- Department of Physical Medicine and Rehabilitation, Penn State College of Medicine, Hershey, PA, USA
- Ruiwen Zhou
- Department of Biostatistics, Washington University School of Medicine, St. Louis, MO, USA
- Samah Gassass
- Program in Occupational Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Téa Soberano
- Program in Occupational Therapy, Washington University School of Medicine, St. Louis, MO, USA
- Lei Liu
- Department of Biostatistics, Washington University School of Medicine, St. Louis, MO, USA
- Benjamin A Philip
- Program in Occupational Therapy, Washington University School of Medicine, St. Louis, MO, USA.
2
Klein LK, Maiello G, Stubbs K, Proklova D, Chen J, Paulun VC, Culham JC, Fleming RW. Distinct Neural Components of Visually Guided Grasping during Planning and Execution. J Neurosci 2023;43:8504-8514. PMID: 37848285. PMCID: PMC10711727. DOI: 10.1523/jneurosci.0335-23.2023.
Abstract
Selecting suitable grasps on three-dimensional objects is a challenging visuomotor computation, which involves combining information about an object (e.g., its shape, size, and mass) with information about the actor's body (e.g., the optimal grasp aperture and hand posture for comfortable manipulation). Here, we used functional magnetic resonance imaging to investigate brain networks associated with these distinct aspects during grasp planning and execution. Human participants of either sex viewed and then executed preselected grasps on L-shaped objects made of wood and/or brass. By leveraging a computational approach that accurately predicts human grasp locations, we selected grasp points that disentangled the role of multiple grasp-relevant factors, that is, grasp axis, grasp size, and object mass. Representational Similarity Analysis revealed that grasp axis was encoded along dorsal-stream regions during grasp planning. Grasp size was first encoded in ventral stream areas during grasp planning, then in premotor regions during grasp execution. Object mass was encoded in ventral stream and (pre)motor regions only during grasp execution. Premotor regions further encoded visual predictions of grasp comfort, whereas the ventral stream encoded grasp comfort during execution, suggesting its involvement in haptic evaluation. These shifts in neural representations thus capture the sensorimotor transformations that allow humans to grasp objects.
Significance Statement
Grasping requires integrating object properties with constraints on hand and arm postures. Using a computational approach that accurately predicts human grasp locations by combining such constraints, we selected grasps on objects that disentangled the relative contributions of object mass, grasp size, and grasp axis during grasp planning and execution in a neuroimaging study. Our findings reveal a greater role of dorsal-stream visuomotor areas during grasp planning and, surprisingly, increasing ventral-stream engagement during execution. We propose that during planning, visuomotor representations initially encode grasp axis and size. Perceptual representations of object material properties instead become more relevant as the hand approaches the object and motor programs are refined with estimates of the grip forces required to successfully lift the object.
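Representational Similarity Analysis, as used above, compares the *structure* of pattern dissimilarities rather than raw activations: a region's representational dissimilarity matrix (RDM) is rank-correlated with a model RDM for each grasp-relevant factor. A generic sketch of the method with hypothetical data (not the authors' pipeline):

```python
import numpy as np

def rdm(patterns):
    """Condensed representational dissimilarity matrix:
    1 - Pearson correlation between all pairs of condition patterns."""
    c = np.corrcoef(patterns)
    iu = np.triu_indices_from(c, k=1)   # upper triangle, excluding diagonal
    return 1.0 - c[iu]

def spearman(a, b):
    """Rank correlation between two dissimilarity vectors (no ties assumed)."""
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

# Hypothetical data: 6 grasp conditions x 100 voxels
rng = np.random.default_rng(0)
neural = rng.normal(size=(6, 100))   # activity patterns in one brain region
model = rng.normal(size=(6, 100))    # patterns predicted by one factor (e.g. grasp axis)

# RSA: how similar is the region's RDM to the model's RDM?
rho = spearman(rdm(neural), rdm(model))
```

Tracking where such correlations are significant across regions and task phases is what lets the study localize grasp axis, grasp size, and object mass to different streams during planning versus execution.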
Affiliation(s)
- Lina K Klein
- Department of Experimental Psychology, Justus Liebig University Giessen, 35390 Giessen, Germany
- Guido Maiello
- School of Psychology, University of Southampton, Southampton SO17 1PS, United Kingdom
- Kevin Stubbs
- Department of Psychology, University of Western Ontario, London, Ontario N6A 5C2, Canada
- Daria Proklova
- Department of Psychology, University of Western Ontario, London, Ontario N6A 5C2, Canada
- Juan Chen
- Center for the Study of Applied Psychology, Guangdong Key Laboratory of Mental Health and Cognitive Science, and the School of Psychology, South China Normal University, Guangzhou 510631, China
- Key Laboratory of Brain, Cognition and Education Sciences, South China Normal University, Guangzhou 510631, China
- Vivian C Paulun
- McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Jody C Culham
- Department of Psychology, University of Western Ontario, London, Ontario N6A 5C2, Canada
- Roland W Fleming
- Department of Experimental Psychology, Justus Liebig University Giessen, 35390 Giessen, Germany
- Center for Mind, Brain and Behavior, University of Marburg and Justus Liebig University Giessen, 35390 Giessen, Germany
3
Beyvers MC, Voudouris D, Fiehler K. Sensorimotor memories influence movement kinematics but not associated tactile processing. Sci Rep 2023;13:17920. PMID: 37863998. PMCID: PMC10589242. DOI: 10.1038/s41598-023-45138-8.
Abstract
When interacting with objects, we often rely on visual information. However, vision is not always the most reliable sense for determining relevant object properties. For example, when the mass distribution of an object cannot be inferred visually, humans may rely on predictions about the object's dynamics. Such predictions may not only influence motor behavior but also the associated processing of movement-related afferent information, leading to reduced tactile sensitivity during movement. We examined whether predictions based on sensorimotor memories influence grasping kinematics and associated tactile processing. Participants lifted an object of unknown mass distribution and reported whether they detected a tactile stimulus on their grasping hand during the lift. In Experiment 1, the mass distribution could change from trial to trial, whereas in Experiment 2, we intermingled longer and shorter runs of trials with constant or variable mass distributions, while also providing implicit or explicit information about the trial structure. In both experiments, participants grasped the object by predictively choosing contact points that would compensate for the mass distribution experienced in the previous trial. Tactile suppression during movement, however, was invariant across conditions. These results suggest that predictions based on sensorimotor memories can influence movement kinematics but not the associated tactile perception.
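Tactile suppression of the kind measured above is commonly quantified by comparing detection sensitivity between rest and movement, e.g. via d' from a yes/no detection task. A sketch with hypothetical counts (not the authors' analysis), using a log-linear correction so extreme rates do not yield infinite z-scores:

```python
from statistics import NormalDist

def d_prime(hits, misses, fas, crs):
    """Sensitivity index d' for a yes/no detection task.
    hits/misses: stimulus-present trials; fas/crs: stimulus-absent trials."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)   # log-linear correction
    fa_rate = (fas + 0.5) / (fas + crs + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: detection is poorer during movement (tactile suppression)
rest = d_prime(hits=45, misses=5, fas=5, crs=45)
move = d_prime(hits=30, misses=20, fas=5, crs=45)
suppression = rest - move   # positive difference indicates suppression
```

The study's finding is that this rest-versus-movement difference did not vary across the predictability conditions, even though kinematics did.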
Affiliation(s)
- Marie C Beyvers
- Department of Experimental Psychology, Justus Liebig University Giessen, Otto-Behaghel-Strasse 10F, 35394, Giessen, Germany
- Dimitris Voudouris
- Department of Experimental Psychology, Justus Liebig University Giessen, Otto-Behaghel-Strasse 10F, 35394, Giessen, Germany
- Katja Fiehler
- Department of Experimental Psychology, Justus Liebig University Giessen, Otto-Behaghel-Strasse 10F, 35394, Giessen, Germany.
- Center for Mind, Brain and Behavior (CMMB), University of Marburg and Justus Liebig University Giessen, Giessen, Germany.
4
Mastinu E, Coletti A, Mohammad SHA, van den Berg J, Cipriani C. HANDdata - first-person dataset including proximity and kinematics measurements from reach-to-grasp actions. Sci Data 2023;10:405. PMID: 37355716. PMCID: PMC10290694. DOI: 10.1038/s41597-023-02313-w.
Abstract
HANDdata is a dataset designed to provide hand kinematics and proximity vision data during reach-to-grasp actions on non-virtual objects, specifically tailored for autonomous grasping by a robotic hand, and with particular attention to the reaching phase. Thus, we sought to capture target object characteristics from radar and time-of-flight proximity sensors, as well as details of the reach-to-grasp action, by looking at wrist and finger kinematics and at the main events of hand-object interaction. We structured the data collection as a sequence of static and grasping tasks, organized by increasing levels of complexity. HANDdata is a first-person, reach-to-grasp dataset that includes almost 6000 human-object interactions from 29 healthy adults, with 10 standardized objects of 5 different shapes and 2 kinds of materials. We believe that such a data collection can be of value for researchers interested in autonomous grasping robots for healthcare and industrial applications, as well as for those interested in radar-based computer vision and in basic aspects of sensorimotor control and manipulation.
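One standard measure recoverable from finger kinematics like those in HANDdata is maximum grip aperture: the peak thumb-index distance during the reach. A minimal sketch with synthetic trajectories (hypothetical; HANDdata's actual file layout and sensor channels are documented with the dataset itself):

```python
import numpy as np

def grip_aperture(thumb_xyz, index_xyz):
    """Frame-by-frame Euclidean thumb-index distance, the standard
    grip-aperture measure in reach-to-grasp kinematics."""
    return np.linalg.norm(thumb_xyz - index_xyz, axis=1)

# Synthetic marker trajectories: 100 frames x 3 coordinates (mm).
# The hand moves 300 mm forward while the aperture opens then closes.
t = np.linspace(0.0, 1.0, 100)[:, None]
thumb = np.hstack([t * 300, np.zeros((100, 1)), np.zeros((100, 1))])
index = thumb + np.array([0.0, 1.0, 0.0]) * (40 + 30 * np.sin(np.pi * t))

aperture = grip_aperture(thumb, index)
mga = aperture.max()   # maximum grip aperture, here ~70 mm at mid-reach
```

In real recordings, the timing of this peak relative to hand-object contact events is one of the reaching-phase details the dataset is built to capture.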
Affiliation(s)
- Enzo Mastinu
- BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy.
- Anna Coletti
- BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy
5
Voudouris D, Fiehler K. The role of grasping demands on tactile suppression. Hum Mov Sci 2022;83:102957. DOI: 10.1016/j.humov.2022.102957.