1
Laurence-Chasen JD, Ross CF, Arce-McShane FI, Hatsopoulos NG. Robust cortical encoding of 3D tongue shape during feeding in macaques. Nat Commun 2023; 14:2991. [PMID: 37225708] [PMCID: PMC10209084] [DOI: 10.1038/s41467-023-38586-3] [Received: 05/25/2022] [Accepted: 05/08/2023]
Abstract
Dexterous tongue deformation underlies eating, drinking, and speaking. The orofacial sensorimotor cortex has been implicated in the control of coordinated tongue kinematics, but little is known about how the brain encodes, and ultimately drives, the tongue's 3D, soft-body deformation. Here we combine biplanar x-ray video technology, multi-electrode cortical recordings, and machine-learning-based decoding to explore the cortical representation of lingual deformation. We trained long short-term memory (LSTM) neural networks to decode various aspects of intraoral tongue deformation from cortical activity during feeding in male rhesus monkeys. We show that both lingual movements and complex lingual shapes across a range of feeding behaviors could be decoded with high accuracy, and that the distribution of deformation-related information across cortical regions was consistent with previous studies of the arm and hand.
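The LSTM decoders mentioned in this abstract carry a gated cell state across time steps of neural activity. As a minimal illustration of the gating arithmetic only (not the authors' implementation; the scalar state, the weight layout, and the helper name `lstm_step` are invented for exposition), a single LSTM cell step can be written in plain Python:

```python
import math

def sigmoid(x):
    """Logistic squashing function, mapping any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM cell step for a scalar input and scalar state.

    W maps each gate name to a (input_weight, recurrent_weight, bias) tuple.
    """
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])    # input gate
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])    # forget gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])    # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])  # candidate value
    c = f * c_prev + i * g     # new cell state: keep a gated mix of old and new
    h = o * math.tanh(c)       # new hidden state, bounded in (-1, 1)
    return h, c
```

In the study itself, networks of many such units map multichannel cortical activity to continuous tongue kinematics; this scalar version only shows how the forget, input, and output gates interact.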
Affiliation(s)
- Jeffrey D Laurence-Chasen
- Department of Organismal Biology and Anatomy, The University of Chicago, 1027 E 57th Street, Chicago, IL, 60637, USA.
- Callum F Ross
- Department of Organismal Biology and Anatomy, The University of Chicago, 1027 E 57th Street, Chicago, IL, 60637, USA
- Fritzie I Arce-McShane
- Department of Oral Health Sciences, School of Dentistry, University of Washington, 1959 NE Pacific Street, Box #357475, Seattle, WA, 98195-7475, USA
- Graduate Program in Neuroscience, University of Washington, 1959 NE Pacific St., Seattle, WA, 98195-7475, USA
- Nicholas G Hatsopoulos
- Department of Organismal Biology and Anatomy, The University of Chicago, 1027 E 57th Street, Chicago, IL, 60637, USA
- Program in Computational Neuroscience, The University of Chicago, 5812 South Ellis Avenue, Chicago, IL, 60637, USA
2
Knights E, Smith FW, Rossit S. The role of the anterior temporal cortex in action: evidence from fMRI multivariate searchlight analysis during real object grasping. Sci Rep 2022; 12:9042. [PMID: 35662252] [PMCID: PMC9167815] [DOI: 10.1038/s41598-022-12174-9] [Received: 02/04/2022] [Accepted: 04/29/2022]
Abstract
Intelligent manipulation of handheld tools marks a major discontinuity between humans and our closest primate relatives. Here we identified neural representations of how tools are typically manipulated within left anterior temporal cortex, by shifting a searchlight classifier through whole-brain fMRI data recorded while participants grasped real 3D-printed tools in ways considered typical for use (i.e., by their handle). These neural representations were evoked automatically, as task performance did not require semantic processing. In fact, findings from a behavioural motion-capture experiment confirmed that actions with tools (relative to non-tools) incurred additional processing costs, as would be expected if semantic areas are automatically engaged. These results substantiate theories of semantic cognition that claim the anterior temporal cortex combines sensorimotor and semantic content for advanced behaviours like tool manipulation.
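A searchlight analysis scores, for each voxel, how well activity in a small neighbourhood around it discriminates the experimental conditions, producing a whole-brain accuracy map. A toy one-dimensional sketch of that idea (synthetic trials-by-voxels data, a leave-one-out nearest-centroid classifier standing in for the study's actual classifier, and invented function names):

```python
import math

def nearest_centroid_loocv(samples, labels):
    """Leave-one-out accuracy of a nearest-centroid classifier."""
    correct = 0
    for i in range(len(samples)):
        train = [(s, l) for j, (s, l) in enumerate(zip(samples, labels)) if j != i]
        # Centroid of each class over the training folds.
        cents = {}
        for lab in set(labels):
            rows = [s for s, l in train if l == lab]
            cents[lab] = [sum(col) / len(rows) for col in zip(*rows)]
        # Classify the held-out sample by Euclidean distance to the centroids.
        dists = {lab: math.dist(samples[i], c) for lab, c in cents.items()}
        if min(dists, key=dists.get) == labels[i]:
            correct += 1
    return correct / len(samples)

def searchlight_1d(data, labels, radius=1):
    """Slide a classification window over voxels; data is trials x voxels."""
    n_vox = len(data[0])
    acc_map = []
    for v in range(n_vox):
        lo, hi = max(0, v - radius), min(n_vox, v + radius + 1)
        patch = [trial[lo:hi] for trial in data]   # neighbourhood around voxel v
        acc_map.append(nearest_centroid_loocv(patch, labels))
    return acc_map
```

Real searchlights move a sphere through 3D volumes and correct the resulting accuracy maps for multiple comparisons; the 1-D version only shows the sliding-neighbourhood logic.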
Affiliation(s)
- Ethan Knights
- School of Psychology, University of East Anglia, Norwich, UK
- Fraser W Smith
- School of Psychology, University of East Anglia, Norwich, UK
3
Liu Y, Caracoglia J, Sen S, Freud E, Striem-Amit E. Are reaching and grasping effector-independent? Similarities and differences in reaching and grasping kinematics between the hand and foot. Exp Brain Res 2022; 240:1833-1848. [PMID: 35426511] [PMCID: PMC9142431] [DOI: 10.1007/s00221-022-06359-x] [Received: 10/31/2021] [Accepted: 03/24/2022]
Abstract
While reaching and grasping are highly prevalent manual actions, neuroimaging studies provide evidence that their neural representations may be shared between different body parts, i.e., effectors. If these actions are guided by effector-independent mechanisms, similar kinematics should be observed when the action is performed by the hand or by a cortically remote and less experienced effector, such as the foot. We tested this hypothesis with two characteristic components of action: the initial ballistic stage of reaching, and the preshaping of the digits during grasping based on object size. We examined if these kinematic features reflect effector-independent mechanisms by asking participants to reach toward and to grasp objects of different widths with their hand and foot. First, during both reaching and grasping, the velocity profile up to peak velocity matched between the hand and the foot, indicating a shared ballistic acceleration phase. Second, maximum grip aperture and time of maximum grip aperture of grasping increased with object size for both effectors, indicating encoding of object size during transport. Differences between the hand and foot were found in the deceleration phase and time of maximum grip aperture, likely due to biomechanical differences and the participants’ inexperience with foot actions. These findings provide evidence for effector-independent visuomotor mechanisms of reaching and grasping that generalize across body parts.
Affiliation(s)
- Yuqi Liu
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, 20057, USA.
- Institute of Neuroscience, Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Sciences and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China.
- James Caracoglia
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, 20057, USA
- Division of Graduate Medical Sciences, Boston University Medical Center, Boston, MA, 02215, USA
- Sriparna Sen
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, 20057, USA
- Erez Freud
- Department of Psychology, York University, Toronto, ON, M3J 1P3, Canada
- Centre for Vision Research, York University, Toronto, ON, M3J 1P3, Canada
- Ella Striem-Amit
- Department of Neuroscience, Georgetown University Medical Center, Washington, DC, 20057, USA.
4
Quadrelli E, Roberti E, Turati C, Craighero L. Observation of the point-light animation of a grasping hand activates sensorimotor cortex in nine-month-old infants. Cortex 2019; 119:373-385. [PMID: 31401422] [DOI: 10.1016/j.cortex.2019.07.006] [Received: 03/01/2019] [Revised: 05/27/2019] [Accepted: 07/09/2019]
Abstract
By measuring changes in sensorimotor alpha-band activity in nine-month-old infants, we sought to understand the involvement of the sensorimotor cortex during observation of a point-light (PL) animation of a grasping hand. Attenuation of alpha activity was found both when the PL display moved towards the to-be-grasped object and when the object was deleted from the video. Before the PL stimuli began to move, attenuation of sensorimotor alpha activity was documented only when the object was present; this may be interpreted either as movement prediction or as perception of a graspable object. Our main findings demonstrate that, during observation of stimuli moving with biological kinematics, the infants' sensorimotor system is activated even when pictorial information is absent or highly reduced, and independently of the presence of the goal object. The possible compensatory function of the sensorimotor system during observation of highly degraded moving stimuli is discussed.
Affiliation(s)
- Ermanno Quadrelli
- Department of Psychology, University of Milano-Bicocca, Italy; NeuroMI, Milan Center for Neuroscience, Italy
- Elisa Roberti
- Department of Psychology, University of Milano-Bicocca, Italy; NeuroMI, Milan Center for Neuroscience, Italy
- Chiara Turati
- Department of Psychology, University of Milano-Bicocca, Italy; NeuroMI, Milan Center for Neuroscience, Italy
- Laila Craighero
- Department of Biomedical and Specialty Surgical Sciences, University of Ferrara, Italy.
5
Senna I, Cardinali L, Farnè A, Brozzoli C. Aim and Plausibility of Action Chains Remap Peripersonal Space. Front Psychol 2019; 10:1681. [PMID: 31379692] [PMCID: PMC6652232] [DOI: 10.3389/fpsyg.2019.01681] [Received: 10/16/2018] [Accepted: 07/03/2019]
Abstract
Successful interaction with objects in the peripersonal space requires that the information relative to current and upcoming positions of our body is continuously monitored and updated with respect to the location of target objects. Voluntary actions, for example, are known to induce an anticipatory remapping of the peri-hand space (PHS, i.e., the space near the acting hand) during the very early stages of the action chain: planning and initiating an object grasp increase the interference exerted by visual stimuli coming from the object on touches delivered to the grasping hand, thus allowing for hand-object position monitoring and guidance. Voluntarily grasping an object, though, is rarely performed in isolation. Grasping a candy, for example, is most typically followed by concatenated secondary action steps (bringing the candy to the mouth and swallowing it) that represent the agent’s ultimate intention (to eat the candy). However, whether and when complex action chains remap the PHS remains unknown, just as whether remapping is conditional to goal achievability (e.g., candy-mouth fit). Here we asked these questions by assessing changes in visuo-tactile interference on the acting hand while participants had to grasp an object serving as a support for an elongated candy, and bring it toward their mouth. Depending on its orientation, the candy could potentially enter the participants’ mouth (plausible goal), or not (implausible goal). We observed increased visuo-tactile interference at relatively late stages of the action chain, after the object had been grasped, and only when the action goal was plausible. These findings suggest that multisensory interactions during action execution depend upon the final aim and plausibility of complex goal-directed actions, and extend our knowledge about the role of peripersonal space in guiding goal-directed voluntary actions.
Affiliation(s)
- Irene Senna
- Integrative Multisensory Perception Action and Cognition Team (ImpAct), Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Lyon, France; Department of Applied Cognitive Psychology, Ulm University, Ulm, Germany
- Lucilla Cardinali
- Cognition, Motion and Neuroscience Unit, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
- Alessandro Farnè
- Integrative Multisensory Perception Action and Cognition Team (ImpAct), Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Lyon, France; University of Lyon 1, Lyon, France; Hospices Civils de Lyon, Mouvement et Handicap & Neuro-Immersion, Lyon, France; Center for Mind/Brain Sciences, University of Trento, Trento, Italy
- Claudio Brozzoli
- Integrative Multisensory Perception Action and Cognition Team (ImpAct), Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, Lyon, France; University of Lyon 1, Lyon, France; Hospices Civils de Lyon, Mouvement et Handicap & Neuro-Immersion, Lyon, France; Institutionen för Neurobiologi, Vårdvetenskap och Samhälle, Aging Research Center, Karolinska Institutet, Stockholm, Sweden
6
Chen CF, Kreutz-Delgado K, Sereno MI, Huang RS. Unraveling the spatiotemporal brain dynamics during a simulated reach-to-eat task. Neuroimage 2019; 185:58-71. [PMID: 30315910] [PMCID: PMC6325169] [DOI: 10.1016/j.neuroimage.2018.10.028] [Received: 06/04/2018] [Revised: 09/11/2018] [Accepted: 10/09/2018]
Abstract
The reach-to-eat task involves a sequence of action components including looking, reaching, grasping, and feeding. While cortical representations of individual action components have been mapped in human functional magnetic resonance imaging (fMRI) studies, little is known about the continuous spatiotemporal dynamics among these representations during the reach-to-eat task. In a periodic event-related fMRI experiment, subjects were scanned while they reached toward a food image, grasped the virtual food, and brought it to their mouth within each 16-s cycle. Fourier-based analysis of fMRI time series revealed periodic signals and noise distributed across the brain. Independent component analysis was used to remove periodic or aperiodic motion artifacts. Time-frequency analysis was used to analyze the temporal characteristics of periodic signals in each voxel. Circular statistics was then used to estimate mean phase angles of periodic signals and select voxels based on the distribution of phase angles. By sorting mean phase angles across regions, we were able to show the real-time spatiotemporal brain dynamics as continuous traveling waves over the cortical surface. The activation sequence consisted of approximately the following stages: (1) stimulus related activations in occipital and temporal cortices; (2) movement planning related activations in dorsal premotor and superior parietal cortices; (3) reaching related activations in primary sensorimotor cortex and supplementary motor area; (4) grasping related activations in postcentral gyrus and sulcus; (5) feeding related activations in orofacial areas. These results suggest that phase-encoded design and analysis can be used to unravel sequential activations among brain regions during a simulated reach-to-eat task.
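The mean phase angles described in this abstract cannot be computed as a naive arithmetic average, because phases wrap around at 2π; the standard circular-statistics approach is to average the unit vectors for each phase and take the angle of the resultant vector. A minimal sketch (the function name is illustrative, not from the paper's code):

```python
import math

def circular_mean(angles):
    """Mean phase angle (radians) via the direction of the resultant vector.

    Each angle contributes a unit vector (cos a, sin a); the circular mean
    is the angle of the sum of those vectors.
    """
    s = sum(math.sin(a) for a in angles)
    c = sum(math.cos(a) for a in angles)
    return math.atan2(s, c)
```

For example, the arithmetic mean of 350° and 10° is 180°, pointing opposite to the data, while the circular mean is 0°, the direction both phases actually cluster around.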
Affiliation(s)
- Ching-Fu Chen
- Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA, 92093, USA
- Kenneth Kreutz-Delgado
- Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA, 92093, USA; Institute for Neural Computation, University of California, San Diego, La Jolla, CA, 92093, USA
- Martin I Sereno
- Department of Psychology and Neuroimaging Center, San Diego State University, San Diego, CA, 92182, USA; Experimental Psychology, University College London, London, WC1H 0AP, UK
- Ruey-Song Huang
- Institute for Neural Computation, University of California, San Diego, La Jolla, CA, 92093, USA.
7
Tactile learning transfer from the hand to the face but not to the forearm implies a special hand-face relationship. Sci Rep 2018; 8:11752. [PMID: 30082760] [PMCID: PMC6079060] [DOI: 10.1038/s41598-018-30183-5] [Received: 02/23/2018] [Accepted: 07/20/2018]
Abstract
In the primary somatosensory cortex, large-scale cortical and perceptual changes have been demonstrated following input deprivation. Recently, we found that the cortical and perceptual changes induced by repetitive somatosensory stimulation (RSS) at a finger transfer to the face. However, whether such cross-border changes are specific to the face remains elusive. Here, we investigated whether RSS-induced acuity changes at the finger can also transfer to the forearm, the body part represented on the other side of the hand representation. Our results confirmed the transfer of tactile learning from the stimulated finger to the lip, but no significant changes were observed at the forearm. A second experiment revealed that the same regions on the forearm exhibited improved tactile acuity when RSS was applied there directly, excluding the possibility of low plastic ability at the arm representation. This also provides the first evidence that RSS can be effective on body parts other than the hand. These results suggest that RSS-induced tactile learning transfers preferentially from the hand to the face rather than to the forearm. This specificity could arise from a stronger functional connectivity between the cortical hand and face representations, reflecting a fundamental coupling between these body parts.
8
Yasuda T, Fukiwake M, Shimokasa K, Mine Y. Investigation of Food Characteristics Modulating Spoon Motions in Skilled Spoon Users: Proposal of a Control Target for the Active Self-feeding Spoon. Advanced Biomedical Engineering 2017. [DOI: 10.14326/abe.6.110]
Affiliation(s)
- Toshitaka Yasuda
- Department of Electronic Engineering, Tokyo National College of Technology
- Midori Fukiwake
- Department of Electronic Engineering, Tokyo National College of Technology
- Kenji Shimokasa
- Course of System Engineering, Division of Industrial Technology, Tsukuba University of Technology
- Yasuhiro Mine
- Department of Human Environment Design, Faculty of Human Life Design, Toyo University
9
Unusual hand postures but not familiar tools show motor equivalence with precision grasping. Cognition 2016; 151:28-36. [DOI: 10.1016/j.cognition.2016.02.013] [Received: 07/31/2015] [Revised: 02/18/2016] [Accepted: 02/18/2016]