1
Agudelo-Toro A, Michaels JA, Sheng WA, Scherberger H. Accurate neural control of a hand prosthesis by posture-related activity in the primate grasping circuit. Neuron 2024; 112:4115-4129.e8. [PMID: 39419024] [DOI: 10.1016/j.neuron.2024.09.018]
Abstract
Brain-computer interfaces (BCIs) have the potential to restore hand movement for people with paralysis, but current devices still lack the fine control required to interact with objects of daily living. Following our understanding of cortical activity during arm reaches, hand BCI studies have focused primarily on velocity control. However, mounting evidence suggests that posture, and not velocity, dominates in hand-related areas. To explore whether this signal can causally control a prosthesis, we developed a BCI training paradigm centered on the reproduction of posture transitions. Monkeys trained with this protocol were able to control a multidimensional hand prosthesis with high accuracy, including execution of the very intricate precision grip. Analysis revealed that the posture signal in the target grasping areas was the main contributor to control. We present, for the first time, neural posture control of a multidimensional hand prosthesis, opening the door for future interfaces to leverage this additional information channel.
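The abstract does not spell out the decoding step; as a purely illustrative sketch of posture-based (rather than velocity-based) decoding, the snippet below fits a ridge regression from simulated binned firing rates to hand joint angles. The array shapes, simulated data, and choice of regressor are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch of posture decoding: binned firing rates -> hand joint angles.
# Not the paper's method; shapes, data, and the ridge regressor are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_bins, n_units, n_joints = 5000, 96, 27                         # assumed sizes
rates = rng.poisson(5, size=(n_bins, n_units)).astype(float)     # binned firing rates
true_map = rng.normal(size=(n_units, n_joints))
angles = rates @ true_map + rng.normal(scale=5.0, size=(n_bins, n_joints))  # joint angles (posture)

X_tr, X_te, y_tr, y_te = train_test_split(rates, angles, test_size=0.2, random_state=0)
decoder = Ridge(alpha=1.0).fit(X_tr, y_tr)                        # linear posture decoder
print("held-out R^2:", decoder.score(X_te, y_te))
```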
Affiliation(s)
- Andres Agudelo-Toro
- Neurobiology Laboratory, Deutsches Primatenzentrum GmbH, Göttingen 37077, Germany
- Jonathan A Michaels
- Neurobiology Laboratory, Deutsches Primatenzentrum GmbH, Göttingen 37077, Germany; School of Kinesiology and Health Science, Faculty of Health, York University, Toronto, ON M3J 1P3, Canada
- Wei-An Sheng
- Neurobiology Laboratory, Deutsches Primatenzentrum GmbH, Göttingen 37077, Germany; Institute of Biomedical Sciences, Academia Sinica, Taipei 115, Taiwan
- Hansjörg Scherberger
- Neurobiology Laboratory, Deutsches Primatenzentrum GmbH, Göttingen 37077, Germany; Faculty of Biology and Psychology, University of Göttingen, Göttingen 37073, Germany
2
Maranesi M, Lanzilotto M, Arcuri E, Bonini L. Mixed selectivity in monkey anterior intraparietal area during visual and motor processes. Prog Neurobiol 2024; 236:102611. [PMID: 38604583] [DOI: 10.1016/j.pneurobio.2024.102611]
Abstract
Classical studies suggest that the anterior intraparietal area (AIP) contributes to the encoding of specific information, such as objects and actions of self and others, through a variety of neuronal classes, including canonical, motor, and mirror neurons. However, these studies typically focused on a single variable, leaving it unclear whether distinct sets of AIP neurons encode a single or multiple sources of information and how multimodal coding emerges. Here, we chronically recorded monkey AIP neurons in a variety of tasks and conditions classically employed in separate experiments. Most cells exhibited mixed selectivity for observed objects, executed actions, and observed actions, enhanced when this information came from the monkey's peripersonal working space. In contrast with the classical view, our findings indicate that multimodal coding emerges in AIP from partially mixed selectivity of individual neurons for a variety of information relevant for planning actions directed to both physical objects and other subjects.
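As a hedged illustration of how mixed selectivity can be quantified (not the analysis used in the paper), the sketch below computes, for each simulated unit, the fraction of firing-rate variance explained by two task factors and flags units modulated by both; the factor names, data, and thresholds are assumptions.

```python
# Hypothetical sketch: flag "mixed selectivity" when a unit's firing varies with
# more than one task factor. Factor names, levels, and data are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_units = 240, 50
task = rng.integers(0, 3, n_trials)      # 0=object viewing, 1=action execution, 2=action observation
space = rng.integers(0, 2, n_trials)     # 0=peripersonal, 1=extrapersonal
rates = rng.normal(10, 2, (n_trials, n_units))
rates[:, :20] += 3 * task[:, None] + 2 * space[:, None]   # some units carry both factors

def eta_squared(y, labels):
    """Fraction of each unit's firing-rate variance explained by a factor's group means."""
    grand = y.mean(axis=0)
    ss_tot = ((y - grand) ** 2).sum(axis=0)
    ss_fac = sum(((y[labels == g].mean(axis=0) - grand) ** 2) * (labels == g).sum()
                 for g in np.unique(labels))
    return ss_fac / ss_tot

eta_task, eta_space = eta_squared(rates, task), eta_squared(rates, space)
mixed = (eta_task > 0.05) & (eta_space > 0.05)   # arbitrary illustrative threshold
print("units with mixed selectivity:", int(mixed.sum()), "of", n_units)
```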
Affiliation(s)
- Monica Maranesi
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
- Marco Lanzilotto
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
- Edoardo Arcuri
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
- Luca Bonini
- Department of Medicine and Surgery, University of Parma, Parma 43125, Italy
3
Ali Y, Montani V, Cesari P. Neural underpinnings of the interplay between actual touch and action imagination in social contexts. Front Hum Neurosci 2024; 17:1274299. [PMID: 38292652] [PMCID: PMC10826515] [DOI: 10.3389/fnhum.2023.1274299]
Abstract
While there is established evidence supporting the involvement of the sense of touch in various actions, the neural underpinnings of the interplay between touch and action in a social context remain poorly understood. To investigate this phenomenon, we combined motor and sensory components by asking participants to imagine exerting force with the index finger while experiencing their own touch, the touch of another individual, the touch of a surface, or no touch. Based on the assumption that the patterns of activation in the motor system are similar when an action is imagined or actually performed, we applied single-pulse transcranial magnetic stimulation over the primary motor cortex (M1) while participants engaged in the act of imagination. Touch experience was associated with higher M1 excitability both in the presence and in the absence of force-production imagination, but M1 excitability differed among the types of touch only during force-production imagination: both biological sources, self-touch and the touch of another individual, elicited a significant increase in motor system activity compared with touching a non-living surface or receiving no touch. A strong correlation between individual touch-avoidance questionnaire scores and facilitation of the motor system was present while touching another person, indicating a social aspect of touch in action. The present study reveals the motor system correlates of touch when its sensory and motor components are considered in social contexts.
Affiliation(s)
- Paola Cesari
- Department of Neuroscience, Biomedicine and Movement Sciences, University of Verona, Verona, Italy
4
Okatan M, Kocatürk M. Decoding the Spike-Band Subthreshold Motor Cortical Activity. J Mot Behav 2023; 56:161-183. [PMID: 37964432] [DOI: 10.1080/00222895.2023.2280263]
Abstract
Intracortical Brain-Computer Interfaces (iBCIs) use single-unit activity (SUA), multiunit activity (MUA) and local field potentials (LFP) to control neuroprosthetic devices. SUA and MUA are usually extracted from the bandpassed recording through amplitude thresholding, while subthreshold data are ignored. Here, we show that subthreshold data can actually be decoded to determine behavioral variables with test set accuracy of up to 100%. Although the utility of SUA, MUA and LFP for decoding behavioral variables has been explored previously, this study investigates the utility of spike-band subthreshold activity exclusively. We provide evidence suggesting that this activity can be used to keep decoding performance at acceptable levels even when SUA quality is reduced over time. To the best of our knowledge, the signals that we derive from the subthreshold activity may be the weakest neural signals that have ever been extracted from extracellular neural recordings, while still being decodable with test set accuracy of up to 100%. These results are relevant for the development of fully data-driven and automated methods for amplitude thresholding spike-band extracellular neural recordings in iBCIs containing thousands of electrodes.
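The amplitude-thresholding step the abstract refers to is illustrated below with a common convention (a threshold of -4.5 times a median-based noise estimate); this is a minimal sketch on simulated data, not the authors' pipeline.

```python
# Illustrative amplitude thresholding of a band-passed extracellular trace.
# The threshold rule (-4.5 * sigma with sigma = median(|x|)/0.6745) is a common
# convention assumed here for illustration; it is not taken from the cited paper.
import numpy as np

rng = np.random.default_rng(2)
fs = 30000                                    # assumed sampling rate (Hz)
x = rng.normal(0, 10, fs)                     # 1 s of simulated spike-band data (microvolts)
x[::3000] -= 80                               # add a few spike-like deflections

sigma = np.median(np.abs(x)) / 0.6745         # robust noise estimate
threshold = -4.5 * sigma
spike_mask = x < threshold                    # suprathreshold samples (candidate spikes)
subthreshold = x[~spike_mask]                 # everything else: the "ignored" data

print(f"threshold = {threshold:.1f} uV, "
      f"suprathreshold samples = {spike_mask.sum()}, "
      f"subthreshold samples kept = {subthreshold.size}")
```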
Affiliation(s)
- Murat Okatan
- Informatics Institute, Istanbul Technical University, Istanbul, Türkiye
- Artificial Intelligence and Data Engineering Department, Istanbul Technical University, Istanbul, Türkiye
- Mehmet Kocatürk
- Biomedical Engineering Department, Istanbul Medipol University, Istanbul, Türkiye
- Research Institute for Health Sciences and Technologies (SABITA), Istanbul Medipol University, Istanbul, Türkiye
5
Marciniak Dg Agra K, Dg Agra P. F = ma. Is the macaque brain Newtonian? Cogn Neuropsychol 2023; 39:376-408. [PMID: 37045793] [DOI: 10.1080/02643294.2023.2191843]
Abstract
Intuitive physics, the ability to anticipate how physical events involving objects with mass unfold in time and space, is a central component of intelligent systems. Intuitive physics is a promising tool for gaining insight into mechanisms that generalize across species, because both humans and non-human primates are subject to the same physical constraints when engaging with the environment. Physical reasoning abilities are widely present within the animal kingdom, and monkeys in particular, with acute 3D vision and a high level of dexterity, appreciate and manipulate the physical world in much the same way humans do.
Affiliation(s)
- Karolina Marciniak Dg Agra
- The Rockefeller University, Laboratory of Neural Circuits, New York, NY, USA
- Center for Brain, Minds and Machines, Cambridge, MA, USA
- Pedro Dg Agra
- The Rockefeller University, Laboratory of Neural Circuits, New York, NY, USA
- Center for Brain, Minds and Machines, Cambridge, MA, USA
6
Sobinov AR, Bensmaia SJ. The neural mechanisms of manual dexterity. Nat Rev Neurosci 2021; 22:741-757. [PMID: 34711956] [DOI: 10.1038/s41583-021-00528-7]
Abstract
The hand endows us with unparalleled precision and versatility in our interactions with objects, from mundane activities such as grasping to extraordinary ones such as virtuoso pianism. The complex anatomy of the human hand combined with expansive and specialized neuronal control circuits allows a wide range of precise manual behaviours. To support these behaviours, an exquisite sensory apparatus, spanning the modalities of touch and proprioception, conveys detailed and timely information about our interactions with objects and about the objects themselves. The study of manual dexterity provides a unique lens into the sensorimotor mechanisms that endow the nervous system with the ability to flexibly generate complex behaviour.
Affiliation(s)
- Anton R Sobinov
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, IL, USA; Neuroscience Institute, University of Chicago, Chicago, IL, USA
- Sliman J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, IL, USA; Neuroscience Institute, University of Chicago, Chicago, IL, USA; Committee on Computational Neuroscience, University of Chicago, Chicago, IL, USA
7
Dekleva BM, Weiss JM, Boninger ML, Collinger JL. Generalizable cursor click decoding using grasp-related neural transients. J Neural Eng 2021; 18. [PMID: 34289456] [DOI: 10.1088/1741-2552/ac16b2]
Abstract
Objective. Intracortical brain-computer interfaces (iBCI) have the potential to restore independence for individuals with significant motor or communication impairments. One of the most realistic avenues for clinical translation of iBCI technology is enabling control of a computer cursor, i.e., movement-related neural activity is interpreted (decoded) and used to drive cursor function. Here we aim to improve cursor click decoding to allow for both point-and-click and click-and-drag control. Approach. Using chronic microelectrode arrays implanted in the motor cortex of two participants with tetraplegia, we identified prominent neural responses related to attempted hand grasp. We then developed a new approach for decoding cursor click (hand grasp) based on the most salient responses. Main results. We found that the population-wide response contained three dominant components related to hand grasp: an onset transient response, a sustained response, and an offset transient response. The transient responses were larger in magnitude, and thus more reliably detected, than the sustained response, and a click decoder based on these transients outperformed the standard approach of binary state classification. Significance. A transient-based approach for identifying hand grasp can provide a high degree of cursor click control for both point-and-click and click-and-drag applications. This generalized click functionality is an important step toward high-performance cursor control and eventual clinical translation of iBCI technology.
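To make the transient idea concrete, here is a minimal sketch that detects grasp onset and offset transients by thresholding the derivative of a smoothed population rate; the simulated signal, smoothing width, and thresholds are assumptions, not the decoder described in the paper.

```python
# Illustrative transient detector: threshold the derivative of a smoothed
# population firing rate to mark grasp onset ("click down") and offset ("click up").
# All parameters and the simulated signal are assumed for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter1d

dt = 0.02                                     # 20 ms bins (assumed)
t = np.arange(0, 10, dt)
rate = np.zeros_like(t)
rate[(t > 3) & (t < 6)] = 1.0                 # simulated sustained grasp between 3 s and 6 s
rate += np.random.default_rng(3).normal(0, 0.05, t.size)

smoothed = gaussian_filter1d(rate, sigma=3)
drate = np.gradient(smoothed, dt)             # rate of change of population activity

onsets = np.flatnonzero((drate[:-1] < 2.0) & (drate[1:] >= 2.0)) + 1     # onset transients
offsets = np.flatnonzero((drate[:-1] > -2.0) & (drate[1:] <= -2.0)) + 1  # offset transients
print("grasp onset at ~", t[onsets], "s, offset at ~", t[offsets], "s")
```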
Affiliation(s)
- Brian M Dekleva
- Rehab Neural Engineering Labs, University of Pittsburgh, Pittsburgh, PA, United States of America; Department of Physical Medicine and Rehabilitation, University of Pittsburgh, Pittsburgh, PA, United States of America; Center for the Neural Basis of Cognition, Pittsburgh, PA, United States of America
- Jeffrey M Weiss
- Rehab Neural Engineering Labs, University of Pittsburgh, Pittsburgh, PA, United States of America
- Michael L Boninger
- Rehab Neural Engineering Labs, University of Pittsburgh, Pittsburgh, PA, United States of America; Department of Physical Medicine and Rehabilitation, University of Pittsburgh, Pittsburgh, PA, United States of America; Bioengineering, University of Pittsburgh, Pittsburgh, PA, United States of America; Department of Veterans Affairs, Human Engineering Research Labs, VA Center of Excellence, Pittsburgh, PA, United States of America
- Jennifer L Collinger
- Rehab Neural Engineering Labs, University of Pittsburgh, Pittsburgh, PA, United States of America; Department of Physical Medicine and Rehabilitation, University of Pittsburgh, Pittsburgh, PA, United States of America; Bioengineering, University of Pittsburgh, Pittsburgh, PA, United States of America; Center for the Neural Basis of Cognition, Pittsburgh, PA, United States of America; Department of Veterans Affairs, Human Engineering Research Labs, VA Center of Excellence, Pittsburgh, PA, United States of America; Department of Biomedical Engineering, Carnegie Mellon University, Pittsburgh, PA, United States of America
8
Buchwald D, Schaffelhofer S, Dörge M, Dann B, Scherberger H. A Turntable Setup for Testing Visual and Tactile Grasping Movements in Non-human Primates. Front Behav Neurosci 2021; 15:648483. [PMID: 34113241] [PMCID: PMC8185519] [DOI: 10.3389/fnbeh.2021.648483]
Abstract
Grasping movements are some of the most common movements primates make every day. They are important for social interactions as well as for picking up objects or food. Usually, these grasping movements are guided by vision, but proprioceptive and haptic inputs also contribute greatly. Since grasping behaviors are common and easy to motivate, they represent an ideal task for understanding the role of different brain areas during planning and execution of complex voluntary movements in primates. For experimental purposes, a stable and repeatable presentation of the same object, as well as the variation of objects, is important in order to understand the neural control of movement generation. This is even more the case when investigating the role of different senses in movement planning, where objects need to be presented in specific sensory modalities. We developed a turntable setup for non-human primates (macaque monkeys) to investigate visually and tactually guided grasping movements, with an option to easily exchange objects. The setup consists of a turntable that can fit six different objects and can be exchanged easily during the experiment to increase the number of presented objects. The object turntable is connected to a stepper motor through a belt system to automate rotation and hence object presentation. By increasing the distance between the turntable and the stepper motor, metallic components of the stepper motor are kept away from the actual recording setup, which allows a magnetic-based data glove to be used to track hand kinematics. During task execution, the animal sits in the dark and is instructed to grasp the object in front of it. A light above the object can be turned on for visual presentation of the objects, while the object can also remain in the dark for exclusive tactile exploration. A red LED is projected onto the object via a one-way mirror and serves as a cue instructing the animal to start grasping the object. By allowing kinematic data from the magnetic-based data glove to be compared with simultaneously recorded neural signals, this setup enables the systematic investigation of neural population activity involved in the neural control of hand grasping movements.
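As a toy back-of-the-envelope sketch of the automated rotation, the snippet below computes how many motor steps advance the turntable by one of its six object positions; the motor resolution, microstepping, and belt ratio are assumed values, not the published hardware specification.

```python
# Hypothetical step calculation for advancing a six-object turntable by one position.
# Motor resolution, microstepping, and belt (pulley) ratio are illustrative assumptions.
FULL_STEPS_PER_REV = 200      # typical 1.8-degree stepper (assumed)
MICROSTEPPING = 16            # driver microstep setting (assumed)
BELT_RATIO = 3.0              # turntable pulley teeth / motor pulley teeth (assumed)
N_OBJECTS = 6                 # six objects per turntable, as in the setup

steps_per_turntable_rev = FULL_STEPS_PER_REV * MICROSTEPPING * BELT_RATIO
steps_per_object = steps_per_turntable_rev / N_OBJECTS
print(f"{steps_per_object:.0f} microsteps advance the turntable by one object position")
# -> 1600 microsteps with the assumed numbers (200 * 16 * 3 / 6)
```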
Affiliation(s)
- Daniela Buchwald
- Neuroscience Laboratory, Deutsches Primatenzentrum GmbH, Göttingen, Germany
- Faculty of Biology and Psychology, University of Goettingen, Göttingen, Germany
- Matthias Dörge
- Neuroscience Laboratory, Deutsches Primatenzentrum GmbH, Göttingen, Germany
- Benjamin Dann
- Neuroscience Laboratory, Deutsches Primatenzentrum GmbH, Göttingen, Germany
- Hansjörg Scherberger
- Neuroscience Laboratory, Deutsches Primatenzentrum GmbH, Göttingen, Germany
- Faculty of Biology and Psychology, University of Goettingen, Göttingen, Germany
9
Mroczkowski CA, Niechwiej-Szwedo E. Stereopsis contributes to the predictive control of grip forces during prehension. Exp Brain Res 2021; 239:1345-1358. [PMID: 33661370] [DOI: 10.1007/s00221-021-06052-5]
Abstract
Binocular viewing is associated with a superior prehensile performance, which is particularly evident in the latter part of the reach as the hand approaches and makes contact with the target object. However, the visuomotor mechanisms through which binocular vision serves prehension are not fully understood. This study assessed the role of stereopsis in the predictive control of grasping by measuring grip force. Twenty participants performed a precision reach-to-grasp task in four viewing conditions: binocular, monocular, and with reduced stereoacuity (200 arc sec, > 400 arc sec). Monocular viewing, compared to binocular viewing, was associated with a fourfold increase in grasp errors, a 56% increase in grasp duration, a 22% decrease in grip force at 50 ms following grasp initiation, and a 40% later occurrence of peak force after grasp initiation (all p < 0.05). Grasp performance was also disrupted when viewing with reduced stereoacuity. Notably, grip force at the time of object lift-off was comparable between all viewing conditions. These results demonstrate that binocular stereopsis contributes to the efficient programming of grip forces. Specifically, stereopsis may provide important sensory information that enables the central nervous system to engage in predictive control of grasping.
Affiliation(s)
- Corey A Mroczkowski
- Department of Kinesiology, University of Waterloo, 200 University Ave W, Waterloo, ON, N2L 5G1, Canada
- Ewa Niechwiej-Szwedo
- Department of Kinesiology, University of Waterloo, 200 University Ave W, Waterloo, ON, N2L 5G1, Canada
10
The Neural Representation of Force across Grasp Types in Motor Cortex of Humans with Tetraplegia. eNeuro 2021; 8:ENEURO.0231-20.2020. [PMID: 33495242] [PMCID: PMC7920535] [DOI: 10.1523/eneuro.0231-20.2020]
Abstract
Intracortical brain-computer interfaces (iBCIs) have the potential to restore hand grasping and object interaction to individuals with tetraplegia. Optimal grasping and object interaction require simultaneous production of both force and grasp outputs. However, since overlapping neural populations are modulated by both parameters, grasp type could affect how well forces are decoded from motor cortex in a closed-loop force iBCI. Therefore, this work quantified the neural representation and offline decoding performance of discrete hand grasps and force levels in two human participants with tetraplegia. Participants attempted to produce three discrete forces (light, medium, hard) using up to five hand grasp configurations. A two-way Welch ANOVA was implemented on multiunit neural features to assess their modulation by force and grasp. Demixed principal component analysis (dPCA) was used to assess for population-level tuning to force and grasp and to predict these parameters from neural activity. Three major findings emerged from this work: (1) force information was neurally represented and could be decoded across multiple hand grasps (and, in one participant, across attempted elbow extension as well); (2) grasp type affected force representation within multiunit neural features and offline force classification accuracy; and (3) grasp was classified more accurately and had greater population-level representation than force. These findings suggest that force and grasp have both independent and interacting representations within cortex, and that incorporating force control into real-time iBCI systems is feasible across multiple hand grasps if the decoder also accounts for grasp type.
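A minimal sketch of the kind of cross-grasp generalization test implied by finding (1): a force classifier trained on all but one grasp type and evaluated on the held-out grasp. The simulated features and the LDA classifier are assumptions standing in for the study's dPCA-based analyses.

```python
# Illustrative leave-one-grasp-out test of force decoding from multiunit features.
# LDA and the simulated data stand in for the study's analyses; they are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
n_trials, n_features, n_grasps, n_forces = 600, 192, 5, 3
grasp = rng.integers(0, n_grasps, n_trials)
force = rng.integers(0, n_forces, n_trials)            # 0=light, 1=medium, 2=hard
X = rng.normal(0, 1, (n_trials, n_features))
X[:, :40] += 1.5 * force[:, None]                      # force signal shared across grasps
X[:, 40:80] += 1.0 * grasp[:, None]                    # grasp signal

accs = []
for held_out in range(n_grasps):                       # train on 4 grasps, test on the 5th
    train, test = grasp != held_out, grasp == held_out
    clf = LinearDiscriminantAnalysis().fit(X[train], force[train])
    accs.append(clf.score(X[test], force[test]))
print("force accuracy on held-out grasps:", np.round(accs, 2))
```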
11
Greulich RS, Adam R, Everling S, Scherberger H. Shared functional connectivity between the dorso-medial and dorso-ventral streams in macaques. Sci Rep 2020; 10:18610. [PMID: 33122655] [PMCID: PMC7596572] [DOI: 10.1038/s41598-020-75219-x]
Abstract
Manipulation of an object requires us to transport our hand towards the object (reach) and close our digits around that object (grasp). In current models, reach-related information is propagated in the dorso-medial stream from posterior parietal area V6A to the medial intraparietal area, dorsal premotor cortex, and primary motor cortex. Grasp-related information is processed in the dorso-ventral stream from the anterior intraparietal area to ventral premotor cortex and the hand area of primary motor cortex. However, recent studies have cast doubt on the validity of this separation into distinct processing streams. We investigated the whole-brain functional connectivity of these areas in 10 male rhesus macaques using resting-state fMRI at 7 T. Although we found a clear separation between dorso-medial and dorso-ventral network connectivity in support of the two-stream hypothesis, we also found evidence of shared connectivity between these networks. The dorso-ventral network was distinctly correlated with high-order somatosensory areas and feeding-related areas, whereas the dorso-medial network was correlated with visual areas and trunk/hindlimb motor areas. Shared connectivity was found in the superior frontal and precentral gyrus, central sulcus, intraparietal sulcus, precuneus, and insular cortex. These results suggest that while sensorimotor processing streams are functionally separated, they can access information through shared areas.
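A minimal sketch of resting-state functional connectivity as an ROI-by-ROI correlation matrix; the ROI labels follow the areas named in the abstract, but the simulated time series and two-network structure are assumptions for illustration only.

```python
# Illustrative resting-state functional connectivity: correlate ROI time series.
# ROI labels follow the abstract; the simulated BOLD signals are assumptions.
import numpy as np

rng = np.random.default_rng(5)
rois = ["V6A", "MIP", "PMd", "AIP", "PMv", "M1-hand"]
n_timepoints = 400
shared_dorsomedial = rng.normal(size=n_timepoints)
shared_dorsoventral = rng.normal(size=n_timepoints)

ts = np.column_stack([
    shared_dorsomedial + 0.7 * rng.normal(size=n_timepoints),   # V6A
    shared_dorsomedial + 0.7 * rng.normal(size=n_timepoints),   # MIP
    shared_dorsomedial + 0.7 * rng.normal(size=n_timepoints),   # PMd
    shared_dorsoventral + 0.7 * rng.normal(size=n_timepoints),  # AIP
    shared_dorsoventral + 0.7 * rng.normal(size=n_timepoints),  # PMv
    0.5 * shared_dorsomedial + 0.5 * shared_dorsoventral        # shared node
    + 0.7 * rng.normal(size=n_timepoints),
])

fc = np.corrcoef(ts.T)                     # ROI x ROI functional connectivity matrix
for name, row in zip(rois, np.round(fc, 2)):
    print(f"{name:>8}: {row}")
```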
Affiliation(s)
- R Stefan Greulich
- Deutsches Primatenzentrum GmbH, Kellnerweg 4, 37077 Göttingen, Germany; Faculty of Biology and Psychology, University of Goettingen, Göttingen, Germany
- Ramina Adam
- Robarts Research Institute, University of Western Ontario, London, Canada; Graduate Program in Neuroscience, University of Western Ontario, London, Canada
- Stefan Everling
- Robarts Research Institute, University of Western Ontario, London, Canada; Department of Physiology and Pharmacology, University of Western Ontario, London, Canada
- Hansjörg Scherberger
- Deutsches Primatenzentrum GmbH, Kellnerweg 4, 37077 Göttingen, Germany; Faculty of Biology and Psychology, University of Goettingen, Göttingen, Germany
12
The Representation of Finger Movement and Force in Human Motor and Premotor Cortices. eNeuro 2020; 7:ENEURO.0063-20.2020. [PMID: 32769159] [PMCID: PMC7438059] [DOI: 10.1523/eneuro.0063-20.2020]
Abstract
The ability to grasp and manipulate objects requires controlling both finger movement kinematics and isometric force in rapid succession. Previous work suggests that these behavioral modes are controlled separately, but it is unknown whether the cerebral cortex represents them differently. Here, we asked how movement and force are represented cortically when executed sequentially with the same finger. We recorded high-density electrocorticography (ECoG) from the motor and premotor cortices of seven human subjects performing a movement-force motor task. We decoded finger movement [0.7 ± 0.3 fractional variance accounted for (FVAF)] and force (0.7 ± 0.2 FVAF) with high accuracy, yet found different spatial representations. In addition, we used a state-of-the-art deep learning method to uncover smooth, repeatable trajectories through ECoG state space during the movement-force task. We also summarized ECoG across trials and participants by developing a new metric, the neural vector angle (NVA). Thus, state-space techniques can help to investigate broad cortical networks. Finally, we were able to classify the behavioral mode from neural signals with high accuracy (90 ± 6%). Thus, finger movement and force appear to have distinct representations in motor/premotor cortices. These results inform our understanding of the neural control of movement, as well as the design of grasp brain-machine interfaces (BMIs).
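FVAF (fractional variance accounted for) is a standard decoding metric; the sketch below uses the usual 1 - SSE/SS_total definition, assumed here to match the paper's usage, on toy data.

```python
# Fractional variance accounted for (FVAF): 1 - SSE / total variance of the target.
# Standard definition assumed here; the behavioral trace and decoder output are toy data.
import numpy as np

def fvaf(y_true, y_pred):
    sse = np.sum((y_true - y_pred) ** 2)
    sst = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - sse / sst

rng = np.random.default_rng(6)
finger_force = np.sin(np.linspace(0, 8 * np.pi, 1000))          # true behavioral trace
decoded = finger_force + rng.normal(0, 0.3, finger_force.size)  # simulated decoder output
print(f"FVAF = {fvaf(finger_force, decoded):.2f}")
```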
13
Stavisky SD, Willett FR, Wilson GH, Murphy BA, Rezaii P, Avansino DT, Memberg WD, Miller JP, Kirsch RF, Hochberg LR, Ajiboye AB, Druckmann S, Shenoy KV, Henderson JM. Neural ensemble dynamics in dorsal motor cortex during speech in people with paralysis. eLife 2019; 8:e46015. [PMID: 31820736] [PMCID: PMC6954053] [DOI: 10.7554/elife.46015]
Abstract
Speaking is a sensorimotor behavior whose neural basis is difficult to study with single neuron resolution due to the scarcity of human intracortical measurements. We used electrode arrays to record from the motor cortex 'hand knob' in two people with tetraplegia, an area not previously implicated in speech. Neurons modulated during speaking and during non-speaking movements of the tongue, lips, and jaw. This challenges whether the conventional model of a 'motor homunculus' division by major body regions extends to the single-neuron scale. Spoken words and syllables could be decoded from single trials, demonstrating the potential of intracortical recordings for brain-computer interfaces to restore speech. Two neural population dynamics features previously reported for arm movements were also present during speaking: a component that was mostly invariant across initiating different words, followed by rotatory dynamics during speaking. This suggests that common neural dynamical motifs may underlie movement of arm and speech articulators.
Affiliation(s)
- Sergey D Stavisky
- Department of Neurosurgery, Stanford University, Stanford, United States
- Department of Electrical Engineering, Stanford University, Stanford, United States
- Francis R Willett
- Department of Neurosurgery, Stanford University, Stanford, United States
- Department of Electrical Engineering, Stanford University, Stanford, United States
- Guy H Wilson
- Neurosciences Program, Stanford University, Stanford, United States
- Brian A Murphy
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, United States
- FES Center, Rehab R&D Service, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, United States
- Paymon Rezaii
- Department of Neurosurgery, Stanford University, Stanford, United States
- William D Memberg
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, United States
- FES Center, Rehab R&D Service, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, United States
- Jonathan P Miller
- FES Center, Rehab R&D Service, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, United States
- Department of Neurosurgery, University Hospitals Cleveland Medical Center, Cleveland, United States
- Robert F Kirsch
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, United States
- FES Center, Rehab R&D Service, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, United States
- Leigh R Hochberg
- VA RR&D Center for Neurorestoration and Neurotechnology, Rehabilitation R&D Service, Providence VA Medical Center, Providence, United States
- Center for Neurotechnology and Neurorecovery, Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, United States
- School of Engineering and Robert J. & Nandy D. Carney Institute for Brain Science, Brown University, Providence, United States
- A Bolu Ajiboye
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, United States
- FES Center, Rehab R&D Service, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, United States
- Shaul Druckmann
- Department of Neurobiology, Stanford University, Stanford, United States
- Krishna V Shenoy
- Department of Electrical Engineering, Stanford University, Stanford, United States
- Department of Neurobiology, Stanford University, Stanford, United States
- Department of Bioengineering, Stanford University, Stanford, United States
- Howard Hughes Medical Institute, Stanford University, Stanford, United States
- Wu Tsai Neurosciences Institute, Stanford University, Stanford, United States
- Bio-X Program, Stanford University, Stanford, United States
- Jaimie M Henderson
- Department of Neurosurgery, Stanford University, Stanford, United States
- Wu Tsai Neurosciences Institute, Stanford University, Stanford, United States
- Bio-X Program, Stanford University, Stanford, United States
14
Piovesan D, Kolesnikov M, Lynch K, Mussa-Ivaldi FA. The Concurrent Control of Motion and Contact Force in the Presence of Predictable Disturbances. Journal of Mechanisms and Robotics 2019; 11:060903. [PMID: 34163561] [PMCID: PMC8208241] [DOI: 10.1115/1.4044599]
Abstract
The simultaneous control of force and motion is important in everyday activities when humans interact with objects. While many studies have analyzed the control of movement within a perturbing force field, few have investigated the dual problem of controlling a contact force under non-isometric conditions. The mechanism by which the central nervous system controls forces during movements is still unclear, and it can be elucidated by estimating the mechanical properties of the arm during tasks with concurrent motion and contact force goals. We investigated how arm mechanics change when a force control task is accomplished during low-frequency positional perturbations of the arm. Contrary to many force regulation algorithms implemented in robotics, where contact impedance is decreased to reduce force fluctuations in response to position disturbances, we observed a steady increase of arm endpoint stiffness as the task progressed. Based on this evidence, we propose a theoretical framework suggesting that an internal model of the perturbing trajectory is formed. We observed that force regulation in the presence of predictable positional disturbances is implemented using a position control strategy together with modulation of the endpoint stiffness magnitude, with the major axis of the endpoint stiffness ellipse oriented toward the desired force.
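The orientation and magnitude of the endpoint stiffness ellipse follow from the eigendecomposition of the 2x2 endpoint stiffness matrix; the sketch below shows the computation on made-up stiffness values, not data from the study.

```python
# Endpoint stiffness ellipse from a 2x2 stiffness matrix K (force = -K @ displacement).
# The numeric values of K are illustrative, not measured data from the study.
import numpy as np

K = np.array([[300.0, 80.0],      # N/m, hand-plane stiffness (assumed values)
              [80.0, 500.0]])
K_sym = 0.5 * (K + K.T)           # ellipse is defined by the symmetric part

eigvals, eigvecs = np.linalg.eigh(K_sym)      # eigenvalues in ascending order
major_axis = eigvecs[:, np.argmax(eigvals)]   # direction of maximal stiffness
orientation_deg = np.degrees(np.arctan2(major_axis[1], major_axis[0]))

print("stiffness magnitudes (N/m):", np.round(eigvals, 1))
print(f"major-axis orientation: {orientation_deg:.1f} deg from the x axis")
```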
Affiliation(s)
- Davide Piovesan
- Department of Biomedical Industrial and Systems Engineering, Gannon University, 109 University Square, Erie, PA 16541
- Kevin Lynch
- Department of Mechanical Engineering, Northwestern University, 2145 Sheridan Road, Evanston, IL 60208
- Ferdinando A. Mussa-Ivaldi
- The Shirley Ryan Ability Lab, 355 E Erie Street, Chicago, IL 60611
- Department of Physiology, Northwestern University, M211 303 E. Chicago Avenue, Chicago, IL 60611
15
Multisensory Neurons in the Primate Amygdala. J Neurosci 2019; 39:3663-3675. [PMID: 30858163] [DOI: 10.1523/jneurosci.2903-18.2019]
Abstract
Animals identify, interpret, and respond to complex, natural signals that are often multisensory. The ability to integrate signals across sensory modalities depends on the convergence of sensory inputs at the level of single neurons. Neurons in the amygdala are expected to be multisensory because they respond to complex, natural stimuli, and the amygdala receives inputs from multiple sensory areas. We recorded activity from the amygdala of 2 male monkeys (Macaca mulatta) in response to visual, tactile, and auditory stimuli. Although the stimuli were devoid of inherent emotional or social significance and were not paired with rewards or punishments, the majority of neurons that responded to these stimuli were multisensory. Selectivity for sensory modality was stronger and emerged earlier than selectivity for individual items within a sensory modality. Modality and item selectivity were expressed via three main spike-train metrics: (1) response magnitude, (2) response polarity, and (3) response duration. None of these metrics were unique to a particular sensory modality; rather, each neuron responded with distinct combinations of spike-train metrics to discriminate sensory modalities and items within a modality. The relative proportion of multisensory neurons was similar across the nuclei of the amygdala. The convergence of inputs from multiple sensory modalities at the level of single neurons in the amygdala lays the foundation for multisensory integration. The integration of visual, auditory, and tactile inputs in the amygdala may serve social communication by binding together social signals carried by facial expressions, vocalizations, and social grooming. SIGNIFICANCE STATEMENT: Our brain continuously decodes information detected by multiple sensory systems. The emotional and social significance of the incoming signals is likely extracted by the amygdala, which receives input from all sensory domains. Here we show that a large portion of neurons in the amygdala respond to stimuli from two or more sensory modalities. The convergence of visual, tactile, and auditory signals at the level of individual neurons in the amygdala establishes a foundation for multisensory integration within this structure. The ability to integrate signals across sensory modalities is critical for social communication and other high-level cognitive functions.
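As a hedged illustration of the three spike-train metrics named in the abstract, the sketch below compares a response window against a baseline window for one simulated neuron; the bin size, windows, and the 2-SD duration criterion are assumptions, not the paper's exact definitions.

```python
# Illustrative computation of response magnitude, polarity, and duration for one
# neuron's peri-stimulus time histogram (PSTH). Windows and criteria are assumed.
import numpy as np

rng = np.random.default_rng(7)
bin_s = 0.01
t = np.arange(-0.5, 1.0, bin_s)                       # time relative to stimulus onset (s)
psth = rng.poisson(1, t.size).astype(float) / bin_s   # baseline ~100 spikes/s
psth[(t > 0.05) & (t < 0.35)] += 400.0                # simulated excitatory response

baseline = psth[t < 0]
response = psth[(t >= 0) & (t <= 0.5)]

magnitude = response.mean() - baseline.mean()          # change in firing rate (spikes/s)
polarity = np.sign(magnitude)                          # +1 excitation, -1 suppression
above = response > baseline.mean() + 2 * baseline.std()
duration_s = above.sum() * bin_s                       # time spent beyond 2 SD of baseline

print(f"magnitude = {magnitude:.1f} sp/s, polarity = {polarity:+.0f}, "
      f"duration = {duration_s:.2f} s")
```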