1
Breitinger E, Pokorny L, Biermann L, Jarczok TA, Dundon NM, Roessner V, Bender S. What makes somatosensory short-term memory maintenance effective? An EEG study comparing contralateral delay activity between sighted participants and participants who are blind. Neuroimage 2022; 259:119407. PMID: 35752414. DOI: 10.1016/j.neuroimage.2022.119407.
Abstract
Somatosensory short-term memory is essential for object recognition, sensorimotor learning, and, especially, Braille reading for people who are blind. This study examined how visual sensory deprivation and a compensatory focus on somatosensory information influence memory processes in this domain. We measured slow cortical negativity developing during short-term tactile memory maintenance (tactile contralateral delay activity, tCDA) in frontal and somatosensory areas while a sample of 24 sighted participants and 22 participants who are blind completed a tactile change-detection task in which varying loads of Braille pin patterns served as stimuli. Auditory cues, appearing at varying latencies between sample arrays, could be used to reduce memory demands during maintenance. Participants who are blind (trained Braille readers) outperformed sighted participants behaviorally. In addition, while task-related frontal activation featured in both groups, participants who are blind uniquely showed higher tCDA amplitudes specifically over somatosensory areas. The site specificity of this component's functional relevance in short-term memory maintenance was further supported by somatosensory tCDA amplitudes correlating with behavioral performance across the whole sample and showing sensitivity to varying memory load. The results substantiate sensory recruitment models and provide new insights into the effects of visual sensory deprivation on tactile processing. Between-group differences in the interplay between frontal and somatosensory areas during somatosensory maintenance also suggest that efficient maintenance of complex tactile stimuli in short-term memory is primarily facilitated by lateralized activity in somatosensory cortex.
Affiliation(s)
- Eva Breitinger
- Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
- Lena Pokorny
- Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
- Lea Biermann
- Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
- Tomasz Antoni Jarczok
- Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany; Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital Frankfurt, Germany; Department of Child and Adolescent Psychiatry and Psychotherapy, KJF Klinik Josefinum, Augsburg, Germany
- Neil M Dundon
- Department of Child and Adolescent Psychiatry, Psychotherapy, and Psychosomatics, University of Freiburg, Germany; Department of Psychological and Brain Sciences, University of California, Santa Barbara, CA, USA
- Veit Roessner
- Department of Child and Adolescent Psychiatry and Psychotherapy, Technische Universität Dresden, Faculty of Medicine, University Hospital C. G. Carus, Germany
- Stephan Bender
- Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
2
Three-Dimensional Printing Model Enhances Craniofacial Trauma Teaching by Improving Morphologic and Biomechanical Understanding: A Randomized Controlled Study. Plast Reconstr Surg 2022; 149:475e-484e. PMID: 35196687. DOI: 10.1097/prs.0000000000008869.
Abstract
BACKGROUND Teaching about craniofacial trauma is challenging given the complexity of craniofacial anatomy and the need for good spatial representation skills. Three-dimensional printing seems to be an appropriate educational material to address these problems. In this study, the authors conducted a randomized controlled trial whose main objective was to compare the performance of undergraduate medical students in an examination according to the teaching support used: three-dimensionally printed models versus two-dimensional pictures. METHODS All participants were randomly assigned to one of two groups using a random number table: the three-dimensionally printed support group (three-dimensional group) or the two-dimensionally displayed support group (two-dimensional group). All participants completed a multiple-choice questionnaire on facial traumatology (first, a zygomatic bone fracture; then, a double mandible fracture). Sex and potential confounding factors were evaluated. RESULTS Four hundred thirty-two fifth-year undergraduate medical students were enrolled in this study. Two hundred six students were allocated to the three-dimensional group, and 226 were allocated to the two-dimensional group. The three-dimensionally printed model was considered a better teaching material than the two-dimensional support. The global mean score was 2.36 in the three-dimensional group versus 1.99 in the two-dimensional group (p = 0.008). Three-dimensionally printed models also provided a better understanding of biomechanical aspects (p = 0.015). Participants in both groups exhibited similar previous educational achievements and visuospatial skills. CONCLUSIONS This prospective, randomized, controlled educational trial demonstrated that incorporation of three-dimensionally printed models improves medical students' understanding. It reinforces previous studies highlighting the academic benefits of three-dimensionally printed models, particularly for understanding complex structures.
3
Cheng J, Yang Z, Overstreet CK, Keefer E. Fascicle-Specific Targeting of Longitudinal Intrafascicular Electrodes for Motor and Sensory Restoration in Upper-Limb Amputees. Hand Clin 2021; 37:401-414. PMID: 34253313. DOI: 10.1016/j.hcl.2021.04.004.
Abstract
Multichannel longitudinal intrafascicular electrode (LIFE) interfaces provide an optimized balance of invasiveness and stability for chronic sensory stimulation and motor recording/decoding of peripheral nerve signals. Using a fascicle-specific targeting (FAST)-LIFE approach, in which electrodes are individually placed within discrete sensory- and motor-related fascicular subdivisions of the residual ulnar and/or median nerves in an amputated upper limb, FAST-LIFE interfacing can provide discernment of motor intent for individual digit control of a robotic hand, and restoration of touch- and movement-related sensory feedback. The authors describe their findings from clinical studies of FAST-LIFE interfacing of the residual upper limb in six human amputees.
Affiliation(s)
- Jonathan Cheng
- Department of Plastic Surgery, University of Texas Southwestern Medical Center, 1801 Inwood Road, Dallas, TX 75390, USA
- Zhi Yang
- Department of Biomedical Engineering, University of Minnesota, Nils Hasselmo Hall, Room 6-120, 312 Church Street Southeast, Minneapolis, MN 55455, USA
- Edward Keefer
- Nerves Incorporated, P.O. Box 141295, Dallas, TX 75214, USA
4
Versteeg C, Chowdhury RH, Miller LE. Cuneate nucleus: the somatosensory gateway to the brain. Curr Opin Physiol 2021; 20:206-215. PMID: 33869911. DOI: 10.1016/j.cophys.2021.02.004.
Abstract
Much remains unknown about the transformation of proprioceptive afferent input from the periphery to the cortex. Until recently, the only recordings from neurons in the cuneate nucleus (CN) were from anesthetized animals. We are beginning to learn more about how the sense of proprioception is transformed as it propagates centrally. Recent recordings from microelectrode arrays chronically implanted in CN have revealed that CN neurons with muscle-like properties have a greater sensitivity to active reaching movements than to passive limb displacement, and we find that these neurons have receptive fields that resemble single muscles. In this review, we focus on the varied uses of proprioceptive input and the possible role of CN in processing this information.
Affiliation(s)
- Christopher Versteeg
- Department of Biomedical Engineering, McCormick School of Engineering, Northwestern University, Evanston, IL, USA
- Raeed H Chowdhury
- Department of Bioengineering, Swanson School of Engineering, University of Pittsburgh, Pittsburgh, PA, USA
- Lee E Miller
- Department of Biomedical Engineering, McCormick School of Engineering, Northwestern University, Evanston, IL, USA; Department of Physiology, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA; Department of Physical Medicine and Rehabilitation, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA; Shirley Ryan AbilityLab, Chicago, IL, USA
5
Quantifying the alignment error and the effect of incomplete somatosensory feedback on motor performance in a virtual brain-computer-interface setup. Sci Rep 2021; 11:4614. PMID: 33633302. PMCID: PMC7907076. DOI: 10.1038/s41598-021-84288-5.
Abstract
Invasive brain-computer interfaces (BCIs) aim to improve the quality of life of severely paralyzed patients (e.g. tetraplegics) by using decoded movement intentions to let them interact with robotic limbs. We argue that the performance in controlling an end-effector using a BCI depends on three major factors: decoding error, missing somatosensory feedback, and alignment error caused by translation and/or rotation of the end-effector relative to the real or perceived body. Using a virtual reality (VR) model of an ideal BCI decoder with healthy participants, we found that a significant performance loss might be attributed solely to the alignment error. We used a shape-drawing task to investigate and quantify the effects of robot arm misalignment on motor performance independently of the other error sources. We found that a 90° rotation of the robot arm relative to the participant led to the worst performance, while we did not find a significant difference between a 45° rotation and no rotation. Additionally, we compared a group of subjects with indirect haptic feedback to a group without it to investigate the feedback error. In the group without feedback, we found a significant difference in performance only when no rotation was applied to the robot arm, supporting the view that some form of haptic feedback is another important factor to be considered in BCI control.
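The alignment error described in this abstract is a rigid-body transform between intended and displayed movement. A minimal geometric sketch (illustrative only, not code from the paper) of how a 90° end-effector rotation remaps an intended movement direction:

```python
import math

def rotate(point, angle_deg):
    """Rotate a 2D point about the origin (counterclockwise)."""
    a = math.radians(angle_deg)
    x, y = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

# A rightward hand movement, as rendered by an end-effector
# rotated 90 degrees relative to the participant:
intended = (1.0, 0.0)
displayed = rotate(intended, 90)  # rendered as an upward movement
```

Under such a misalignment, the participant must mentally invert the rotation on every movement, which is one plausible account of the performance cost the study measures.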
6
Cámara C, López-Moliner J, Brenner E, de la Malla C. Looking away from a moving target does not disrupt the way in which the movement toward the target is guided. J Vis 2021; 20:5. PMID: 32407436. PMCID: PMC7409596. DOI: 10.1167/jov.20.5.5.
Abstract
People usually follow a moving object with their gaze if they intend to interact with it. What would happen if they did not? We recorded eye and finger movements while participants moved a cursor toward a moving target. An unpredictable delay in updating the position of the cursor on the basis of that of the invisible finger made it essential to use visual information to guide the finger's ongoing movement. Decreasing the contrast between the cursor and the background from trial to trial made it difficult to see the cursor without looking at it. In separate experiments, either participants were free to hit the target anywhere along its trajectory or they had to move along a specified path. In the two experiments, participants tracked the cursor rather than the target with their gaze on 13% and 32% of the trials, respectively. They hit fewer targets when the contrast was low or a path was imposed. Not looking at the target did not disrupt the visual guidance that was required to deal with the delays that we imposed. Our results suggest that peripheral vision can be used to guide one item to another, irrespective of which item one is looking at.
7
Callier T, Suresh AK, Bensmaia SJ. Neural Coding of Contact Events in Somatosensory Cortex. Cereb Cortex 2020; 29:4613-4627. PMID: 30668644. DOI: 10.1093/cercor/bhy337.
Abstract
Manual interactions with objects require precise and rapid feedback about contact events. These tactile signals are integrated with motor plans throughout the neuraxis to achieve dexterous object manipulation. To better understand the role of somatosensory cortex in interactions with objects, we measured, using chronically implanted arrays of electrodes, the responses of populations of somatosensory neurons to skin indentations designed to simulate the initiation, maintenance, and termination of contact with an object. First, we find that the responses of somatosensory neurons to contact onset and offset dwarf their responses to maintenance of contact. Second, we show that these responses rapidly and reliably encode features of the simulated contact events (their timing, location, and strength) and can account for the animals' performance in an amplitude discrimination task. Third, we demonstrate that the spatiotemporal dynamics of the population response in cortex mirror those of the population response in the nerves. We conclude that the responses of populations of somatosensory neurons are well suited to encode contact transients and are consistent with a role of somatosensory cortex in signaling transitions between task subgoals.
Affiliation(s)
- Thierri Callier
- Committee on Computational Neuroscience, University of Chicago, Chicago, IL, USA
- Aneesha K Suresh
- Committee on Computational Neuroscience, University of Chicago, Chicago, IL, USA
- Sliman J Bensmaia
- Committee on Computational Neuroscience, University of Chicago, Chicago, IL, USA; Department of Organismal Biology and Anatomy, University of Chicago, Chicago, IL, USA
8
Reid S, Shapiro L, Louw G. How Haptics and Drawing Enhance the Learning of Anatomy. Anat Sci Educ 2019; 12:164-172. PMID: 30107081. DOI: 10.1002/ase.1807.
Abstract
Students' engagement with two-dimensional (2D) representations of anatomy, as opposed to three-dimensional (3D) representations such as dissection, is significant in terms of the depth of their comprehension. This qualitative study aimed to understand how students learn anatomy using observational and drawing activities that include touch (haptics). Five volunteer second-year medical students at the University of Cape Town participated in a six-day educational intervention in which a novel "haptico-visual observation and drawing" (HVOD) method was employed. Data were collected through individual interviews as well as a focus group discussion. The HVOD method was successfully applied by all the participants, who reported an improvement in their cognitive understanding and memorization of the 3D form of the anatomical part. All five participants described the development of a "mental picture" of the object as central to "deep learning." The use of the haptic senses, coupled with the simultaneous act of drawing, recruited additional sources of information that the participants reported enabled better memorization. We postulate that the more sources of information about an object are available, the greater the degree of complexity that can be appreciated, and therefore the more clearly the object can be captured and memorized. The inclusion of haptics has implications for cadaveric dissection versus non-cadaveric forms of learning. This study was limited by its sample size as well as the bias and position of the researchers, but the sample of five produced sufficient data to generate a conceptual model and hypothesis.
Affiliation(s)
- Stephen Reid
- Primary Health Care Directorate, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
- Leonard Shapiro
- Primary Health Care Directorate, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
- Graham Louw
- Division of Clinical Anatomy and Biological Anthropology, Department of Human Biology, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
9
Delhaye BP, Long KH, Bensmaia SJ. Neural Basis of Touch and Proprioception in Primate Cortex. Compr Physiol 2018; 8:1575-1602. PMID: 30215864. PMCID: PMC6330897. DOI: 10.1002/cphy.c170033.
Abstract
The sense of proprioception allows us to keep track of our limb posture and movements, and the sense of touch provides us with information about objects with which we come into contact. In both senses, mechanoreceptors convert the deformation of tissues (skin, muscles, tendons, ligaments, or joints) into neural signals. Tactile and proprioceptive signals are then relayed by the peripheral nerves to the central nervous system, where they are processed to give rise to percepts of objects and of the state of our body. In this review, we first examine briefly the receptors that mediate touch and proprioception, their associated nerve fibers, and the pathways they follow to the cerebral cortex. We then provide an overview of the different cortical areas that process tactile and proprioceptive information. Next, we discuss how various features of objects (their shape, motion, and texture, for example) are encoded in the various cortical fields, and the susceptibility of these neural codes to attention and other forms of higher-order modulation. Finally, we summarize recent efforts to restore the senses of touch and proprioception by electrically stimulating somatosensory cortex.
Affiliation(s)
- Benoit P Delhaye
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, USA
- Katie H Long
- Committee on Computational Neuroscience, University of Chicago, Chicago, USA
- Sliman J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, USA; Committee on Computational Neuroscience, University of Chicago, Chicago, USA
10
Liu J, Ando H. Response Modality vs. Target Modality: Sensory Transformations and Comparisons in Cross-modal Slant Matching Tasks. Sci Rep 2018; 8:11068. PMID: 30038316. PMCID: PMC6056512. DOI: 10.1038/s41598-018-29375-w.
Abstract
Humans constantly combine multi-sensory spatial information to successfully interact with objects in peripersonal space. Previous studies suggest that sensory inputs of different modalities are encoded in different reference frames. In cross-modal tasks where the target and response modalities are different, it is unclear which reference frame these multiple sensory signals are transformed to for comparison. The current study used a slant perception and parallelity paradigm to explore this issue. Participants perceived (either visually or haptically) the slant of a reference board and were asked to either adjust an invisible test board by hand manipulation or to adjust a visible test board through verbal instructions to be physically parallel to the reference board. We examined the patterns of constant error and variability of unimodal and cross-modal tasks with various reference slant angles at different reference/test locations. The results revealed that rather than a mixture of the patterns of unimodal conditions, the pattern in cross-modal conditions depended almost entirely on the response modality and was not substantially affected by the target modality. Deviations in haptic response conditions could be predicted by the locations of the reference and test board, whereas the reference slant angle was an important predictor in visual response conditions.
Affiliation(s)
- Juan Liu
- Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology (NICT) and Osaka University, Osaka, Japan
- Hiroshi Ando
- Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology (NICT) and Osaka University, Osaka, Japan
11
Mikula L, Sahnoun S, Pisella L, Blohm G, Khan AZ. Vibrotactile information improves proprioceptive reaching target localization. PLoS One 2018; 13:e0199627. PMID: 29979697. PMCID: PMC6034815. DOI: 10.1371/journal.pone.0199627.
Abstract
When pointing to parts of our own body (e.g., the opposite index finger), the position of the target is derived from proprioceptive signals. Consistent with the principles of multisensory integration, it has been found that participants better matched the position of their index finger when they also had visual cues about its location. Unlike vision, touch may not provide additional information about finger position in space, since fingertip tactile information theoretically remains the same irrespective of the postural configuration of the upper limb. However, since tactile and proprioceptive information are ultimately coded within the same population of posterior parietal neurons within high-level spatial representations, we nevertheless hypothesized that additional tactile information could benefit the processing of proprioceptive signals. To investigate the influence of tactile information on proprioceptive localization, we asked 19 participants to reach with the right hand towards the opposite unseen index finger (proprioceptive target). Vibrotactile stimuli were applied to the target index finger prior to movement execution. We found that participants made smaller errors and more consistent reaches following tactile stimulation. These results demonstrate that transient touch provided at the proprioceptive target improves subsequent reaching precision and accuracy. Such improvement was not observed when tactile stimulation was delivered to a distinct body part (the shoulder). This suggests a specific spatial integration of touch and proprioception at the level of high-level cortical body representations, resulting in touch improving position sense.
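The "principles of multisensory integration" this abstract invokes are commonly formalized as reliability-weighted (maximum-likelihood) averaging of independent cues, which predicts exactly the observed gain in precision when a second cue is added. A minimal illustrative sketch, with hypothetical numbers rather than data from the paper:

```python
def integrate_cues(estimates, variances):
    """Maximum-likelihood fusion of independent Gaussian cues:
    each cue is weighted by its reliability (inverse variance)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # never worse than the best single cue
    return fused, fused_variance

# Proprioceptive estimate of finger position (cm) is noisier than
# a tactile cue at the same location (hypothetical values):
pos, var = integrate_cues([10.0, 12.0], [4.0, 1.0])
```

The fused variance (0.8 here) is smaller than either cue's alone, which is the signature improvement in reach consistency the study reports after tactile stimulation.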
Affiliation(s)
- Laura Mikula
- Centre de Recherche en Neurosciences de Lyon (CRNL), ImpAct team, Inserm U1028, CNRS UMR 5292, University Claude Bernard Lyon 1, Bron, France
- School of Optometry, University of Montreal, Montréal, Québec, Canada
- Sofia Sahnoun
- School of Optometry, University of Montreal, Montréal, Québec, Canada
- Laure Pisella
- Centre de Recherche en Neurosciences de Lyon (CRNL), ImpAct team, Inserm U1028, CNRS UMR 5292, University Claude Bernard Lyon 1, Bron, France
- Gunnar Blohm
- Centre for Neuroscience Studies, Queen's University, Kingston, Ontario, Canada
- Aarlenne Zein Khan
- School of Optometry, University of Montreal, Montréal, Québec, Canada
12
Ortiz-Catalan M. Restoration of somatosensory perception via electrical stimulation of peripheral nerves. Clin Neurophysiol 2018; 129:845-846. PMID: 29395847. DOI: 10.1016/j.clinph.2018.01.008.
Affiliation(s)
- Max Ortiz-Catalan
- Chalmers University of Technology, Department of Electrical Engineering, Biomechatronics and Neurorehabilitation Laboratory, Hörsalsvägen 11, SE-41296 Gothenburg, Sweden; Integrum AB, Krokslätts Fabriker 50, SE-43137 Mölndal, Sweden
13
Yau JM, Kim SS, Thakur PH, Bensmaia SJ. Feeling form: the neural basis of haptic shape perception. J Neurophysiol 2016; 115:631-642. PMID: 26581869. PMCID: PMC4752307. DOI: 10.1152/jn.00598.2015.
Abstract
The tactile perception of the shape of objects critically guides our ability to interact with them. In this review, we describe how shape information is processed as it ascends the somatosensory neuraxis of primates. At the somatosensory periphery, spatial form is represented in the spatial patterns of activation evoked across populations of mechanoreceptive afferents. In the cerebral cortex, neurons respond selectively to particular spatial features, like orientation and curvature. While feature selectivity of neurons in the earlier processing stages can be understood in terms of linear receptive field models, higher order somatosensory neurons exhibit nonlinear response properties that result in tuning for more complex geometrical features. In fact, tactile shape processing bears remarkable analogies to its visual counterpart and the two may rely on shared neural circuitry. Furthermore, one of the unique aspects of primate somatosensation is that it contains a deformable sensory sheet. Because the relative positions of cutaneous mechanoreceptors depend on the conformation of the hand, the haptic perception of three-dimensional objects requires the integration of cutaneous and proprioceptive signals, an integration that is observed throughout somatosensory cortex.
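As a concrete (illustrative, not taken from the review) example of the linear receptive-field models mentioned above: a neuron's predicted response is the inner product of the stimulus pattern with a receptive-field kernel, often followed by rectification to keep firing rates non-negative.

```python
def linear_rf_response(stimulus, kernel):
    """Linear receptive-field model: inner product of the stimulus
    with the RF kernel, half-wave rectified."""
    drive = sum(s * k for s, k in zip(stimulus, kernel))
    return max(0.0, drive)

# A kernel with an excitatory center and inhibitory flanks responds
# to a localized bar of skin indentation but not to uniform pressure:
kernel = [-1.0, 2.0, -1.0]
bar = [0.0, 1.0, 0.0]        # localized indentation
uniform = [1.0, 1.0, 1.0]    # spatially uniform indentation

r_bar = linear_rf_response(bar, kernel)          # 2.0
r_uniform = linear_rf_response(uniform, kernel)  # 0.0
```

Such linear models account for early-stage selectivity (e.g. orientation), whereas the nonlinear tuning for curvature and more complex geometry described in the review cannot be captured by a single kernel of this form.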
Affiliation(s)
- Jeffrey M Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas
- Sung Soo Kim
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia
- Sliman J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois
14
Hellman RB, Chang E, Tanner J, Helms Tillery SI, Santos VJ. A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss. Front Hum Neurosci 2015; 9:26. PMID: 25745391. PMCID: PMC4333840. DOI: 10.3389/fnhum.2015.00026.
Abstract
Many upper limb amputees experience an incessant, post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain, and increased prosthesis use due to improved functionality and reduced cognitive burden.
Affiliation(s)
- Randall B Hellman
- Biomechatronics Laboratory, Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, AZ, USA; Biomechatronics Laboratory, Department of Mechanical and Aerospace Engineering, University of California Los Angeles, Los Angeles, CA, USA
- Eric Chang
- Biomechatronics Laboratory, Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, AZ, USA
- Justin Tanner
- SensoriMotor Research Group, School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, USA
- Stephen I Helms Tillery
- SensoriMotor Research Group, School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ, USA
- Veronica J Santos
- Biomechatronics Laboratory, Department of Mechanical and Aerospace Engineering, Arizona State University, Tempe, AZ, USA; Biomechatronics Laboratory, Department of Mechanical and Aerospace Engineering, University of California Los Angeles, Los Angeles, CA, USA
15
Alnajjar F, Itkonen M, Berenz V, Tournier M, Nagai C, Shimoda S. Sensory synergy as environmental input integration. Front Neurosci 2015; 8:436. [PMID: 25628523] [PMCID: PMC4292368] [DOI: 10.3389/fnins.2014.00436] [Received: 03/26/2014] [Accepted: 12/11/2014]
Abstract
The development of a method to feed proper environmental inputs back to the central nervous system (CNS) remains one of the challenges in achieving natural movement when part of the body is replaced with an artificial device. Muscle synergies are widely accepted as a biologically plausible interpretation of the neural dynamics between the CNS and the muscular system. Yet the sensorineural dynamics of environmental feedback to the CNS have not been investigated in detail. In this study, we address this issue by exploring the concept of sensory synergy. In contrast to muscle synergy, we hypothesize that sensory synergy plays an essential role in integrating the overall environmental inputs to provide low-dimensional information to the CNS. We assume that sensory synergies and muscle synergies communicate using these low-dimensional signals. To examine our hypothesis, we conducted posture control experiments involving lateral disturbance with nine healthy participants. Proprioceptive information, represented by changes in muscle lengths, was estimated using the musculoskeletal modeling software SIMM. Changes in muscle lengths were then used to compute sensory synergies. The experimental results indicate that the environmental inputs were translated into two-dimensional signals and used to move the upper limb to the desired position immediately after the lateral disturbance. Participants who showed high skill in posture control were likely to have a strong correlation between sensory and muscle signaling, as well as high coordination among the utilized sensory synergies. These results suggest the importance of integrating environmental inputs into suitable low-dimensional signals before providing them to the CNS. This mechanism should be essential when designing a prosthesis's sensory system to keep the controller simple.
Affiliation(s)
- Fady Alnajjar
- Intelligent Behavior Control Unit, Brain Science Institute-TOYOTA Collaboration Center of RIKEN, Nagoya, Japan
16
Honeine JL, Schieppati M. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions. Front Syst Neurosci 2014; 8:190. [PMID: 25339872] [PMCID: PMC4186340] [DOI: 10.3389/fnsys.2014.00190] [Received: 07/21/2014] [Accepted: 09/17/2014]
Abstract
Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment, or from tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transitions, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from an allocentric to an egocentric reference, or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of the integration of posture-stabilizing information, and of the respective sensorimotor time-intervals, while allowing or occluding vision, or adding or subtracting tactile information. These intervals are short, on the order of 1–2 s across postural conditions, modalities, and deliberate or passive shifts. They are slightly longer for haptic than for visual shifts, slightly shorter on withdrawal than on addition of a stabilizing input, and shorter for deliberate than for unexpected shifts. The delays are shortest (for haptic shifts) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion-training devices.
Affiliation(s)
- Jean-Louis Honeine
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy
| | - Marco Schieppati
- Department of Public Health, Experimental and Forensic Medicine, University of Pavia, Pavia, Italy; Centro Studi Attività Motorie (CSAM), Fondazione Salvatore Maugeri (IRCCS), Scientific Institute of Pavia, Pavia, Italy
17
Overstreet CK, Klein JD, Helms Tillery SI. Computational modeling of direct neuronal recruitment during intracortical microstimulation in somatosensory cortex. J Neural Eng 2013; 10:066016. [PMID: 24280531] [DOI: 10.1088/1741-2560/10/6/066016]
Abstract
Objective
Electrical stimulation of cortical tissue could be used to deliver sensory information as part of a neuroprosthetic device, but current control of the location, resolution, quality, and intensity of sensations elicited by intracortical microstimulation (ICMS) remains inadequate for this purpose. One major obstacle to resolving this problem is the poor understanding of the neural activity induced by ICMS. Even with new imaging methods, quantifying the activity of many individual neurons within cortex is difficult.
Approach
We used computational modeling to examine the response of somatosensory cortex to ICMS. We modeled the axonal arbors of eight distinct morphologies of interneurons and seven types of pyramidal neurons found in somatosensory cortex and identified their responses to extracellular stimulation. We then combined these axonal elements to form a multi-layered slab of simulated cortex and investigated the patterns of neural activity directly induced by ICMS. Specifically, we estimated the number, location, and variety of neurons directly recruited by stimulation on a single penetrating microelectrode.
Main results
The population of neurons activated by ICMS depended on both stimulation strength and the depth of the electrode within cortex. Strikingly, stimulation recruited interneurons and pyramidal neurons in very different patterns. Interneurons were primarily recruited within a dense, continuous region around the electrode, while pyramidal neurons were recruited sparsely, both near the electrode and up to several millimeters away. Thus ICMS can lead to an unexpectedly complex spatial distribution of firing neurons.
Significance
These results lend new insights into the complexity and range of neural activity that can be induced by ICMS. This work also suggests mechanisms potentially responsible for the inconsistency and unnatural quality of sensations initiated by ICMS. Understanding these mechanisms will aid in the design of stimulation that can be used to generate effective sensory feedback for neuroprosthetic devices.
Affiliation(s)
- C K Overstreet
- School of Biological and Health Systems Engineering, Arizona State University, Tempe, AZ 85287, USA
18
Expanding the primate body schema in sensorimotor cortex by virtual touches of an avatar. Proc Natl Acad Sci U S A 2013; 110:15121-6. [PMID: 23980141] [DOI: 10.1073/pnas.1308459110]
Abstract
The brain representation of the body, called the body schema, is susceptible to plasticity. For instance, subjects experiencing a rubber hand illusion develop a sense of ownership of a mannequin hand when they view it being touched while tactile stimuli are simultaneously applied to their own hand. Here, the cortical basis of such an embodiment was investigated through concurrent recordings from primary somatosensory (i.e., S1) and motor (i.e., M1) cortical neuronal ensembles while two monkeys observed an avatar arm being touched by a virtual ball. Following a period when virtual touches occurred synchronously with physical brushes of the monkeys' arms, neurons in S1 and M1 started to respond to virtual touches applied alone. Responses to virtual touch occurred 50 to 70 ms later than to physical touch, consistent with the involvement of polysynaptic pathways linking the visual cortex to S1 and M1. We propose that S1 and M1 contribute to the rubber hand illusion and that, by taking advantage of plasticity in these areas, patients may assimilate neuroprosthetic limbs as parts of their body schema.
19
Song W, Francis JT. Tactile information processing in primate hand somatosensory cortex (S1) during passive arm movement. J Neurophysiol 2013; 110:2061-70. [PMID: 23945783] [DOI: 10.1152/jn.00893.2012]
Abstract
Motor output largely depends on sensory input, which can in turn be affected by action. To further our understanding of how tactile information is processed in the primary somatosensory cortex (S1) in dynamic environments, we recorded neural responses to tactile stimulation of the hand in three awake monkeys during passive arm/hand movement and at rest. We found that neurons generally responded to tactile stimulation under both conditions and were modulated by movement: with a higher baseline firing rate, a suppressed peak rate, and a smaller dynamic range during passive movement than during rest, while the area under the response curve was stable across the two states. Using an information-theoretic method, the mutual information between tactile stimulation and neural responses was quantified with rate-coding and spatial-coding models under the two conditions. The two potential encoding models showed different contributions depending on behavioral context. Tactile information encoded with rate coding from individual units was lower than with spatial coding of unit pairs, especially during movement; however, spatial coding carried redundant information between unit pairs. Passive movement regulated the mutual information, and such regulation might play different roles depending on the encoding strategies used. The underlying mechanisms of our observations most likely reflect a bottom-up process, in which neurons in S1 were regulated through the activation of peripheral tactile/proprioceptive receptors and the interactions between these different types of information.
Affiliation(s)
- Weiguo Song
- Department of Physiology and Pharmacology, State University of New York Downstate Medical Center, Brooklyn, New York
20
Konczak J, Abbruzzese G. Focal dystonia in musicians: linking motor symptoms to somatosensory dysfunction. Front Hum Neurosci 2013; 7:297. [PMID: 23805090] [PMCID: PMC3691509] [DOI: 10.3389/fnhum.2013.00297] [Received: 03/15/2013] [Accepted: 06/05/2013]
Abstract
Musician's dystonia (MD) is a neurological motor disorder characterized by involuntary contractions of the muscles involved in playing a musical instrument. It is task-specific and initially impairs only the voluntary control of highly practiced musical motor skills. MD can lead to a severe decrement in a musician's ability to perform. While the etiology and the neurological pathomechanism of the disease remain unknown, it is known that MD, like other forms of focal dystonia, is associated with somatosensory deficits, specifically a decreased precision of tactile and proprioceptive perception. The sensory component of the disease also becomes evident in patients' use of “sensory tricks”, such as touching dystonic muscles to alleviate motor symptoms. The central premise of this paper is that the motor symptoms of MD have a somatosensory origin and are not fully explained as a problem of motor execution. We outline how altered proprioceptive feedback ultimately leads to a loss of voluntary motor control and propose two scenarios that explain why sensory tricks are effective: the sensorimotor system either recruits neural resources normally involved in tactile-proprioceptive (sensory) integration, or utilizes a fully functioning motor efference copy mechanism to align experienced with expected sensory feedback. We argue that an enhanced understanding of how a primary sensory deficit interacts with mechanisms of sensorimotor integration in MD provides helpful insights for the design of more effective behavioral therapies.
Affiliation(s)
- Jürgen Konczak
- Human Sensorimotor Control Laboratory, Center for Clinical Movement Science, School of Kinesiology, University of Minnesota, Minneapolis, MN, USA
21
Sutherland GR, Lama S, Gan LS, Wolfsberger S, Zareinia K. Merging machines with microsurgery: clinical experience with neuroArm. J Neurosurg 2013; 118:521-9. [DOI: 10.3171/2012.11.jns12877]
Abstract
Object
It has been over a decade since the introduction of the da Vinci Surgical System into surgery. Since then, technology has been advancing at an exponential rate, and newer surgical robots are becoming increasingly sophisticated, which could greatly impact the performance of surgery. NeuroArm is one such robotic system.
Methods
Clinical integration of neuroArm, an MR-compatible image-guided robot, into surgical procedures was developed over a prospective series of 35 cases with varying pathologies.
Results
Only 1 adverse event was encountered in the first 35 neuroArm cases, with no patient injury. The adverse event was uncontrolled motion of the left neuroArm manipulator, which was corrected through a rigorous safety review procedure. Surgeons used a graded approach to introducing neuroArm into surgery, with routine dissection of the tumor-brain interface occurring over the last 15 cases. The use of neuroArm for routine dissection shows that robotic technology can be successfully integrated into microsurgery. Karnofsky performance status scores were significantly improved postoperatively and at 12-week follow-up.
Conclusions
Surgical robots have the potential to improve surgical precision and accuracy through motion scaling and tremor filters, although human surgeons currently possess superior speed and dexterity. Additionally, neuroArm's workstation has positive implications for technology management and surgical education. NeuroArm is a step toward a future in which a variety of machines are merged with medicine.
22
Rincon-Gonzalez L, Naufel SN, Santos VJ, Helms Tillery S. Interactions Between Tactile and Proprioceptive Representations in Haptics. J Mot Behav 2012; 44:391-401. [DOI: 10.1080/00222895.2012.746281]
23
Rincon-Gonzalez L, Buneo CA, Helms Tillery SI. The proprioceptive map of the arm is systematic and stable, but idiosyncratic. PLoS One 2011; 6:e25214. [PMID: 22110578] [PMCID: PMC3217916] [DOI: 10.1371/journal.pone.0025214] [Received: 05/04/2011] [Accepted: 08/29/2011]
Abstract
Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors, but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experiences.
Affiliation(s)
- Liliana Rincon-Gonzalez
- Graduate Program in Biomedical Engineering, School of Biological and Health Systems Engineering, and Department of Psychology, Arizona State University, Tempe, Arizona, United States of America
| | - Christopher A. Buneo
- Graduate Program in Biomedical Engineering, School of Biological and Health Systems Engineering, and Department of Psychology, Arizona State University, Tempe, Arizona, United States of America
| | - Stephen I. Helms Tillery
- Graduate Program in Biomedical Engineering, School of Biological and Health Systems Engineering, and Department of Psychology, Arizona State University, Tempe, Arizona, United States of America
24
Brain training: cortical plasticity and afferent feedback in brain-machine interface systems. IEEE Trans Neural Syst Rehabil Eng 2011; 19:465-7. [PMID: 21947530] [DOI: 10.1109/tnsre.2011.2168989]