51. Mihelčič M, Podlesek A. The influence of proprioception on reading performance. Clin Exp Optom 2016; 100:138-143. [PMID: 27561230] [DOI: 10.1111/cxo.12428]
Abstract
BACKGROUND Visual ergonomics has an impact on visual performance in reading. Based on the assumption that reading from an LCD screen held in the hands provides more accurate information about the distance to the object than reading from a screen that has no contact with the body, this study assessed the influence of proprioceptive input on reading speed and on accommodative and pupillary responses and their micro-oscillations. METHODS Participants (n = 47; all emmetropic, non-amblyopic) were asked to silently read two-digit numerals presented on a 10-inch LCD screen at a distance of 40 cm. In one condition, the participants held the screen in their hands; in the other, the screen was placed on a holder and there was no body contact with the participant. The number of numerals read in 90 seconds was recorded. Accommodative and pupillary responses were measured with the Power Refractor 3 at a 50 Hz measurement rate. RESULTS The number of numerals read was greater in the condition with proprioceptive input than in the condition without contact. The mean pupil size and the average accommodative response were similar in the two conditions. The rate of change in pupil size showed a steeper decline in the condition without the proprioceptive input than in the condition with it. The increase in the lag of accommodation over time was similar in both conditions, as were the pupillary and accommodative micro-oscillations. CONCLUSIONS When the screen was held in the hands, reading of numerals was faster and pupil size changed less over the 90-second test interval. This indicates that proprioception might influence some aspects of visual performance.
Affiliation(s)
- Matjaž Mihelčič
- Optometry Department, University of Velika Gorica, Velika Gorica, Croatia
- Anja Podlesek
- Department of Psychology, University of Ljubljana, Ljubljana, Slovenia
52. Sathian K. Analysis of haptic information in the cerebral cortex. J Neurophysiol 2016; 116:1795-1806. [PMID: 27440247] [DOI: 10.1152/jn.00546.2015]
Abstract
Haptic sensing of objects acquires information about a number of properties. This review summarizes current understanding about how these properties are processed in the cerebral cortex of macaques and humans. Nonnoxious somatosensory inputs, after initial processing in primary somatosensory cortex, are partially segregated into different pathways. A ventrally directed pathway carries information about surface texture into parietal opercular cortex and thence to medial occipital cortex. A dorsally directed pathway transmits information regarding the location of features on objects to the intraparietal sulcus and frontal eye fields. Shape processing occurs mainly in the intraparietal sulcus and lateral occipital complex, while orientation processing is distributed across primary somatosensory cortex, the parietal operculum, the anterior intraparietal sulcus, and a parieto-occipital region. For each of these properties, the respective areas outside primary somatosensory cortex also process corresponding visual information and are thus multisensory. Consistent with the distributed neural processing of haptic object properties, tactile spatial acuity depends on interaction between bottom-up tactile inputs and top-down attentional signals in a distributed neural network. Future work should clarify the roles of the various brain regions and how they interact at the network level.
Affiliation(s)
- K Sathian
- Departments of Neurology, Rehabilitation Medicine and Psychology, Emory University, Atlanta, Georgia; and Center for Visual and Neurocognitive Rehabilitation, Atlanta Department of Veterans Affairs Medical Center, Decatur, Georgia
53. Gomez-Ramirez M, Hysaj K, Niebur E. Neural mechanisms of selective attention in the somatosensory system. J Neurophysiol 2016; 116:1218-1231. [PMID: 27334956] [DOI: 10.1152/jn.00637.2015]
Abstract
Selective attention allows organisms to extract behaviorally relevant information while ignoring distracting stimuli that compete for the limited resources of their central nervous systems. Attention is highly flexible, and it can be harnessed to select information based on sensory modality, within-modality feature(s), spatial location, object identity, and/or temporal properties. In this review, we discuss the body of work devoted to understanding mechanisms of selective attention in the somatosensory system. In particular, we describe the effects of attention on tactile behavior and corresponding neural activity in somatosensory cortex. Our focus is on neural mechanisms that select tactile stimuli based on their location on the body (somatotopic-based attention) or their sensory feature (feature-based attention). We highlight parallels between selection mechanisms in touch and other sensory systems and discuss several putative neural coding schemes employed by cortical populations to signal the behavioral relevance of sensory inputs. Specifically, we contrast the advantages and disadvantages of using a gain versus a spike-spike correlation code for representing attended sensory stimuli. We favor a neural network model of tactile attention in which frontal, parietal, and subcortical areas control the somatosensory cells encoding the relevant stimulus features, enabling preferential processing throughout the somatosensory hierarchy. Our review draws on noninvasive electrophysiological and imaging data from humans as well as single-unit recordings from nonhuman primates.
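The gain-versus-correlation contrast drawn in this abstract lends itself to a toy simulation. The sketch below is purely illustrative (population size, baseline rate, and correlation values are assumptions, not values from the review): an attentional gain code raises mean firing rates, while a correlation code changes pairwise spike-count correlations at matched rates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two candidate population codes for signaling attended stimuli:
# a gain code scales firing rates; a spike-spike correlation code
# changes pairwise correlations while leaving mean rates untouched.
# All parameters are assumptions for illustration.

n_neurons, n_trials = 50, 2000
base_rate = 10.0  # mean spike count per trial

def population_counts(gain=1.0, corr=0.05):
    """Correlated Gaussian approximation of Poisson-like spike counts."""
    cov = np.full((n_neurons, n_neurons), corr) + (1 - corr) * np.eye(n_neurons)
    shared = rng.multivariate_normal(np.zeros(n_neurons), cov, size=n_trials)
    return gain * base_rate + np.sqrt(gain * base_rate) * shared

conditions = {
    "unattended":       population_counts(gain=1.0, corr=0.05),
    "gain code":        population_counts(gain=1.5, corr=0.05),  # rates up
    "correlation code": population_counts(gain=1.0, corr=0.25),  # corr up
}

for name, counts in conditions.items():
    c = np.corrcoef(counts, rowvar=False)
    mean_corr = (c.sum() - n_neurons) / (n_neurons * (n_neurons - 1))
    print(f"{name:16s} mean rate {counts.mean():5.2f}  "
          f"mean pairwise corr {mean_corr:.2f}")
```

A downstream decoder could read out either signature; what makes the two schemes experimentally distinguishable is that only the gain code changes single-neuron firing rates.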
Affiliation(s)
- Manuel Gomez-Ramirez
- Department of Neuroscience, Brown University, Providence, Rhode Island; The Zanvyl Krieger Mind/Brain Institute, The Johns Hopkins University, Baltimore, Maryland; and The Solomon H. Snyder Department of Neuroscience, The Johns Hopkins School of Medicine, Baltimore, Maryland
- Kristjana Hysaj
- The Zanvyl Krieger Mind/Brain Institute, The Johns Hopkins University, Baltimore, Maryland
- Ernst Niebur
- The Zanvyl Krieger Mind/Brain Institute, The Johns Hopkins University, Baltimore, Maryland; and The Solomon H. Snyder Department of Neuroscience, The Johns Hopkins School of Medicine, Baltimore, Maryland
54. Rosenblum LD, Dias JW, Dorsi J. The supramodal brain: implications for auditory perception. J Cogn Psychol 2016. [DOI: 10.1080/20445911.2016.1181691]
55. McIntyre S, Birznieks I, Vickery RM, Holcombe AO, Seizova-Cajic T. The tactile motion aftereffect suggests an intensive code for speed in neurons sensitive to both speed and direction of motion. J Neurophysiol 2016; 115:1703-1712. [PMID: 26823511] [PMCID: PMC4808137] [DOI: 10.1152/jn.00460.2015]
Abstract
Neurophysiological studies in primates have found that direction-sensitive neurons in the primary somatosensory cortex (SI) generally increase their response rate with increasing speed of object motion across the skin and show little evidence of speed tuning. We employed psychophysics to determine whether human perception of motion direction could be explained by features of such neurons and whether evidence can be found for a speed-tuned process. After adaptation to motion across the skin, a subsequently presented dynamic test stimulus yields an impression of motion in the opposite direction. We measured the strength of this tactile motion aftereffect (tMAE) induced with different combinations of adapting and test speeds. Distal-to-proximal or proximal-to-distal adapting motion was applied to participants' index fingers using a tactile array, after which participants reported the perceived direction of a bidirectional test stimulus. An intensive code for speed, like that observed in SI neurons, predicts greater adaptation (and a stronger tMAE) the faster the adapting speed, regardless of the test speed. In contrast, speed tuning of direction-sensitive neurons predicts the greatest tMAE when the adapting and test stimuli have matching speeds. We found that the strength of the tMAE increased monotonically with adapting speed, regardless of the test speed, showing no evidence of speed tuning. Our data are consistent with neurophysiological findings that suggest an intensive code for speed along the motion processing pathways comprising neurons sensitive both to speed and direction of motion.
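The two coding predictions contrasted here can be made concrete numerically. In the sketch below (the response functions and every parameter are assumptions for illustration, not the authors' model), aftereffect strength is taken to be proportional to the adapted population response: an intensive code keeps growing with adapting speed, whereas a speed-tuned code peaks when the adapting speed matches the test speed.

```python
import numpy as np

# Predicted tactile motion aftereffect (tMAE) strength under two
# speed-coding schemes. Parameters are illustrative assumptions.

adapt_speeds = np.array([20.0, 40.0, 80.0])  # mm/s
test_speed = 40.0                            # mm/s

def intensive_response(speed, k=0.02):
    """Intensive code: response rises monotonically with speed."""
    return 1.0 - np.exp(-k * speed)

def tuned_response(speed, preferred, sigma=0.4):
    """Speed-tuned code: Gaussian tuning on log speed."""
    return np.exp(-0.5 * (np.log(speed / preferred) / sigma) ** 2)

# Assume tMAE strength ~ response of the adapted population.
tmae_intensive = intensive_response(adapt_speeds)
# Under tuning, only adaptation of the channel serving the test matters.
tmae_tuned = tuned_response(adapt_speeds, preferred=test_speed)

for s, a, b in zip(adapt_speeds, tmae_intensive, tmae_tuned):
    print(f"adapt {s:5.1f} mm/s -> intensive {a:.2f}, tuned {b:.2f}")
# Intensive: aftereffect grows with adapting speed (the observed pattern).
# Tuned: aftereffect peaks where adapting speed equals test speed.
```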
Affiliation(s)
- S McIntyre
- School of Psychology, University of Sydney, Sydney, Australia; Neuroscience Research Australia, Sydney, Australia; Faculty of Health Sciences, University of Sydney, Sydney, Australia; MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- I Birznieks
- Neuroscience Research Australia, Sydney, Australia; MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia; School of Medical Sciences, University of New South Wales, Sydney, Australia
- R M Vickery
- Neuroscience Research Australia, Sydney, Australia; School of Medical Sciences, University of New South Wales, Sydney, Australia
- A O Holcombe
- School of Psychology, University of Sydney, Sydney, Australia
- T Seizova-Cajic
- Faculty of Health Sciences, University of Sydney, Sydney, Australia
56. Greenlee M, Frank S, Kaliuzhna M, Blanke O, Bremmer F, Churan J, Cuturi LF, MacNeilage P, Smith A. Multisensory Integration in Self Motion Perception. Multisens Res 2016. [DOI: 10.1163/22134808-00002527]
Abstract
Self-motion perception involves the integration of visual, vestibular, somatosensory, and motor signals. This article reviews findings from single-unit electrophysiology, functional and structural magnetic resonance imaging, and psychophysics to present an update on how the human and non-human primate brain integrates multisensory information to estimate one's position and motion in space. The results indicate that a network of regions in the non-human primate and human brain processes self-motion cues from the different sensory modalities.
Affiliation(s)
- Mark W. Greenlee
- Institute of Experimental Psychology, University of Regensburg, Regensburg, Germany
- Sebastian M. Frank
- Institute of Experimental Psychology, University of Regensburg, Regensburg, Germany
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH, USA
- Mariia Kaliuzhna
- Center for Neuroprosthetics, Laboratory of Cognitive Neuroscience, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland
- Olaf Blanke
- Center for Neuroprosthetics, Laboratory of Cognitive Neuroscience, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland
- Frank Bremmer
- Department of Neurophysics, University of Marburg, Marburg, Germany
- Jan Churan
- Department of Neurophysics, University of Marburg, Marburg, Germany
- Luigi F. Cuturi
- German Center for Vertigo, University Hospital of Munich, LMU, Munich, Germany
- Paul R. MacNeilage
- German Center for Vertigo, University Hospital of Munich, LMU, Munich, Germany
- Andrew T. Smith
- Department of Psychology, Royal Holloway, University of London, UK
57. Hidaka S, Teramoto W, Sugita Y. Spatiotemporal Processing in Crossmodal Interactions for Perception of the External World: A Review. Front Integr Neurosci 2015; 9:62. [PMID: 26733827] [PMCID: PMC4686600] [DOI: 10.3389/fnint.2015.00062]
Abstract
Research regarding crossmodal interactions has garnered much interest in the last few decades. A variety of studies have demonstrated that signals from different senses (vision, audition, tactile sensation, and so on) can interact perceptually in the spatial and temporal domains. Findings regarding crossmodal interactions in the spatiotemporal domain (i.e., motion processing) have also been reported, with updates in the last few years. In this review, we summarize past and recent findings on spatiotemporal processing in crossmodal interactions regarding perception of the external world. A traditional view of crossmodal interactions holds that vision is superior to audition in spatial processing, but that audition is dominant over vision in temporal processing. Similarly, vision is considered to have dominant effects over the other sensory modalities (i.e., visual capture) in spatiotemporal processing. However, recent findings demonstrate that sound can have a driving effect on visual motion perception. Moreover, studies of perceptual associative learning have reported that, after an association is established between a sound sequence without spatial information and visual motion information, the sound sequence alone can trigger visual motion perception. Other sensory information, such as motor action or smell, has exhibited similar driving effects on visual motion perception. Additionally, recent brain imaging studies demonstrate that similar activation patterns can be observed in several brain areas, including the motion processing areas, for spatiotemporal information from different sensory modalities. Based on these findings, we suggest that multimodal information may interact mutually in spatiotemporal processing of the external world and that common underlying perceptual and neural mechanisms may exist for spatiotemporal processing.
Affiliation(s)
- Souta Hidaka
- Department of Psychology, Rikkyo University, Saitama, Japan
- Wataru Teramoto
- Department of Psychology, Kumamoto University, Kumamoto, Japan
- Yoichi Sugita
- Department of Psychology, Waseda University, Tokyo, Japan
58. Markel PD. Spatial Memory for Patterns of Taps on the Fingers. IEEE Trans Haptics 2015; 8:447-453. [PMID: 26259248] [DOI: 10.1109/toh.2015.2462831]
Abstract
Ongoing development of haptic technology has the potential to provide significant improvements in safety and performance in demanding environments where vision and hearing are compromised. Research on the cognitive psychology of touch is lacking; further work in this area could help develop expectations about human performance for the refinement and implementation of haptic technology. This study examines haptic-spatial memory using a novel assessment method based on finger anatomy. In addition, evidence is presented for a serial-position effect in haptic-spatial memory that is analogous to the classic serial-position effect demonstrated in the verbal recall of word lists. Finally, haptic-spatial memory is compared with short- and long-term memory for visual-spatial tasks.
59.
Abstract
While the different sensory modalities are sensitive to different stimulus energies, they are often charged with extracting analogous information about the environment. Neural systems may thus have evolved to implement similar algorithms across modalities to extract behaviorally relevant stimulus information, leading to the notion of a canonical computation. In both vision and touch, information about motion is extracted from a spatiotemporal pattern of activation across a sensory sheet (in the retina and in the skin, respectively), a process that has been extensively studied in both modalities. In this essay, we examine the processing of motion information as it ascends the primate visual and somatosensory neuraxes and conclude that similar computations are implemented in the two sensory systems. A close look at the cortical areas that support vision and touch suggests that the brain uses similar computational strategies to handle different kinds of sensory inputs.
60. Moscatelli A, Hayward V, Wexler M, Ernst MO. Illusory Tactile Motion Perception: An Analog of the Visual Filehne Illusion. Sci Rep 2015; 5:14584. [PMID: 26412592] [PMCID: PMC4585937] [DOI: 10.1038/srep14584]
Abstract
We continually move our body and our eyes when exploring the world, causing our sensory surfaces, the skin and the retina, to move relative to external objects. In order to estimate object motion consistently, an ideal observer would transform estimates of motion acquired from the sensory surface into fixed, world-centered estimates, by taking the motion of the sensor into account. This ability is referred to as spatial constancy. Human vision does not follow this rule strictly and is therefore subject to perceptual illusions during eye movements, where immobile objects can appear to move. Here, we investigated whether one of these, the Filehne illusion, had a counterpart in touch. To this end, observers estimated the movement of a surface from tactile slip, with a moving or with a stationary finger. We found the perceived movement of the surface to be biased if the surface was sensed while moving. This effect exemplifies a failure of spatial constancy that is similar to the Filehne illusion in vision. We quantified this illusion by using a Bayesian model with a prior for stationarity, applied previously in vision. The analogy between vision and touch points to a modality-independent solution to the spatial constancy problem.
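The Bayesian model mentioned in the closing sentences can be sketched in a few lines. This is a minimal reconstruction under assumed Gaussian noise (the noise magnitudes are placeholders, not the authors' fitted parameters): perceived surface velocity is the posterior combination of a noisy world-centered measurement with a prior for stationarity, and the extra noise incurred when finger motion must be discounted shrinks the estimate toward zero, a Filehne-like bias.

```python
import numpy as np

# Bayesian observer with a prior for stationarity: the percept is the
# precision-weighted mean of the measurement and a zero-velocity prior.
# Noise values below are illustrative assumptions.

def perceived_velocity(v_measured, sigma_meas, sigma_prior=5.0):
    """MAP estimate with a Gaussian prior centered on 0 (stationarity)."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_meas**2)
    return w * v_measured  # shrinks toward 0 as measurement noise grows

v_surface = 10.0  # cm/s, true world-centered surface velocity

# Stationary finger: slip is measured directly and reliably.
print(perceived_velocity(v_surface, sigma_meas=1.0))  # ~9.6, near veridical

# Moving finger: surface motion = slip + estimated finger motion,
# so the combined measurement is noisier and the prior dominates.
print(perceived_velocity(v_surface, sigma_meas=4.0))  # ~6.1, biased slow
```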
Affiliation(s)
- Alessandro Moscatelli
- Department of Cognitive Neuroscience, University of Bielefeld, Bielefeld, Germany; Cognitive Interaction Technology Centre of Excellence, University of Bielefeld, Bielefeld, Germany
- Vincent Hayward
- Sorbonne Universités, UPMC Univ Paris 06, UMR 7222, ISIR, F-75005, Paris, France
- Mark Wexler
- CNRS, UMR 7222, ISIR, F-75005, Paris, France; Laboratoire Psychologie de la Perception and CNRS, Université Paris Descartes, F-75006 Paris, France
- Marc O Ernst
- Department of Cognitive Neuroscience, University of Bielefeld, Bielefeld, Germany; Cognitive Interaction Technology Centre of Excellence, University of Bielefeld, Bielefeld, Germany; Multisensory Perception and Action Group, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
61. Andreeva IG. The motion aftereffect as a universal phenomenon in sensory systems involved in space orientation: II. Auditory motion aftereffect. J Evol Biochem Physiol 2015. [DOI: 10.1134/s0022093015030015]
62. Levitan CA, Ban YHA, Stiles NRB, Shimojo S. Rate perception adapts across the senses: evidence for a unified timing mechanism. Sci Rep 2015; 5:8857. [PMID: 25748443] [PMCID: PMC4894401] [DOI: 10.1038/srep08857]
Abstract
The brain constructs a representation of temporal properties of events, such as duration and frequency, but the underlying neural mechanisms are under debate. One open question is whether these mechanisms are unisensory or multisensory. Duration perception studies provide some evidence for a dissociation between auditory and visual timing mechanisms; however, we found active crossmodal interaction between audition and vision for rate perception, even when vision and audition were never stimulated together. After exposure to 5 Hz adaptors, people perceived subsequent test stimuli centered around 4 Hz to be slower, and the reverse after exposure to 3 Hz adaptors. This aftereffect occurred even when the adaptor and test were different modalities that were never presented together. When the discrepancy in rate between adaptor and test increased, the aftereffect was attenuated, indicating that the brain uses narrowly-tuned channels to process rate information. Our results indicate that human timing mechanisms for rate perception are not entirely segregated between modalities and have substantial implications for models of how the brain encodes temporal features. We propose a model of multisensory channels for rate perception, and consider the broader implications of such a model for how the brain encodes timing.
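The narrowly tuned channels proposed here can be illustrated with a toy model. In this sketch (channel spacing, tuning width, and adaptation depth are assumptions, not fitted values), adapting to 5 Hz lowers the gain of channels tuned near 5 Hz; a 4 Hz test is then decoded as slower, while rates far from the adaptor are nearly unaffected, mirroring the attenuation of the aftereffect with increasing adaptor-test discrepancy.

```python
import numpy as np

# Toy bank of narrowly tuned rate channels with a population readout.
# All parameters are illustrative assumptions.

channels = np.arange(1.0, 8.0, 0.5)  # preferred rates (Hz)

def responses(rate, gain, sigma=0.6):
    return gain * np.exp(-0.5 * ((channels - rate) / sigma) ** 2)

def decode(rate, gain):
    r = responses(rate, gain)
    return np.sum(channels * r) / np.sum(r)  # population-vector readout

gain = np.ones_like(channels)

# Adapt at 5 Hz: the gain loss is itself narrowly tuned around the adaptor.
adapt_rate = 5.0
gain_adapted = gain * (1 - 0.5 * np.exp(-0.5 * ((channels - adapt_rate) / 0.6) ** 2))

print(decode(4.0, gain))          # ~4.0 before adaptation
print(decode(4.0, gain_adapted))  # below 4.0: test perceived as slower
print(decode(2.0, gain_adapted))  # ~2.0: distant rates barely affected
```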
Affiliation(s)
- Carmel A Levitan
- Cognitive Science, Occidental College, 1600 Campus Road, Los Angeles, CA 90041
- Yih-Hsin A Ban
- Cognitive Science, Occidental College, 1600 Campus Road, Los Angeles, CA 90041
- Noelle R B Stiles
- Computation and Neural Systems, California Institute of Technology, 1200 E. California Blvd, Pasadena, CA 91125
- Shinsuke Shimojo
- Computation and Neural Systems, California Institute of Technology, 1200 E. California Blvd, Pasadena, CA 91125; Division of Biology and Biological Engineering, California Institute of Technology, 1200 E. California Blvd, Pasadena, CA 91125
63.
Affiliation(s)
- Katherine R Storrs
- Perception Lab, School of Psychology, University of Queensland, Brisbane, QLD, Australia
64. Andreeva IG. The motion aftereffect as a universal phenomenon for sensory systems involved in spatial orientation: I. Visual aftereffects. J Evol Biochem Physiol 2015. [DOI: 10.1134/s0022093014060015]
65. Cuturi L, MacNeilage P. Optic Flow Induces Nonvisual Self-Motion Aftereffects. Curr Biol 2014; 24:2817-2821. [DOI: 10.1016/j.cub.2014.10.015]
66.
Abstract
The manipulation of objects commonly involves motion between object and skin. In this review, we discuss the neural basis of tactile motion perception and its similarities with its visual counterpart. First, much like in vision, the perception of tactile motion relies on the processing of spatiotemporal patterns of activation across populations of sensory receptors. Second, many neurons in primary somatosensory cortex are highly sensitive to motion direction, and the response properties of these neurons draw strong analogies to those of direction-selective neurons in visual cortex. Third, tactile speed may be encoded in the strength of the response of cutaneous mechanoreceptive afferents and of a subpopulation of speed-sensitive neurons in cortex. However, both afferent and cortical responses are strongly dependent on texture as well, so it is unclear how texture and speed signals are disambiguated. Fourth, motion signals from multiple fingers must often be integrated during the exploration of objects, but the way these signals are combined is complex and remains to be elucidated. Finally, visual and tactile motion perception interact powerfully, an integration process that is likely mediated by visual association cortex.
Affiliation(s)
- Yu-Cheng Pei
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital, Taoyuan, Taiwan, Republic of China; Healthy Aging Research Center, Chang Gung University, Taoyuan, Taiwan, Republic of China
- Sliman J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois; and Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois
67. Moscatelli A, Naceri A, Ernst MO. Path integration in tactile perception of shapes. Behav Brain Res 2014; 274:355-364. [PMID: 25151621] [DOI: 10.1016/j.bbr.2014.08.025]
Abstract
Whenever we move the hand across a surface, tactile signals provide information about the relative velocity between the skin and the surface. If the system were able to integrate the tactile velocity information over time, cutaneous touch could provide an estimate of the relative displacement between the hand and the surface. Here, we asked whether humans are able to form a reliable representation of a motion path from tactile cues only, integrating motion information over time. To address this issue, we conducted three experiments using tactile motion and asked participants (1) to estimate the length of a simulated triangle, (2) to reproduce the shape of a simulated triangular path, and (3) to estimate the angle between two line segments. Participants were able to accurately indicate the length of the path, whereas the perceived direction was affected by a direction bias (inward bias). The response pattern was thus qualitatively similar to the ones reported in classical path integration studies involving locomotion. However, we explain the directional biases as the result of a tactile motion aftereffect.
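The computation being probed, integrating tactile velocity over time into a displacement path, can be written out directly. The sketch below is a generic illustration of path integration (not the authors' stimulus or analysis code): velocity samples are accumulated into positions, from which total path length and the angle between two legs are recovered.

```python
import numpy as np

# Path integration from velocity samples: accumulate instantaneous
# velocity into positions, then read out path length and turn angle.
# Stimulus values are illustrative.

dt = 0.01  # s per sample
# Leg 1: 2 s rightward at 5 cm/s; leg 2: 2 s at 120 degrees to leg 1.
theta = np.deg2rad(120)
v1 = np.tile([5.0, 0.0], (200, 1))
v2 = np.tile([5.0 * np.cos(theta), 5.0 * np.sin(theta)], (200, 1))
velocity = np.vstack([v1, v2])                 # cm/s

path = np.cumsum(velocity * dt, axis=0)        # integrate to positions

leg1 = path[199] - path[0]
leg2 = path[-1] - path[199]
length = np.sum(np.linalg.norm(velocity, axis=1) * dt)
angle = np.degrees(np.arccos(np.dot(leg1, leg2) /
                             (np.linalg.norm(leg1) * np.linalg.norm(leg2))))

print(f"total path length: {length:.1f} cm")       # 20.0 cm
print(f"angle between segments: {angle:.0f} deg")  # ~120 deg
```

An inward bias of the kind reported would show up in this readout as a systematic misestimate of the turn angle rather than of the leg lengths.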
Affiliation(s)
- Alessandro Moscatelli
- Cognitive Neuroscience Department, Bielefeld University, 33615 Bielefeld, Germany; Cognitive Interaction Technology-Center of Excellence, Bielefeld University, 33615 Bielefeld, Germany.
- Abdeldjallil Naceri
- Cognitive Neuroscience Department, Bielefeld University, 33615 Bielefeld, Germany; Cognitive Interaction Technology-Center of Excellence, Bielefeld University, 33615 Bielefeld, Germany
- Marc O Ernst
- Cognitive Neuroscience Department, Bielefeld University, 33615 Bielefeld, Germany; Cognitive Interaction Technology-Center of Excellence, Bielefeld University, 33615 Bielefeld, Germany
68. Experience with a talker can transfer across modalities to facilitate lipreading. Atten Percept Psychophys 2014; 75:1359-1365. [PMID: 23955059] [DOI: 10.3758/s13414-013-0534-x]
Abstract
Rosenblum, Miller, and Sanchez (Psychological Science, 18, 392-396, 2007) found that subjects first trained to lip-read a particular talker were then better able to perceive the auditory speech of that same talker, as compared with that of a novel talker. This suggests that the talker experience a perceiver gains in one sensory modality can be transferred to another modality to make that speech easier to perceive. An experiment was conducted to examine whether this cross-sensory transfer of talker experience could occur (1) from auditory to lip-read speech, (2) with subjects not screened for adequate lipreading skill, (3) when both a familiar and an unfamiliar talker are presented during lipreading, and (4) for both old (presentation set) and new words. Subjects were first asked to identify a set of words from a talker. They were then asked to perform a lipreading task from two faces, one of which was of the same talker they heard in the first phase of the experiment. Results revealed that subjects who lip-read from the same talker they had heard performed better than those who lip-read a different talker, regardless of whether the words were old or new. These results add further evidence that learning of amodal talker information can facilitate speech perception across modalities and also suggest that this information is not restricted to previously heard words.
69. Tactile and visual motion direction processing in hMT+/V5. Neuroimage 2014; 84:420-427. [DOI: 10.1016/j.neuroimage.2013.09.004]
70. Matsumiya K. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face. Psychol Sci 2013; 24:2088-2098. [PMID: 24002886] [DOI: 10.1177/0956797613486981]
Abstract
Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imagining a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on a shared representation underlying cross-modal interactions.
71. Pei YC, Chang TY, Lee TC, Saha S, Lai HY, Gomez-Ramirez M, Chou SW, Wong AMK. Cross-modal sensory integration of visual-tactile motion information: instrument design and human psychophysics. Sensors 2013; 13:7212-7223. [PMID: 23727955] [PMCID: PMC3715219] [DOI: 10.3390/s130607212]
Abstract
Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. Because haptic exploration usually involves cross-modal sensory experiences, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile sensory information, as well as how a robotic device infers object information emanating from both vision and touch. In the present study, we developed a novel visual-tactile cross-modal integration stimulator that consists of an LED panel to present visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth into the skin. The apparatus can present cross-modal stimuli in which the spatial locations of visual and tactile stimulation are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile directions were either congruent or incongruent, and human observers reported the perceived direction of visual motion. Results showed that the perceived direction of visual motion can be biased by the direction of tactile motion when visual signals are weakened. The results also showed that visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs, a fundamental property known for cross-modal integration.
Affiliation(s)
- Yu-Cheng Pei
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan
- School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan
- Ting-Yu Chang
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Tsung-Chi Lee
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Sudipta Saha
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Hsin-Yi Lai
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Manuel Gomez-Ramirez
- The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, 3400 N. Charles Street, 338 Krieger Hall, Baltimore, MD 21218, USA
- Shih-Wei Chou
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Alice M. K. Wong
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
72. Konkle T, Moore CI. What can crossmodal aftereffects reveal about neural representation and dynamics? Commun Integr Biol 2012; 2:479-481. [PMID: 22811763] [PMCID: PMC3398893] [DOI: 10.4161/cib.2.6.9344]
Abstract
The brain continuously adapts to incoming sensory stimuli, which can lead to perceptual illusions in the form of aftereffects. Recently we demonstrated that motion aftereffects transfer between vision and touch. Here, the adapted brain state induced by one modality has consequences for processes in another modality, implying that somewhere in the processing stream, visual and tactile motion have shared underlying neural representations. We propose the adaptive processing hypothesis: any area that processes a stimulus adapts to the features of the stimulus it represents, and this adaptation has consequences for perception. On this view there is no single locus of an aftereffect. Rather, aftereffects emerge when the test stimulus used to probe the effect of adaptation requires processing of a given type. The illusion will reflect the properties of the brain area(s) that support that specific level of representation. We further suggest that many cortical areas are more process-dependent than modality-dependent, with crossmodal interactions reflecting shared processing demands in even 'early' sensory cortices.
73. McIntyre S, Holcombe AO, Birznieks I, Seizova-Cajic T. Tactile motion adaptation reduces perceived speed but shows no evidence of direction sensitivity. PLoS One 2012; 7:e45438. [PMID: 23029010] [PMCID: PMC3454433] [DOI: 10.1371/journal.pone.0045438]
Abstract
INTRODUCTION While the directionality of tactile motion processing has been studied extensively, tactile speed processing and its relationship to direction is little researched and poorly understood. We investigated this relationship in humans using the 'tactile speed aftereffect' (tSAE), in which the speed of motion appears slower following prolonged exposure to a moving surface. METHOD We used psychophysical methods to test whether the tSAE is direction sensitive. After adapting to a ridged moving surface with one hand, participants compared the speed of test stimuli on the adapted and unadapted hands. We varied the direction of the adapting stimulus relative to the test stimulus. RESULTS Perceived speed of the surface moving at 81 mm/s was reduced by about 30% regardless of the direction of the adapting stimulus (when adapted in the same direction, mean reduction = 23 mm/s, SD = 11; with the opposite direction, mean reduction = 26 mm/s, SD = 9). In addition to a large reduction in perceived speed due to adaptation, we also report that this effect is not direction sensitive. CONCLUSIONS Tactile motion is susceptible to speed adaptation. This result complements previous reports of reliable direction aftereffects when using a dynamic test stimulus, as together they describe how perception of a moving stimulus in touch depends on the immediate history of stimulation. Given that the tSAE is not direction sensitive, we argue that peripheral adaptation does not explain it, because primary afferents are direction sensitive with friction-creating stimuli like ours (motion in their preferred direction should thus result in greater adaptation, and if perceived speed were critically dependent on these afferents' response intensity, the tSAE should be direction sensitive). The adaptation that reduces perceived speed therefore seems to be of central origin.
Affiliation(s)
- Sarah McIntyre
- Faculty of Health Sciences, University of Sydney, Sydney, Australia.
74. van Elk M, Blanke O. Balancing bistable perception during self-motion. Exp Brain Res 2012; 222:219-228. [DOI: 10.1007/s00221-012-3209-2]
75. Bair WN, Kiemel T, Jeka JJ, Clark JE. Development of multisensory reweighting is impaired for quiet stance control in children with developmental coordination disorder (DCD). PLoS One 2012; 7:e40932. [PMID: 22815872] [PMCID: PMC3399799] [DOI: 10.1371/journal.pone.0040932]
Abstract
Background: Developmental Coordination Disorder (DCD) is a leading movement disorder in children that commonly involves poor postural control. A multisensory integration deficit, especially the inability to adaptively reweight to changing sensory conditions, has been proposed as a possible mechanism but remains insufficiently characterized. Empirical quantification of reweighting significantly advances our understanding of its developmental onset and improves the characterization of how it differs in children with DCD compared to their typically developing (TD) peers. Methodology/Principal Findings: Twenty children with DCD (6.6 to 11.8 years) were tested with a protocol in which a visual scene and a touch bar oscillated simultaneously in the medio-lateral direction at different frequencies and various amplitudes. Their data were compared to data on TD children (4.2 to 10.8 years) from a previous study. Gains and phases were calculated for the medio-lateral responses of the head and center of mass to both sensory stimuli. Gains and phases were simultaneously fitted by linear functions of age for each amplitude condition, segment, modality, and group. Fitted gains and phases at two comparison ages (6.6 and 10.8 years) were tested for reweighting within each group and for group differences. Children with DCD reweight touch and vision at a later age (10.8 years) than their TD peers (4.2 years). Children with DCD demonstrate weak visual reweighting, no advanced multisensory fusion, and phase lags larger than those of TD children in response to both touch and vision. Conclusions/Significance: Two developmental perspectives, postural body scheme and dorsal stream development, are offered to explain the weak visual reweighting. The lack of multisensory fusion supports the notion that optimal multisensory integration is a slow developmental process and is vulnerable in children with DCD.
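The gain-and-phase analysis described here is a standard frequency-response computation. As a rough illustration (simulated sway with assumed amplitudes, stimulus frequency, and lag; not the study's data), one can project the stimulus and the postural response onto the stimulus frequency and take the ratio of the complex coefficients:

```python
import numpy as np

# Gain and phase of sway relative to a sinusoidal sensory stimulus,
# estimated by projection onto the stimulus frequency. Illustrative only.

fs, f_stim, T = 100.0, 0.2, 60.0  # sample rate (Hz), stimulus (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
stimulus = 0.5 * np.sin(2 * np.pi * f_stim * t)        # cm
sway = 0.3 * np.sin(2 * np.pi * f_stim * t - 0.6)      # cm, lagging response
sway = sway + 0.05 * np.random.default_rng(1).standard_normal(t.size)

def fourier_coeff(x, f):
    """Complex amplitude of x at frequency f (integer periods assumed)."""
    return 2 * np.mean(x * np.exp(-2j * np.pi * f * t))

H = fourier_coeff(sway, f_stim) / fourier_coeff(stimulus, f_stim)
print(f"gain : {abs(H):.2f}")                       # ~0.60 (= 0.3/0.5)
print(f"phase: {np.degrees(np.angle(H)):.1f} deg")  # ~ -34 deg (lag)
```

Reweighting then appears as a change in these gains across stimulus amplitudes: down-weighting an unreliable cue lowers the gain to that cue.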
Affiliation(s)
- Woei-Nan Bair
- Department of Physical Therapy and Rehabilitation Science, University of Maryland, Baltimore, Baltimore, Maryland, United States of America.
76.
Abstract
How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory features that optimally explain the unisensory features arising in individual sensory modalities. The model qualitatively accounts for several important aspects of multisensory perception: (a) it integrates information from multiple sensory sources in such a way that it leads to superior performance in, for example, categorization tasks; (b) its performance suggests that multisensory training leads to better learning than unisensory training, even when testing is conducted in unisensory conditions; (c) its multisensory representations are modality invariant; and (d) it predicts "missing" sensory representations in modalities when the input to those modalities is absent. Our rational analysis indicates that all of these aspects emerge as part of the optimal solution to the problem of learning to represent complex multisensory environments.
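Bayesian nonparametric latent feature models of this kind are often built on an Indian buffet process (IBP) prior, which lets the number of latent multisensory features grow with the data; whether this exact prior matches the paper's construction is an assumption here, so treat the sketch below as a generic illustration of the model class.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ibp(num_objects, alpha=2.0):
    """Sample a binary object-by-feature ownership matrix from an
    Indian buffet process prior: the feature count is unbounded."""
    Z = np.zeros((num_objects, 0), dtype=int)
    for n in range(num_objects):
        if Z.shape[1] > 0:
            # Join each existing feature with prob (#owners)/(n+1).
            probs = Z[:n].sum(axis=0) / (n + 1)
            Z[n] = rng.random(Z.shape[1]) < probs
        # Introduce Poisson(alpha/(n+1)) brand-new features.
        k_new = rng.poisson(alpha / (n + 1))
        if k_new > 0:
            new_cols = np.zeros((num_objects, k_new), dtype=int)
            new_cols[n] = 1
            Z = np.hstack([Z, new_cols])
    return Z

# Rows are objects; columns are latent features that could jointly
# explain correlated visual and haptic observations of each object.
Z = sample_ibp(8)
print(Z)
print("latent features drawn from the prior:", Z.shape[1])
```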
Affiliation(s)
- Ilker Yildirim
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627, USA
77. Kuroki S, Watanabe J, Mabuchi K, Tachi S, Nishida S. Directional remapping in tactile inter-finger apparent motion: a motion aftereffect study. Exp Brain Res 2011; 216:311-320. [PMID: 22080151] [DOI: 10.1007/s00221-011-2936-0]
Abstract
Tactile motion provides critical information for the perception and manipulation of objects in touch. Perceived directions of tactile motion are primarily defined in the environmental coordinate frame, which means they change drastically with body posture even when the same skin sensors are stimulated. Despite the ecological importance of this perceptual constancy, the sensory processing underlying tactile directional remapping remains poorly understood. The present study psychophysically investigated directional remapping in human tactile motion processing by examining whether finger posture modulates the direction of the tactile motion aftereffect (MAE) induced by inter-finger apparent motion. We introduced conflicts in the adaptation direction between somatotopic and environmental spaces by having participants change their finger posture between adaptation and test phases. In a critical condition, they touched the stimulators with crossed index and middle fingers during adaptation but with uncrossed fingers during tests. Since the adaptation effect was incongruent between the somatotopic and environmental spaces, the direction of the MAE reveals the coordinate frame of tactile motion processing. The results demonstrated that the tactile MAE was induced in accordance with the motion direction defined in the environmental rather than the somatotopic space. In addition, even though the physical adaptation of the test fingers was unchanged, the tactile MAE disappeared when the adaptation stimuli were vertically aligned or when subjective motion perception was suppressed during adaptation. We also found that the tactile MAE, measured with our procedure, did not transfer across hands, which implies that the observed MAEs mainly reflect neural adaptation occurring within sensor-specific, tactile-specific processing. The present findings provide a novel behavioral method for analyzing the neural representation underlying directional remapping of tactile motion in the human brain.
Affiliation(s)
- Scinob Kuroki
- NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, 3-1 Morinosato-Wakamiya, Atsugi, Kanagawa 243-0198, Japan.
78. Tomassini A, Gori M, Burr D, Sandini G, Morrone MC. Perceived duration of visual and tactile stimuli depends on perceived speed. Front Integr Neurosci 2011; 5:51. [PMID: 21941471] [PMCID: PMC3170919] [DOI: 10.3389/fnint.2011.00051]
Abstract
It is known that the perceived duration of visual stimuli is strongly influenced by speed: faster moving stimuli appear to last longer. To test whether this is a general property of sensory systems we asked participants to reproduce the duration of visual and tactile gratings, and visuo-tactile gratings, moving at a variable speed (3.5-15 cm/s) for three different durations (400, 600, and 800 ms). For both modalities, the apparent duration of the stimulus increased strongly with stimulus speed, more so for tactile than for visual stimuli. In addition, visual stimuli were perceived to last approximately 200 ms longer than tactile stimuli. The apparent duration of visuo-tactile stimuli lay between the unimodal estimates, as the Bayesian account predicts, but the bimodal precision of the reproduction did not show the theoretical improvement. A cross-modal speed-matching task revealed that visual stimuli were perceived to move faster than tactile stimuli. To test whether the large difference in the perceived duration of visual and tactile stimuli resulted from the difference in their perceived speed, we repeated the time reproduction task with visual and tactile stimuli matched in apparent speed. This reduced, but did not completely eliminate, the difference in apparent duration. These results show that for both vision and touch, perceived duration depends on speed, pointing to common strategies of time perception.
Affiliation(s)
- Alice Tomassini
- Department of Robotics, Brain and Cognitive Sciences, Istituto Italiano di Tecnologia, Genova, Italy
79. Burr D, Thompson P. Motion psychophysics: 1985–2010. Vision Res 2011; 51:1431-1456. [PMID: 21324335] [DOI: 10.1016/j.visres.2011.02.008]
Affiliation(s)
- David Burr
- Department of Psychology, University of Florence, Florence, Italy.
80. Gori M, Mazzilli G, Sandini G, Burr D. Cross-Sensory Facilitation Reveals Neural Interactions between Visual and Tactile Motion in Humans. Front Psychol 2011; 2:55. [PMID: 21734892] [PMCID: PMC3110703] [DOI: 10.3389/fpsyg.2011.00055]
Abstract
Many recent studies show that the human brain integrates information across the different senses and that stimuli of one sensory modality can enhance the perception of other modalities. Here we study the processes that mediate cross-modal facilitation and summation between visual and tactile motion. We find that while summation produced a generic, non-specific improvement of thresholds, probably reflecting higher-order interaction of decision signals, facilitation reveals a strong, direction-specific interaction, which we believe reflects sensory interactions. We measured visual and tactile velocity discrimination thresholds over a wide range of base velocities and conditions. Thresholds for both visual and tactile stimuli showed the characteristic “dipper function,” with the minimum thresholds occurring at a given “pedestal speed.” When visual and tactile coherent stimuli were combined (summation condition) the thresholds for these multisensory stimuli also showed a “dipper function,” with the minimum thresholds occurring in a similar range to that for unisensory signals. However, the improvement of multisensory thresholds was weak, not directionally specific, and well predicted by the maximum-likelihood estimation model (agreeing with previous research). A different technique (facilitation) did, however, reveal direction-specific enhancement. Adding a non-informative “pedestal” motion stimulus in one sensory modality (vision or touch) selectively lowered thresholds in the other, by the same amount as pedestals in the same modality. Facilitation did not occur for neutral stimuli like sounds (which would also have reduced temporal uncertainty), nor for motion in the opposite direction, even in blocked trials where the subjects knew that the motion was in the opposite direction, showing that the facilitation was not under subject control. Cross-sensory facilitation is strong evidence for functionally relevant cross-sensory integration at early levels of sensory processing.
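The maximum-likelihood estimation (MLE) prediction invoked for the summation data has a simple closed form: the bimodal threshold follows from precision-weighted fusion of the unisensory estimates. A minimal sketch with illustrative threshold values (not the paper's data):

```python
import numpy as np

# MLE cue combination: if unisensory thresholds are proportional to the
# noise SDs sigma_v (vision) and sigma_t (touch), the optimal bimodal
# threshold is their precision-weighted combination.

def mle_threshold(sigma_v, sigma_t):
    return np.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))

sigma_v, sigma_t = 1.2, 1.5  # illustrative unisensory thresholds (cm/s)
print(mle_threshold(sigma_v, sigma_t))  # ~0.94: better than either alone

# Equal reliabilities give the classic sqrt(2) improvement:
print(mle_threshold(1.0, 1.0))          # ~0.71 = 1/sqrt(2)
```

Note the modest predicted gain: with unequal reliabilities the bimodal threshold sits only slightly below the better unisensory one, consistent with the weak summation the authors report.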
Affiliation(s)
- Monica Gori
- Istituto Italiano di Tecnologia, Robotics, Brain and Cognitive Sciences, Genova, Italy
81. Butz MV, Thomaschke R, Linhardt MJ, Herbort O. Remapping motion across modalities: tactile rotations influence visual motion judgments. Exp Brain Res 2010; 207:1-11. [DOI: 10.1007/s00221-010-2420-2]
82. Shams L, Kim R. Crossmodal influences on visual perception. Phys Life Rev 2010; 7:269-284. [DOI: 10.1016/j.plrev.2010.04.006]
83. Pei YC, Hsiao SS, Craig JC, Bensmaia SJ. Shape invariant coding of motion direction in somatosensory cortex. PLoS Biol 2010; 8:e1000305. [PMID: 20126380] [PMCID: PMC2814823] [DOI: 10.1371/journal.pbio.1000305]
Abstract
A subpopulation of neurons in primate somatosensory cortex signals the direction in which objects move across the skin of the fingertips. Invariant representations of stimulus features are thought to play an important role in producing stable percepts of objects. In the present study, we assess the invariance of neural representations of tactile motion direction with respect to other stimulus properties. To this end, we record the responses evoked in individual neurons in somatosensory cortex of primates, including areas 3b, 1, and 2, by three types of motion stimuli, namely scanned bars and dot patterns, and random dot displays, presented to the fingertips of macaque monkeys. We identify a population of neurons in area 1 that is highly sensitive to the direction of stimulus motion and whose motion signals are invariant across stimulus types and conditions. The motion signals conveyed by individual neurons in area 1 can account for the ability of human observers to discriminate the direction of motion of these stimuli, as measured in paired psychophysical experiments. We conclude that area 1 contains a robust representation of motion and discuss similarities in the neural mechanisms of visual and tactile motion processing.

When we physically interact with an object, our hands convey information about the shape of the object, its texture, its compliance, and its thermal properties. This information allows us to manipulate tools and to recognize objects based on tactile exploration alone. One of the hallmarks of tactile object recognition is that it involves movement between the skin and the object. In this study, we investigate how the direction in which objects move relative to the skin is represented in the brain. Specifically, we scan a variety of stimuli, including bars and dot patterns, across the fingers of non-human primates while recording the evoked neuronal activity. We find that a population of neurons in somatosensory cortex encodes the direction of moving stimuli regardless of the shape of the stimuli, the speed at which they are scanned across the skin, or the force with which they contact the skin. We show that these neurons can account for our ability to perceive the direction of motion of tactile stimuli.
Affiliation(s)
- Yu-Cheng Pei
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America
- Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital and Chang Gung University, Taiwan
- Steven S. Hsiao
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America
- Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America
- James C. Craig
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America
- Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Sliman J. Bensmaia
- Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America
- Solomon H. Snyder Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois, United States of America
84. Wang Q, Hayward V. Biomechanically Optimized Distributed Tactile Transducer Based on Lateral Skin Deformation. Int J Rob Res 2009. [DOI: 10.1177/0278364909345289]
Abstract
In this paper we describe a tactile transducer device that is optimized from biomechanical data and that has a compact, yet modular design. The tactile transducer comprises a 6 × 10 piezoelectric bimorph actuator array with a spatial resolution of 1.8 mm × 1.2 mm and has a wide temporal bandwidth. The actuator mounting method was improved from a conventional cantilever method to a dual-pinned method, giving the actuator the ability to deform the glabrous skin maximally during laterotactile stimulation. The results were validated by asking subjects to detect tactile features under a wide range of operating conditions. The tactile display device is modular, makes use of ordinary fabrication methods, and can be assembled and dismantled in a short time for debugging and maintenance. It weighs 60 g, is self-contained in a 150 cm³ volume, and may be interfaced to most computers, provided that two analog outputs and six digital I/O lines are available. Psychophysical experiments were carried out to assess its effectiveness in rendering virtual tactile features.
Affiliation(s)
- Qi Wang
- Haptics Laboratory, Center for Intelligent Machines, McGill University, Montréal, Canada
- Vincent Hayward
- Haptics Laboratory, Center for Intelligent Machines, McGill University, Montréal, Canada