1. Wang G, Alais D. Tactile adaptation to orientation produces a robust tilt aftereffect and exhibits crossmodal transfer when tested in vision. Sci Rep 2024; 14:10164. [PMID: 38702338; PMCID: PMC11068783; DOI: 10.1038/s41598-024-60343-9]
Abstract
Orientation processing is one of the most fundamental functions in both visual and somatosensory perception. Converging findings suggest that orientation processing in the two modalities is closely linked: somatosensory neurons share an orientation organisation similar to that of visual neurons, and the visual cortex has been found to be heavily involved in tactile orientation perception. Given this connection, we hypothesized that somatosensation would exhibit a similar orientation adaptation effect and that this adaptation would transfer between the two modalities. The tilt aftereffect (TAE) is a demonstration of orientation adaptation and is used widely in behavioural experiments to investigate orientation mechanisms in vision. By testing the classic TAE paradigm in both tactile and crossmodal orientation tasks between vision and touch, we show that tactile perception of orientation exhibits a very robust TAE, similar to its visual counterpart. We further show that orientation adaptation in touch transfers to produce a TAE when tested in vision, but not vice versa. Additionally, when examining the test sequence following adaptation for serial effects, we observed another asymmetry between the two conditions: the visual test sequence displayed a repulsive intramodal serial dependence effect, while the tactile test sequence exhibited an attractive serial dependence. These findings provide concrete evidence that vision and touch engage a similar orientation processing mechanism. However, the asymmetry in the crossmodal transfer of the TAE and in serial dependence points to a non-reciprocal connection between the two modalities, providing further insight into the underlying processing mechanism.
Affiliation(s)
- Guandong Wang
- School of Psychology, The University of Sydney, Sydney, Australia
- David Alais
- School of Psychology, The University of Sydney, Sydney, Australia

2. Bola Ł, Vetter P, Wenger M, Amedi A. Decoding Reach Direction in Early "Visual" Cortex of Congenitally Blind Individuals. J Neurosci 2023; 43:7868-7878. [PMID: 37783506; PMCID: PMC10648511; DOI: 10.1523/jneurosci.0376-23.2023]
Abstract
Motor actions, such as reaching or grasping, can be decoded from fMRI activity of early visual cortex (EVC) in sighted humans. This effect can depend on vision or visual imagery, or alternatively, could be driven by mechanisms independent of visual experience. Here, we show that the actions of reaching in different directions can be reliably decoded from fMRI activity of EVC in congenitally blind humans (both sexes). Thus, neither visual experience nor visual imagery is necessary for EVC to represent action-related information. We also demonstrate that, within EVC of blind humans, the accuracy of reach direction decoding is highest in areas typically representing foveal vision and gradually decreases in areas typically representing peripheral vision. We propose that this might indicate the existence of a predictive, hard-wired mechanism of aligning action and visual spaces. This mechanism might send action-related information primarily to the high-resolution foveal visual areas, which are critical for guiding and online correction of motor actions. Finally, we show that, beyond EVC, the decoding of reach direction in blind humans is most accurate in dorsal stream areas known to be critical for visuo-spatial and visuo-motor integration in the sighted. Thus, these areas can develop space and action representations even in the lifelong absence of vision. Overall, our findings in congenitally blind humans match previous research on the action system in the sighted, and suggest that the development of action representations in the human brain might be largely independent of visual experience.

Significance Statement: Early visual cortex (EVC) was traditionally thought to process only visual signals from the retina. Recent studies proved this account incomplete, and showed EVC involvement in many activities not directly related to incoming visual information, such as memory, sound, or action processing. Is EVC involved in these activities because of visual imagery? Here, we show robust reach direction representation in EVC of humans born blind. This demonstrates that EVC can represent actions independently of vision and visual imagery. Beyond EVC, we found that reach direction representation in blind humans is strongest in dorsal brain areas, critical for action processing in the sighted. This suggests that the development of action representations in the human brain is largely independent of visual experience.
Affiliation(s)
- Łukasz Bola
- Institute of Psychology, Polish Academy of Sciences, Warsaw, 00-378, Poland
- Petra Vetter
- Visual & Cognitive Neuroscience Lab, Department of Psychology, University of Fribourg, Fribourg, 1700, Switzerland
- Mohr Wenger
- Department of Medical Neurobiology, Faculty of Medicine, Hebrew University Jerusalem, Jerusalem, Israel, 91120
- Amir Amedi
- Department of Medical Neurobiology, Faculty of Medicine, Hebrew University Jerusalem, Jerusalem, Israel, 91120
- Baruch Ivcher Institute for Brain, Cognition & Technology, Baruch Ivcher School of Psychology, Reichman University, Interdisciplinary Center Herzliya, Herzliya, Israel, 461010

3. Vision- and touch-dependent brain correlates of body-related mental processing. Cortex 2022; 157:30-52. [PMID: 36272330; DOI: 10.1016/j.cortex.2022.09.005]
Abstract
In humans, the nature of sensory input influences body-related mental processing. For instance, behavioral differences (e.g., response time) can be found between mental spatial transformations (e.g., mental rotation) of viewed and touched body parts. It can thus be hypothesized that distinct brain activation patterns are associated with such sensory-dependent body-related mental processing. However, direct evidence that the neural correlates of body-related mental processing can be modulated by the nature of the sensory stimuli is still missing. We thus analyzed event-related functional magnetic resonance imaging (fMRI) data from thirty-one healthy participants performing mental rotation of visually- (images) and haptically-presented (plastic) hands. We also dissociated the neural activity related to rotation or task-related performance using models that either regressed out or included the variance associated with response time. Haptically-mediated mental rotation recruited mostly the sensorimotor brain network. Visually-mediated mental rotation led to parieto-occipital activations. In addition, faster mental rotation was associated with sensorimotor activity, while slower mental rotation was associated with parieto-occipital activations. The fMRI results indicated that changing the type of sensory inputs modulates the neural correlates of body-related mental processing. These findings suggest that distinct sensorimotor brain dynamics can be exploited to execute similar tasks depending on the available sensory input. The present study can contribute to a better evaluation of body-related mental processing in experimental and clinical settings.

4. Pramudya RC, Seo HS. Hand-Feel Touch Cues and Their Influences on Consumer Perception and Behavior with Respect to Food Products: A Review. Foods 2019; 8:259. [PMID: 31311188; PMCID: PMC6678767; DOI: 10.3390/foods8070259]
Abstract
There has been a great deal of research investigating intrinsic/extrinsic cues and their influences on consumer perception and purchasing decisions at points of sale, product usage, and consumption. Consumers form expectations toward a food product through sensory information extracted from its surface (intrinsic cues) or packaging (extrinsic cues) at retail stores. Packaging is one of the important extrinsic cues that can modulate consumer perception, liking, and decision making for a product. For example, handling a product's packaging, even just touching it while opening or holding it during consumption, may shape a consumer's expectation of the package content. Although hand-feel touch cues are an integral part of the food consumption experience, little is known about their influences on consumer perception, acceptability, and purchase behavior of food products. This review therefore provides a better understanding of hand-feel touch cues and their influences in the context of food and beverage experience, with a focus on (1) an overview of touch as a sensory modality, (2) factors influencing hand-feel perception, (3) influences of hand-feel touch cues on the perception of other sensory modalities, and (4) the effects of hand-feel touch cues on emotional responses and purchase behavior.
Affiliation(s)
- Ragita C Pramudya
- Department of Food Science, University of Arkansas, 2650 North Young Avenue, Fayetteville, AR 72704, USA
- Han-Seok Seo
- Department of Food Science, University of Arkansas, 2650 North Young Avenue, Fayetteville, AR 72704, USA

5. Prieto A, Mayas J, Ballesteros S. Behavioral and electrophysiological correlates of interactions between grouping principles in touch: Evidence from psychophysical indirect tasks. Neuropsychologia 2019; 129:21-36. [PMID: 30879999; DOI: 10.1016/j.neuropsychologia.2019.03.005]
Abstract
In two experiments, we investigated the behavioral and brain correlates of the interactions between spatial-proximity and texture-similarity grouping principles in touch. We designed two adaptations of the repetition discrimination task (RDT) previously used in vision. This task provides an indirect measure of grouping that does not require explicit attention to the grouping process. In Experiment 1, participants were presented with a row of elements alternating in texture except for one pair in which the same texture was repeated. Participants had to decide whether the repeated-texture stimuli (similarity grouping) were smooth or rough, while the spatial proximity between targets and distractors was varied either to facilitate or hinder the response. In Experiment 2, participants indicated which cohort (proximity grouping) contained more elements, while texture-similarity within and between cohorts was modified. The results indicated additive effects of grouping cues, with proximity dominating the perceptual grouping process when the two principles acted together. In addition, independent component analysis (ICA) of the electrophysiological data revealed the involvement of a widespread network of sensorimotor, prefrontal, parietal, and occipital brain areas in both experiments.
Affiliation(s)
- Antonio Prieto
- Departamento de Psicología Básica II, Studies on Aging and Neurodegenerative Diseases Research Group, Spain
- Julia Mayas
- Departamento de Psicología Básica II, Studies on Aging and Neurodegenerative Diseases Research Group, Spain
- Soledad Ballesteros
- Departamento de Psicología Básica II, Studies on Aging and Neurodegenerative Diseases Research Group, Spain

6. Tivadar RI, Rouillard T, Chappaz C, Knebel JF, Turoman N, Anaflous F, Roche J, Matusz PJ, Murray MM. Mental Rotation of Digitally-Rendered Haptic Objects. Front Integr Neurosci 2019; 13:7. [PMID: 30930756; PMCID: PMC6427928; DOI: 10.3389/fnint.2019.00007]
Abstract
Sensory substitution is an effective means to rehabilitate many visual functions after visual impairment or blindness. Tactile information, for example, is particularly useful for functions such as reading, mental rotation, shape recognition, or exploration of space. Extant haptic technologies typically rely on real physical objects or pneumatically driven renderings and thus provide a limited library of stimuli to users. New developments in digital haptic technologies now make it possible to actively simulate an unprecedented range of tactile sensations. We provide a proof-of-concept for a new type of technology (hereafter haptic tablet) that renders haptic feedback by modulating the friction of a flat screen through ultrasonic vibrations of varying shapes to create the sensation of texture when the screen is actively explored. We reasoned that participants should be able to create mental representations of letters presented in normal and mirror-reversed haptic form without the use of any visual information and to manipulate such representations in a mental rotation task. Healthy sighted, blindfolded volunteers were trained to discriminate between two letters (either L and P, or F and G; counterbalanced across participants) on a haptic tablet. They then tactually explored all four letters in normal or mirror-reversed form at different rotations (0°, 90°, 180°, and 270°) and indicated letter form (i.e., normal or mirror-reversed) by pressing one of two mouse buttons. We observed the typical effect of rotation angle on object discrimination performance (i.e., greater deviation from 0° resulted in worse performance) for trained letters, consistent with mental rotation of these haptically-rendered objects. We likewise observed generally slower and less accurate performance with mirror-reversed compared to prototypically oriented stimuli. 
Our findings extend existing research in multisensory object recognition by indicating that a new technology simulating active haptic feedback can support the generation and spatial manipulation of mental representations of objects. Thus, such haptic tablets can offer a new avenue to mitigate visual impairments and train skills dependent on mental object-based representations and their spatial manipulation.
Affiliation(s)
- Ruxandra I. Tivadar
- The Laboratory for Investigative Neurophysiology (LINE), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Department of Ophthalmology, Fondation Asile des Aveugles, Lausanne, Switzerland
- Jean-François Knebel
- The Laboratory for Investigative Neurophysiology (LINE), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Electroencephalography Brain Mapping Core, Center for Biomedical Imaging (CIBM) of Lausanne and Geneva, Lausanne, Switzerland
- Nora Turoman
- The Laboratory for Investigative Neurophysiology (LINE), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Fatima Anaflous
- Department of Ophthalmology, Fondation Asile des Aveugles, Lausanne, Switzerland
- Jean Roche
- Department of Ophthalmology, Fondation Asile des Aveugles, Lausanne, Switzerland
- Pawel J. Matusz
- The Laboratory for Investigative Neurophysiology (LINE), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Information Systems Institute at the University of Applied Sciences Western Switzerland (HES-SO Valais), Sierre, Switzerland
- Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, United States
- Micah M. Murray
- The Laboratory for Investigative Neurophysiology (LINE), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Department of Ophthalmology, Fondation Asile des Aveugles, Lausanne, Switzerland
- Electroencephalography Brain Mapping Core, Center for Biomedical Imaging (CIBM) of Lausanne and Geneva, Lausanne, Switzerland
- Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, United States

7. Testing the perceptual equivalence hypothesis in mental rotation of 3D stimuli with visual and tactile input. Exp Brain Res 2018; 236:881-896. [DOI: 10.1007/s00221-018-5172-z]

8. Sander TH, Zhou B. Linking neuroimaging signals to behavioral responses in single cases: Challenges and opportunities. Psych J 2016; 5:161-169. [DOI: 10.1002/pchj.143]
Affiliation(s)
- Bin Zhou
- Key Laboratory of Behavioral Sciences, Institute of Psychology, Chinese Academy of Sciences, Beijing, China

9. Araneda R, Renier LA, Rombaux P, Cuevas I, De Volder AG. Cortical Plasticity and Olfactory Function in Early Blindness. Front Syst Neurosci 2016; 10:75. [PMID: 27625596; PMCID: PMC5003898; DOI: 10.3389/fnsys.2016.00075]
Abstract
Over the last decade, functional brain imaging has provided insight into maturation processes and helped elucidate the pathophysiological mechanisms involved in brain plasticity in the absence of vision. In congenital blindness, drastic changes occur within the deafferented “visual” cortex, which starts receiving and processing non-visual inputs, including olfactory stimuli. This functional reorganization of the occipital cortex gives rise to compensatory perceptual and cognitive mechanisms that help blind persons achieve perceptual tasks, leading to superior olfactory abilities in these subjects. This view receives support from psychophysical testing, volumetric measurements, and functional brain imaging studies in humans, which are presented here.
Affiliation(s)
- Rodrigo Araneda
- Institute of Neuroscience (IoNS), Université catholique de Louvain, Brussels, Belgium
- Laurent A Renier
- Institute of Neuroscience (IoNS), Université catholique de Louvain, Brussels, Belgium
- Philippe Rombaux
- Institute of Neuroscience (IoNS), Université catholique de Louvain, Brussels, Belgium; Department of Otorhinolaryngology, Cliniques Universitaires Saint-Luc, Brussels, Belgium
- Isabel Cuevas
- Laboratorio de Neurociencias, Escuela de Kinesiología, Facultad de Ciencias, Pontificia Universidad Católica de Valparaíso, Valparaíso, Chile
- Anne G De Volder
- Institute of Neuroscience (IoNS), Université catholique de Louvain, Brussels, Belgium

10. Sathian K. Analysis of haptic information in the cerebral cortex. J Neurophysiol 2016; 116:1795-1806. [PMID: 27440247; DOI: 10.1152/jn.00546.2015]
Abstract
Haptic sensing of objects acquires information about a number of properties. This review summarizes current understanding about how these properties are processed in the cerebral cortex of macaques and humans. Nonnoxious somatosensory inputs, after initial processing in primary somatosensory cortex, are partially segregated into different pathways. A ventrally directed pathway carries information about surface texture into parietal opercular cortex and thence to medial occipital cortex. A dorsally directed pathway transmits information regarding the location of features on objects to the intraparietal sulcus and frontal eye fields. Shape processing occurs mainly in the intraparietal sulcus and lateral occipital complex, while orientation processing is distributed across primary somatosensory cortex, the parietal operculum, the anterior intraparietal sulcus, and a parieto-occipital region. For each of these properties, the respective areas outside primary somatosensory cortex also process corresponding visual information and are thus multisensory. Consistent with the distributed neural processing of haptic object properties, tactile spatial acuity depends on interaction between bottom-up tactile inputs and top-down attentional signals in a distributed neural network. Future work should clarify the roles of the various brain regions and how they interact at the network level.
Affiliation(s)
- K Sathian
- Departments of Neurology, Rehabilitation Medicine and Psychology, Emory University, Atlanta, Georgia; Center for Visual and Neurocognitive Rehabilitation, Atlanta Department of Veterans Affairs Medical Center, Decatur, Georgia

11. Yau JM, Kim SS, Thakur PH, Bensmaia SJ. Feeling form: the neural basis of haptic shape perception. J Neurophysiol 2016; 115:631-642. [PMID: 26581869; PMCID: PMC4752307; DOI: 10.1152/jn.00598.2015]
Abstract
The tactile perception of the shape of objects critically guides our ability to interact with them. In this review, we describe how shape information is processed as it ascends the somatosensory neuraxis of primates. At the somatosensory periphery, spatial form is represented in the spatial patterns of activation evoked across populations of mechanoreceptive afferents. In the cerebral cortex, neurons respond selectively to particular spatial features, like orientation and curvature. While feature selectivity of neurons in the earlier processing stages can be understood in terms of linear receptive field models, higher order somatosensory neurons exhibit nonlinear response properties that result in tuning for more complex geometrical features. In fact, tactile shape processing bears remarkable analogies to its visual counterpart and the two may rely on shared neural circuitry. Furthermore, one of the unique aspects of primate somatosensation is that it contains a deformable sensory sheet. Because the relative positions of cutaneous mechanoreceptors depend on the conformation of the hand, the haptic perception of three-dimensional objects requires the integration of cutaneous and proprioceptive signals, an integration that is observed throughout somatosensory cortex.
Affiliation(s)
- Jeffrey M Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas
- Sung Soo Kim
- Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia
- Sliman J Bensmaia
- Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois

12.
Abstract
The visual and haptic perceptual systems are understood to share a common neural representation of object shape. A region thought to be critical for recognizing visual and haptic shape information is the lateral occipital complex (LOC). We investigated whether LOC is essential for haptic shape recognition in humans by studying behavioral responses and brain activation for haptically explored objects in a patient (M.C.) with bilateral lesions of the occipitotemporal cortex, including LOC. Despite severe deficits in recognizing objects using vision, M.C. was able to accurately recognize objects via touch. M.C.'s psychophysical response profile to haptically explored shapes was also indistinguishable from controls. Using fMRI, M.C. showed no object-selective visual or haptic responses in LOC, but her pattern of haptic activation in other brain regions was remarkably similar to healthy controls. Although LOC is routinely active during visual and haptic shape recognition tasks, it is not essential for haptic recognition of object shape.

Significance Statement: The lateral occipital complex (LOC) is a brain region regarded as critical for recognizing object shape, both in vision and in touch. However, causal evidence linking LOC with haptic shape processing is lacking. We studied recognition performance, psychophysical sensitivity, and brain response to touched objects in a patient (M.C.) with extensive lesions involving LOC bilaterally. Despite being severely impaired in visual shape recognition, M.C. was able to identify objects via touch, and she showed normal sensitivity to a haptic shape illusion. M.C.'s brain response to touched objects in areas of undamaged cortex was also very similar to that observed in neurologically healthy controls. These results demonstrate that LOC is not necessary for recognizing objects via touch.

13. Tal Z, Geva R, Amedi A. The origins of metamodality in visual object area LO: Bodily topographical biases and increased functional connectivity to S1. Neuroimage 2015; 127:363-375. [PMID: 26673114; PMCID: PMC4758827; DOI: 10.1016/j.neuroimage.2015.11.058]
Abstract
Recent evidence from blind participants suggests that visual areas are task-oriented and sensory modality input independent rather than sensory-specific to vision. Specifically, visual areas are thought to retain their functional selectivity when using non-visual inputs (touch or sound) even without having any visual experience. However, this theory is still controversial since it is not clear whether this also characterizes the sighted brain, and whether the reported results in the sighted reflect basic fundamental a-modal processes or are an epiphenomenon to a large extent. In the current study, we addressed these questions using a series of fMRI experiments aimed to explore visual cortex responses to passive touch on various body parts and the coupling between the parietal and visual cortices as manifested by functional connectivity. We show that passive touch robustly activated the object selective parts of the lateral–occipital (LO) cortex while deactivating almost all other occipital–retinotopic-areas. Furthermore, passive touch responses in the visual cortex were specific to hand and upper trunk stimulations. Psychophysiological interaction (PPI) analysis suggests that LO is functionally connected to the hand area in the primary somatosensory homunculus (S1), during hand and shoulder stimulations but not to any of the other body parts. We suggest that LO is a fundamental hub that serves as a node between visual-object selective areas and S1 hand representation, probably due to the critical evolutionary role of touch in object recognition and manipulation. These results might also point to a more general principle suggesting that recruitment or deactivation of the visual cortex by other sensory input depends on the ecological relevance of the information conveyed by this input to the task/computations carried out by each area or network. 
This is likely to rely on the unique and differential pattern of connectivity for each visual area with the rest of the brain.

Highlights: We studied cross-modal effects of passive somatosensory inputs on the visual cortex. Passive touch on the body evoked massive deactivation in the visual cortex. Passive hand stimulation evoked unique activation in visual object area LO. This area was also uniquely connected to the hand area in Penfield's homunculus (S1).
Affiliation(s)
- Zohar Tal
- Department of Medical Neurobiology, Institute of Medical Research Israel-Canada (IMRIC), Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- Ran Geva
- Department of Medical Neurobiology, Institute of Medical Research Israel-Canada (IMRIC), Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- Amir Amedi
- Department of Medical Neurobiology, Institute of Medical Research Israel-Canada (IMRIC), Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel; The Edmond and Lily Safra Center for Brain Science (ELSC), The Hebrew University of Jerusalem, Jerusalem 91220, Israel; Program of Cognitive Science, The Hebrew University of Jerusalem, Jerusalem 91220, Israel

14. Occelli V, Lacey S, Stephens C, John T, Sathian K. Haptic Object Recognition is View-Independent in Early Blind but not Sighted People. Perception 2015; 45:337-345. [PMID: 26562881; DOI: 10.1177/0301006615614489]
Abstract
Object recognition, whether visual or haptic, is impaired in sighted people when objects are rotated between learning and test, relative to an unrotated condition; that is, recognition is view-dependent. Loss of vision early in life results in greater reliance on haptic perception for object identification compared with the sighted. Therefore, we hypothesized that early blind people may be more adept at recognizing objects despite spatial transformations. To test this hypothesis, we compared early blind and sighted control participants on a haptic object recognition task. Participants studied pairs of unfamiliar three-dimensional objects and performed a two-alternative forced-choice identification task, with the learned objects presented both unrotated and rotated 180° about the y-axis. Rotation impaired the recognition accuracy of sighted, but not blind, participants. We propose that, consistent with our hypothesis, haptic view-independence in the early blind reflects their greater experience with haptic object perception.
Affiliation(s)
- Simon Lacey
- Department of Neurology, Emory University, Atlanta, GA, USA
- Careese Stephens
- Department of Neurology, Emory University, Atlanta, GA, USA; Rehabilitation R&D Center of Excellence, Atlanta VAMC, Decatur, GA, USA
- Thomas John
- Department of Neurology, Emory University, Atlanta, GA, USA
- K Sathian
- Department of Neurology, Emory University, Atlanta, GA, USA; Department of Rehabilitation Medicine, Emory University, Atlanta, GA, USA; Department of Psychology, Emory University, Atlanta, GA, USA; Rehabilitation R&D Center of Excellence, Atlanta VAMC, Decatur, GA, USA
Collapse
|
15
Jao RJ, James TW, James KH. Crossmodal enhancement in the LOC for visuohaptic object recognition over development. Neuropsychologia 2015; 77:76-89. [PMID: 26272239 DOI: 10.1016/j.neuropsychologia.2015.08.008]
Abstract
Research has provided strong evidence of multisensory convergence of visual and haptic information within the visual cortex. These studies implement crossmodal matching paradigms to examine how systems use information from different sensory modalities for object recognition. Developmentally, behavioral evidence of visuohaptic crossmodal processing has suggested that communication within sensory systems develops earlier than across systems; nonetheless, it is unknown how the neural mechanisms driving these behavioral effects develop. To address this gap in knowledge, BOLD functional Magnetic Resonance Imaging (fMRI) was measured during delayed match-to-sample tasks that examined intramodal (visual-to-visual, haptic-to-haptic) and crossmodal (visual-to-haptic, haptic-to-visual) novel object recognition in children aged 7-8.5 years and adults. Tasks were further divided into sample encoding and test matching phases to dissociate the relative contributions of each. Results of crossmodal and intramodal object recognition revealed the network of known visuohaptic multisensory substrates, including the lateral occipital complex (LOC) and the intraparietal sulcus (IPS). Critically, both adults and children showed crossmodal enhancement within the LOC, suggesting a sensitivity to changes in sensory modality during recognition. These groups showed similar regions of activation, although children generally exhibited more widespread activity during sample encoding and weaker BOLD signal change during test matching than adults. Results further provided evidence of a bilateral region in the occipitotemporal cortex that was haptic-preferring in both age groups. This region abutted the bimodal LOtv, and was consistent with a medial to lateral organization that transitioned from a visual to haptic bias within the LOC. These findings converge with existing evidence of visuohaptic processing in the LOC in adults, and extend our knowledge of crossmodal processing in adults and children.
Affiliation(s)
- R Joanne Jao
- Cognitive Science Program, Indiana University, Bloomington, USA; Department of Psychological and Brain Sciences, Indiana University, Bloomington, USA
- Thomas W James
- Cognitive Science Program, Indiana University, Bloomington, USA; Department of Psychological and Brain Sciences, Indiana University, Bloomington, USA; Program in Neuroscience, Indiana University, Bloomington, USA
- Karin Harman James
- Cognitive Science Program, Indiana University, Bloomington, USA; Department of Psychological and Brain Sciences, Indiana University, Bloomington, USA; Program in Neuroscience, Indiana University, Bloomington, USA
16
Koijck LA, Toet A, Van Erp JBF. Tactile roughness perception in the presence of olfactory and trigeminal stimulants. PeerJ 2015; 3:e955. [PMID: 26020010 PMCID: PMC4435474 DOI: 10.7717/peerj.955]
Abstract
Previous research has shown that odorants consistently evoke associations with textures and their tactile properties like smoothness and roughness. Also, it has been observed that olfaction can modulate tactile perception. We therefore hypothesized that tactile roughness perception may be biased towards the somatosensory connotation of an ambient odorant. We performed two experiments to test this hypothesis. In the first experiment, we investigated the influence of ambient chemosensory stimuli with different roughness connotations on tactile roughness perception. In addition to a pleasant odor with a connotation of softness (PEA), we also included a trigeminal stimulant with a rough, sharp or prickly connotation (Ethanol). We expected that—compared to a No-odorant control condition—tactile texture perception would be biased towards smoothness in the presence of PEA and towards roughness in the presence of Ethanol. However, our results show no significant interaction between chemosensory stimulation and perceived tactile surface roughness. It could be argued that ambient odors may be less effective in stimulating crossmodal associations, since they are by definition extraneous to the tactile stimuli. In an attempt to optimize the conditions for sensory integration, we therefore performed a second experiment in which the olfactory and tactile stimuli were presented in synchrony and in close spatial proximity. In addition, we included pleasant (Lemon) and unpleasant (Indole) odorants that are known to have the ability to affect tactile perception. We expected that tactile stimuli would be perceived as less rough when simultaneously presented with Lemon or PEA (both associated with softness) than when presented with Ethanol or Indole (odors that can be associated with roughness). Again, we found no significant main effect of chemosensory condition on perceived tactile roughness. We discuss the limitations of this study and we present suggestions for future research.
Affiliation(s)
- Jan B F Van Erp
- TNO, Soesterberg, The Netherlands; Human Media Interaction, University of Twente, Enschede, The Netherlands
17
Braier J, Lattenkamp K, Räthel B, Schering S, Wojatzki M, Weyers B. Haptic 3D surface representation of table-based data for people with visual impairments. ACM Transactions on Accessible Computing 2015. [DOI: 10.1145/2700433]
Abstract
The UN Convention on the Rights of Persons with Disabilities Article 24 states that "States Parties shall ensure inclusive education at all levels of education and lifelong learning." This article focuses on the inclusion of people with visual impairments in learning processes involving complex table-based data. Gaining insight into and understanding of complex data is a highly demanding task for people with visual impairments. Especially in the case of table-based data, the classic approaches of braille-based output devices and printing concepts are limited. Haptic perception requires sequential information processing rather than the parallel processing used by the visual system, which makes it difficult for haptic perception to provide a fast overview of and deeper insight into the data. Nevertheless, neuroscientific research has identified strong dependencies between haptic perception and the cognitive processing of visual input. Based on these findings, we developed a haptic 3D surface representation of classic diagrams and charts, such as bar graphs and pie charts. In a qualitative evaluation study, we identified certain advantages of our relief-type 3D chart approach. Finally, we present an education model for German schools that includes a 3D printing approach to help integrate students with visual impairments.
18
Abstract
Distinct preference for visual number symbols was recently discovered in the human right inferior temporal gyrus (rITG). It remains unclear how this preference emerges, what the contribution of shape biases to its formation is, and whether visual processing underlies it. Here we use congenital blindness as a model for brain development without visual experience. During fMRI, we present blind subjects with shapes encoded using a novel visual-to-music sensory-substitution device (The EyeMusic). Greater activation is observed in the rITG when subjects process symbols as numbers compared with control tasks on the same symbols. Using resting-state fMRI in the blind and sighted, we further show that the areas with preference for numerals and letters exhibit distinct patterns of functional connectivity with quantity and language-processing areas, respectively. Our findings suggest that specificity in the ventral 'visual' stream can emerge independently of sensory modality and visual experience, under the influence of distinct connectivity patterns. The human visual cortex includes areas with preference for various object categories. Here, Abboud et al. demonstrate, using visual-to-music sensory substitution, that the congenitally blind show a similar preference for numerals in the right inferior temporal cortex as sighted individuals, despite having no visual experience.
19
Lacey S, Sathian K. Crossmodal and multisensory interactions between vision and touch. Scholarpedia 2015; 10:7957. [PMID: 26783412 DOI: 10.4249/scholarpedia.7957]
Affiliation(s)
- Simon Lacey
- Departments of Neurology, Emory University, Atlanta, GA, USA
- K Sathian
- Departments of Neurology, Emory University, Atlanta, GA, USA; Rehabilitation Medicine, Emory University, Atlanta, GA, USA; Psychology, Emory University, Atlanta, GA, USA; Rehabilitation R&D Center of Excellence, Atlanta VAMC, Decatur, GA, USA
20
Kagawa T, Narita N, Iwaki S, Kawasaki S, Kamiya K, Minakuchi S. Does shape discrimination by the mouth activate the parietal and occipital lobes? A near-infrared spectroscopy study. PLoS One 2014; 9:e108685. [PMID: 25299397 PMCID: PMC4191970 DOI: 10.1371/journal.pone.0108685]
Abstract
A cross-modal association between somatosensory tactile sensation and parietal and occipital activities during Braille reading was initially discovered in tests with blind subjects, with sighted and blindfolded healthy subjects used as controls. However, the neural background of oral stereognosis remains unclear. In the present study, we investigated whether the parietal and occipital cortices are activated during shape discrimination by the mouth using functional near-infrared spectroscopy (fNIRS). Following presentation of the test piece shape, a sham discrimination trial without the test pieces induced posterior parietal lobe (BA7), extrastriate cortex (BA18, BA19), and striate cortex (BA17) activation as compared with the rest session, while shape discrimination of the test pieces markedly activated those areas as compared with the rest session. Furthermore, shape discrimination of the test pieces specifically activated the posterior parietal cortex (precuneus/BA7), extrastriate cortex (BA18, 19), and striate cortex (BA17), as compared with sham sessions without a test piece. We concluded that oral tactile sensation is recognized through tactile/visual cross-modal substrates in the parietal and occipital cortices during shape discrimination by the mouth.
Affiliation(s)
- Tomonori Kagawa
- Gerodontology and Oral Rehabilitation, Department of Gerontology and Gerodontology, Graduate School of Medical and Dental Sciences, Tokyo Medical and Dental University, Tokyo, Japan
- Noriyuki Narita
- Department of Removable Prosthodontics, Nihon University School of Dentistry at Matsudo, Chiba, Japan
- Sunao Iwaki
- Cognition and Action Research Group, Human Technology Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), AIST Tsukuba Central 6, Ibaraki, Japan
- Shingo Kawasaki
- Application Development Office, Hitachi Medical Corporation, Chiba, Japan
- Kazunobu Kamiya
- Department of Removable Prosthodontics, Nihon University School of Dentistry at Matsudo, Chiba, Japan
- Shunsuke Minakuchi
- Gerodontology and Oral Rehabilitation, Department of Gerontology and Gerodontology, Graduate School of Medical and Dental Sciences, Tokyo Medical and Dental University, Tokyo, Japan
21
Lacey S, Sathian K. Visuo-haptic multisensory object recognition, categorization, and representation. Front Psychol 2014; 5:730. [PMID: 25101014 PMCID: PMC4102085 DOI: 10.3389/fpsyg.2014.00730]
Abstract
Visual and haptic unisensory object processing show many similarities in terms of categorization, recognition, and representation. In this review, we discuss how these similarities contribute to multisensory object processing. In particular, we show that similar unisensory visual and haptic representations lead to a shared multisensory representation underlying both cross-modal object recognition and view-independence. This shared representation suggests a common neural substrate and we review several candidate brain regions, previously thought to be specialized for aspects of visual processing, that are now known also to be involved in analogous haptic tasks. Finally, we lay out the evidence for a model of multisensory object recognition in which top-down and bottom-up pathways to the object-selective lateral occipital complex are modulated by object familiarity and individual differences in object and spatial imagery.
Affiliation(s)
- Simon Lacey
- Department of Neurology, Emory University School of Medicine, Atlanta, GA, USA
- K Sathian
- Department of Neurology, Emory University School of Medicine, Atlanta, GA, USA; Department of Rehabilitation Medicine, Emory University School of Medicine, Atlanta, GA, USA; Department of Psychology, Emory University School of Medicine, Atlanta, GA, USA; Rehabilitation Research and Development Center of Excellence, Atlanta Veterans Affairs Medical Center, Decatur, GA, USA
22
Podrebarac SK, Goodale MA, Snow JC. Are visual texture-selective areas recruited during haptic texture discrimination? Neuroimage 2014; 94:129-137. [DOI: 10.1016/j.neuroimage.2014.03.013]
23
Lacey S, Stilla R, Sreenivasan K, Deshpande G, Sathian K. Spatial imagery in haptic shape perception. Neuropsychologia 2014; 60:144-58. [PMID: 25017050 DOI: 10.1016/j.neuropsychologia.2014.05.008]
Abstract
We have proposed that haptic activation of the shape-selective lateral occipital complex (LOC) reflects a model of multisensory object representation in which the role of visual imagery is modulated by object familiarity. Supporting this, a previous functional magnetic resonance imaging (fMRI) study from our laboratory used inter-task correlations of blood oxygenation level-dependent (BOLD) signal magnitude and effective connectivity (EC) patterns based on the BOLD signals to show that the neural processes underlying visual object imagery (objIMG) are more similar to those mediating haptic perception of familiar (fHS) than unfamiliar (uHS) shapes. Here we employed fMRI to test a further hypothesis derived from our model, that spatial imagery (spIMG) would evoke activation and effective connectivity patterns more related to uHS than fHS. We found that few of the regions conjointly activated by spIMG and either fHS or uHS showed inter-task correlations of BOLD signal magnitudes, with parietal foci featuring in both sets of correlations. This may indicate some involvement of spIMG in HS regardless of object familiarity, contrary to our hypothesis, although we cannot rule out alternative explanations for the commonalities between the networks, such as generic imagery or spatial processes. EC analyses, based on inferred neuronal time series obtained by deconvolution of the hemodynamic response function from the measured BOLD time series, showed that spIMG shared more common paths with uHS than fHS. Re-analysis of our previous data, using the same EC methods as those used here, showed that, by contrast, objIMG shared more common paths with fHS than uHS. Thus, although our model requires some refinement, its basic architecture is supported: a stronger relationship between spIMG and uHS compared to fHS, and a stronger relationship between objIMG and fHS compared to uHS.
Affiliation(s)
- Simon Lacey
- Department of Neurology, Emory University, Atlanta, GA, USA
- Randall Stilla
- Department of Neurology, Emory University, Atlanta, GA, USA
- Karthik Sreenivasan
- AU MRI Research Center, Department of Electrical & Computer Engineering, Auburn University, Auburn, AL, USA
- Gopikrishna Deshpande
- AU MRI Research Center, Department of Electrical & Computer Engineering, Auburn University, Auburn, AL, USA; Department of Psychology, Auburn University, Auburn, AL, USA
- K Sathian
- Department of Neurology, Emory University, Atlanta, GA, USA; Department of Rehabilitation Medicine, Emory University, Atlanta, GA, USA; Department of Psychology, Emory University, Atlanta, GA, USA; Rehabilitation R&D Center of Excellence, Atlanta VAMC, Decatur, GA, USA
24
Jao RJ, James TW, Harman James K. Multisensory convergence of visual and haptic object preference across development. Neuropsychologia 2014; 56:381-92. [PMID: 24560914 PMCID: PMC4020146 DOI: 10.1016/j.neuropsychologia.2014.02.009]
Abstract
Visuohaptic inputs offer redundant and complementary information regarding an object's geometrical structure. The integration of these inputs facilitates object recognition in adults. While the ability to recognize objects in the environment both visually and haptically develops early on, the development of the neural mechanisms for integrating visual and haptic object shape information remains unknown. In the present study, we used functional Magnetic Resonance Imaging (fMRI) in three groups of participants, 4 to 5.5 year olds, 7 to 8.5 year olds, and adults. Participants were tested in a block design involving visual exploration of two-dimensional images of common objects and real textures, and haptic exploration of their three-dimensional counterparts. As in previous studies, object preference was defined as a greater BOLD response for objects than textures. The analyses specifically target two sites of known visuohaptic convergence in adults: the lateral occipital tactile-visual region (LOtv) and intraparietal sulcus (IPS). Results indicated that the LOtv is involved in visuohaptic object recognition early on. More importantly, object preference in the LOtv became increasingly visually dominant with development. Despite previous reports that the lateral occipital complex (LOC) is adult-like by 8 years, these findings indicate that at least part of the LOC is not. Whole-brain maps showed overlap between adults and both groups of children in the LOC. However, the overlap did not build incrementally from the younger to the older group, suggesting that visuohaptic object preference does not develop in an additive manner. Taken together, the results show that the development of neural substrates for visuohaptic recognition is protracted compared to substrates that are primarily visual or haptic.
Affiliation(s)
- R Joanne Jao
- Cognitive Science Program, Indiana University, Bloomington, IN, United States; Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States
- Thomas W James
- Cognitive Science Program, Indiana University, Bloomington, IN, United States; Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States; Program in Neuroscience, Indiana University, Bloomington, IN, United States
- Karin Harman James
- Cognitive Science Program, Indiana University, Bloomington, IN, United States; Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, United States; Program in Neuroscience, Indiana University, Bloomington, IN, United States
25
Maidenbaum S, Abboud S, Amedi A. Sensory substitution: closing the gap between basic research and widespread practical visual rehabilitation. Neurosci Biobehav Rev 2013; 41:3-15. [PMID: 24275274 DOI: 10.1016/j.neubiorev.2013.11.007]
Abstract
Sensory substitution devices (SSDs) have come a long way since they were first developed for visual rehabilitation. They have produced exciting experimental results and have furthered our understanding of the human brain. Unfortunately, they are still not used for practical visual rehabilitation, and are currently reserved primarily for experiments in controlled settings. Over the past decade, our understanding of the neural mechanisms behind visual restoration has changed as a result of converging evidence, much of which was gathered with SSDs. This evidence suggests that the brain is not a pure sensory machine but rather a highly flexible task machine, i.e., brain regions can maintain or regain their function in vision even with input from other senses. This complements a recent set of more promising behavioral achievements using SSDs and new promising technologies and tools. All these changes strongly suggest that the time has come to revive the focus on practical visual rehabilitation with SSDs, and we chart several key steps in this direction, such as training protocols and self-training tools.
Affiliation(s)
- Shachar Maidenbaum
- Department of Medical Neurobiology, The Institute for Medical Research Israel-Canada, Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- Sami Abboud
- Department of Medical Neurobiology, The Institute for Medical Research Israel-Canada, Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- Amir Amedi
- Department of Medical Neurobiology, The Institute for Medical Research Israel-Canada, Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel; The Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91220, Israel
26
Gleeson BT, Provancher WR. Mental rotation of tactile stimuli: using directional haptic cues in mobile devices. IEEE Transactions on Haptics 2013; 6:330-339. [PMID: 24808329 DOI: 10.1109/toh.2013.5]
Abstract
Haptic interfaces have the potential to enrich users' interactions with mobile devices and convey information without burdening the user's visual or auditory attention. Haptic stimuli with directional content, for example, navigational cues, may be difficult to use in handheld applications; the user's hand, where the cues are delivered, may not be aligned with the world, where the cues are to be interpreted. In such a case, the user would be required to mentally transform the stimuli between different reference frames. We examine the mental rotation of directional haptic stimuli in three experiments, investigating: 1) users' intuitive interpretation of rotated stimuli, 2) mental rotation of haptic stimuli about a single axis, and 3) rotation about multiple axes and the effects of specific hand poses and joint rotations. We conclude that directional haptic stimuli are suitable for use in mobile applications, although users do not naturally interpret rotated stimuli in any one universal way. We find evidence of cognitive processes involving the rotation of analog, spatial representations and discuss how our results fit into the larger body of mental rotation research. For small angles (e.g., less than 40 degrees), these mental rotations come at little cost, but rotations with larger misalignment angles impact user performance. When considering the design of a handheld haptic device, our results indicate that hand pose must be carefully considered, as certain poses increase the difficulty of stimulus interpretation. Generally, all tested joint rotations impact task difficulty, but finger flexion and wrist rotation interact to greatly increase the cost of stimulus interpretation; such hand poses should be avoided when designing a haptic interface.
27
Neural pathways conveying novisual information to the visual cortex. Neural Plast 2013; 2013:864920. [PMID: 23840972 PMCID: PMC3690246 DOI: 10.1155/2013/864920]
Abstract
The visual cortex has traditionally been considered a stimulus-driven, unimodal system with a hierarchical organization. However, recent animal and human studies have shown that the visual cortex responds to non-visual stimuli, especially in individuals with congenital visual deprivation, indicating the supramodal nature of functional representation in the visual cortex. To understand the neural substrates of cross-modal processing of non-visual signals in the visual cortex, we first establish the supramodal nature of the visual cortex. We then review how non-visual signals reach the visual cortex, and discuss whether these non-visual pathways are reshaped by early visual deprivation. Finally, the open question of the nature (stimulus-driven or top-down) of non-visual signals is also discussed.
28
Renier L, De Volder AG, Rauschecker JP. Cortical plasticity and preserved function in early blindness. Neurosci Biobehav Rev 2013; 41:53-63. [PMID: 23453908 DOI: 10.1016/j.neubiorev.2013.01.025]
Abstract
The "neural Darwinism" theory predicts that when one sensory modality is lacking, as in congenital blindness, the target structures are taken over by the afferent inputs from other senses that will promote and control their functional maturation (Edelman, 1993). This view receives support from both cross-modal plasticity experiments in animal models and functional imaging studies in man, which are presented here.
Affiliation(s)
- Laurent Renier
- Université catholique de Louvain, Institute of Neuroscience (IoNS), Avenue Hippocrate 54, UCL-B1.5409, B-1200 Brussels, Belgium
- Anne G De Volder
- Université catholique de Louvain, Institute of Neuroscience (IoNS), Avenue Hippocrate 54, UCL-B1.5409, B-1200 Brussels, Belgium
- Josef P Rauschecker
- Laboratory for Integrative Neuroscience and Cognition, Department of Neuroscience, Georgetown University Medical Center, 3970 Reservoir Road NW, Washington, DC 20007, USA
29
Costagli M, Ueno K, Sun P, Gardner JL, Wan X, Ricciardi E, Pietrini P, Tanaka K, Cheng K. Functional signalers of changes in visual stimuli: cortical responses to increments and decrements in motion coherence. Cereb Cortex 2014; 24:110-8. [PMID: 23010749 DOI: 10.1093/cercor/bhs294]
Abstract
How does our brain detect changes in a natural scene? While changes by increments of specific visual attributes, such as contrast or motion coherence, can be signaled by an increase in neuronal activity in early visual areas, like the primary visual cortex (V1) or the human middle temporal complex (hMT+), respectively, the mechanisms for signaling changes resulting from decrements in a stimulus attribute are largely unknown. We have discovered opposing patterns of cortical responses to changes in motion coherence: unlike areas hMT+, V3A, and the parieto-occipital complex (V6+), which respond to changes in the level of motion coherence monotonically, human areas V4 (hV4), V3B, and ventral occipital always respond positively to both transient increments and decrements. This pattern of responding always positively to stimulus changes can emerge in the presence of either coherence-selective neuron populations, or neurons that are not tuned to particular coherences but adapt to a particular coherence level in a stimulus-selective manner. Our findings provide evidence that these areas possess physiological properties suited for signaling increments and decrements in a stimulus and may form part of a cortical vigilance system for detecting salient changes in the environment.
Affiliation(s)
- Mauro Costagli
- Laboratory for Cognitive Brain Mapping, RIKEN Brain Science Institute, Japan
30
Toussaint L, Caissie AF, Blandin Y. Does mental rotation ability depend on sensory-specific experience? Journal of Cognitive Psychology 2012. [DOI: 10.1080/20445911.2011.641529]
31
The neural mechanisms of reliability weighted integration of shape information from vision and touch. Neuroimage 2012; 60:1063-72. [DOI: 10.1016/j.neuroimage.2011.09.072]
32
33
Kim S, Stevenson RA, James TW. Visuo-haptic neuronal convergence demonstrated with an inversely effective pattern of BOLD activation. J Cogn Neurosci 2011; 24:830-42. [PMID: 22185495 DOI: 10.1162/jocn_a_00176]
Abstract
We investigated the neural substrates involved in visuo-haptic neuronal convergence using an additive-factors design in combination with fMRI. Stimuli were explored under three sensory modality conditions: viewing the object through a mirror without touching (V), touching the object with eyes closed (H), or simultaneously viewing and touching the object (VH). This modality factor was crossed with a task difficulty factor, which had two levels. On the basis of an idea similar to the principle of inverse effectiveness, we predicted that increasing difficulty would increase the relative level of multisensory gain in brain regions where visual and haptic sensory inputs converged. An ROI analysis focused on the lateral occipital tactile-visual area found evidence of inverse effectiveness in the left lateral occipital tactile-visual area, but not in the right. A whole-brain analysis also found evidence for the same pattern in the anterior aspect of the intraparietal sulcus, the premotor cortex, and the posterior insula, all in the left hemisphere. In conclusion, this study is the first to demonstrate visuo-haptic neuronal convergence based on an inversely effective pattern of brain activation.
Affiliation(s)
- Sunah Kim
- 360 Minor Hall, University of California, Berkeley, Berkeley, CA 94720, USA.
34

35

36
Abstract
Shape is an object property that inherently exists in vision and touch, and is processed in part by the lateral occipital complex (LOC). Recent studies have shown that shape can be artificially coded by sound using sensory substitution algorithms and learned with behavioral training. This finding offers a unique opportunity to test intermodal generalizability of the LOC beyond the sensory modalities in which shape is naturally perceived. Therefore, we investigated the role of the LOC in processing of shape by examining neural activity associated with learning tactile-shape-coded auditory information. Nine blindfolded sighted people learned the tactile-auditory relationship between raised abstract shapes and their corresponding shape-coded sounds over 5 d of training. Using functional magnetic resonance imaging, subjects were scanned before and after training during a task in which they first listened to a shape-coded sound transformation, then touched an embossed shape, and responded whether or not the tactile stimulus matched the auditory stimulus in terms of shape. We found that behavioral scores improved after training and that the LOC was commonly activated during the auditory and tactile conditions both before and after training. However, no significant training-related change was observed in magnitude or size of LOC activity; rather, the auditory cortex and LOC showed strengthened functional connectivity after training. These findings suggest that the LOC is available to different sensory systems for shape processing and that auditory-tactile sensory substitution training leads to neural changes allowing more direct or efficient access to this site by the auditory system.
37
Hach S, Schütz-Bosbach S. Touching base: The effect of participant and stimulus modulation factors on a haptic line bisection task. Laterality 2011; 17:180-201. PMID: 22385141. DOI: 10.1080/1357650x.2010.551128.
Abstract
Acquiring information about our environment through touch is vital in everyday life. Yet very little literature exists about factors that may influence haptic or tactile processing. Recent neuroimaging studies have reported haptic laterality effects that parallel those reported in the visual literature. With the use of a haptic variant of the classical line bisection task, the present study aimed to determine the presence of laterality effects on a behavioural level. Specifically, three handedness groups, including strong dextrals, strong sinistrals, and the to-date largely neglected group of mixed-handers, were examined in their ability to accurately bisect stimuli constructed from corrugated board strips of various lengths. Stimulus factors known to play a role in visuospatial perception, including stimulus location, the hand used for bisection, and direction of exploration, were systematically varied through pseudo-randomisation. Similar to the visual domain, stimulus location and length as well as participants' handedness and the hand used for bisection exerted a significant influence on participants' estimate of the centre of haptically explored stimuli. However, these effects differed qualitatively from those described for the visual domain, and the factor direction of exploration did not exert any significant effect. This indicates that laterality effects reported on a neural level are sufficiently pronounced to result in measurable behavioural effects. The results, first, add to laterality effects reported for the visual and auditory domains; second, are in line with supramodal spatial processing; and third, provide additional evidence for a conceptualisation of pseudoneglect and neglect as signs of hemispheric attentional asymmetries.
Affiliation(s)
- Sylvia Hach
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany.
38
Haggard P, Giovagnoli G. Spatial patterns in tactile perception: is there a tactile field? Acta Psychol (Amst) 2011; 137:65-75. PMID: 21470584. DOI: 10.1016/j.actpsy.2011.03.001.
Abstract
Previous studies of tactile spatial perception focussed either on a single point of stimulation, on local patterns within a single skin region such as the fingertip, on tactile motion, or on active touch. It remains unclear whether we should speak of a tactile field, analogous to the visual field, and supporting spatial relations between stimulus locations. Here we investigate this question by studying perception of large-scale tactile spatial patterns on the hand, arm and back. Experiment 1 investigated the relation between perception of tactile patterns and the identification of subsets of those patterns. The results suggest that perception of tactile spatial patterns is based on representing the spatial relations between locations of individual stimuli. Experiment 2 investigated the spatial and temporal organising principles underlying these relations. Experiment 3 showed that tactile pattern perception makes reference to structural representations of the body, such as body parts separated by joints. Experiment 4 found that precision of pattern perception is poorer for tactile patterns that extend across the midline, compared to unilateral patterns. Overall, the results suggest that the human sense of touch involves a tactile field, analogous to the visual field. The tactile field supports computation of spatial relations between individual stimulus locations, and thus underlies tactile pattern perception.
Affiliation(s)
- Patrick Haggard
- Institute of Cognitive Neuroscience, University College London, UK.
39
Burton H, Sinclair RJ, Dixit S. Working memory for vibrotactile frequencies: comparison of cortical activity in blind and sighted individuals. Hum Brain Mapp 2011; 31:1686-701. PMID: 20162595. DOI: 10.1002/hbm.20966.
Abstract
In blind individuals, occipital cortex has shown robust activation to nonvisual stimuli in many prior functional neuroimaging studies. The cognitive processes represented by these activations are not fully determined, although a verbal recognition memory role has been demonstrated. In congenitally blind and sighted participants (10 per group), we contrasted responses to a vibrotactile one-back frequency retention task with 5-s delays and a vibrotactile amplitude-change task; both tasks involved the same vibration parameters. The one-back paradigm required continuous updating of working memory (WM). Findings in both groups confirmed roles in WM for right hemisphere dorsolateral prefrontal cortex (DLPFC) and dorsal/ventral attention components of posterior parietal cortex. Negative findings in bilateral ventrolateral prefrontal cortex suggested task performance without subvocalization. In bilateral occipital cortex, the blind showed comparable positive responses to both tasks, whereas WM evoked large negative responses in the sighted. Greater utilization of attention resources in the blind was suggested as causing larger responses in dorsal and ventral attention systems and right DLPFC, and persistent responses across delays between trials in somatosensory and premotor cortex. In the sighted, responses in somatosensory and premotor areas showed iterated peaks matched to stimulation trial intervals. The findings in occipital cortex of the blind suggest that tactile activations do not represent cognitive operations for a nonverbal WM task. However, these data suggest a role in sensory processing of tactile information in the blind that parallels a similar contribution for visual stimuli in occipital cortex of the sighted.
Affiliation(s)
- Harold Burton
- Department of Anatomy and Neurobiology, Washington University School of Medicine, St Louis, Missouri 63110, USA.
40

41
Renier LA, Anurova I, De Volder AG, Carlson S, VanMeter J, Rauschecker JP. Preserved functional specialization for spatial processing in the middle occipital gyrus of the early blind. Neuron 2010; 68:138-48. PMID: 20920797. DOI: 10.1016/j.neuron.2010.09.021.
Abstract
The occipital cortex (OC) of early-blind humans is activated during various nonvisual perceptual and cognitive tasks, but little is known about its modular organization. Using functional MRI we tested whether processing of auditory versus tactile and spatial versus nonspatial information was dissociated in the OC of the early blind. No modality-specific OC activation was observed. However, the right middle occipital gyrus (MOG) showed a preference for spatial over nonspatial processing of both auditory and tactile stimuli. Furthermore, MOG activity was correlated with accuracy of individual sound localization performance. In sighted controls, most of extrastriate OC, including the MOG, was deactivated during auditory and tactile conditions, but the right MOG was more activated during spatial than nonspatial visual tasks. Thus, although the sensory modalities driving the neurons in the reorganized OC of blind individuals are altered, the functional specialization of extrastriate cortex is retained regardless of visual experience.
Affiliation(s)
- Laurent A Renier
- Department of Physiology and Biophysics, Georgetown University Medical Center, Washington, DC 20007, USA
42
Lacey S, Hall J, Sathian K. Are surface properties integrated into visuohaptic object representations? Eur J Neurosci 2010; 31:1882-8. PMID: 20584193. DOI: 10.1111/j.1460-9568.2010.07204.x.
Abstract
Object recognition studies have almost exclusively involved vision, focusing on shape rather than surface properties such as color. Visual object representations are thought to integrate shape and color information because changing the color of studied objects impairs their subsequent recognition. However, little is known about integration of surface properties into visuohaptic multisensory representations. Here, participants studied objects with distinct patterns of surface properties (color in Experiment 1, texture in Experiments 2 and 3) and had to discriminate between object shapes when color or texture schemes were altered in within-modal (visual and haptic) and cross-modal (visual study followed by haptic test and vice versa) conditions. In Experiment 1, color changes impaired within-modal visual recognition but had no effect on cross-modal recognition, suggesting that the multisensory representation was not influenced by modality-specific surface properties. In Experiment 2, texture changes impaired recognition in all conditions, suggesting that both unisensory and multisensory representations integrated modality-independent surface properties. However, the cross-modal impairment might have reflected either the texture change or a failure to form the multisensory representation. Experiment 3 attempted to distinguish between these possibilities by combining changes in texture with changes in orientation, taking advantage of the known view-independence of the multisensory representation, but the results were not conclusive owing to the overwhelming effect of texture change. The simplest account is that the multisensory representation integrates shape and modality-independent surface properties. However, more work is required to investigate this and the conditions under which multisensory integration of structural and surface properties occurs.
Affiliation(s)
- Simon Lacey
- Department of Neurology, Emory University, WMB-6000, 101 Woodruff Circle, Atlanta, GA 30322, USA.
43
Savini N, Babiloni C, Brunetti M, Caulo M, Del Gratta C, Perrucci MG, Rossini PM, Romani GL, Ferretti A. Passive tactile recognition of geometrical shape in humans: An fMRI study. Brain Res Bull 2010; 83:223-31. DOI: 10.1016/j.brainresbull.2010.08.001.
Abstract
Tactile shape discrimination involves frontal as well as somatosensory cortex (Palva et al., 2005 [48]), but it is unclear whether this frontal activity is related to exploratory concomitants. In this study, we investigated topographical details of prefrontal, premotor, and parietal areas during passive tactile recognition of 2D geometrical shapes in conditions avoiding exploratory movements. Functional magnetic resonance imaging (fMRI) was performed while the same wooden 2D geometrical shapes were blindly pressed on subjects' passive right palm in three conditions. In the RAW condition, shapes were pressed while subjects were asked to attend to the stimuli but were not trained to recognize them. After a brief training, in the SHAPE condition subjects were asked to covertly recognize shapes. In the RECOGNITION condition, they were asked to overtly recognize shapes, using response buttons with their opposite hand. Results showed that somatosensory cortex, including contralateral SII, contralateral SI, and left insula, was active in all conditions, confirming its importance in processing tactile shapes. In the RAW vs. SHAPE contrast, bilateral posterior parietal, insular, premotor, prefrontal, and (left) Broca's areas were more active in the latter. In the RECOGNITION condition, activation of (left) Broca's area correlated with correct responses. These results suggest that, even without exploratory movements, passive recognition of tactile geometrical shapes involves prefrontal and premotor as well as somatosensory regions. In this framework, Broca's area might be involved in successful selection and/or execution of the correct responses.
Affiliation(s)
- Nicoletta Savini
- Department of Neuroscience and Imaging, University "G. d' Annunzio" of Chieti, Italy
44
Beauchamp MS, Pasalar S, Ro T. Neural substrates of reliability-weighted visual-tactile multisensory integration. Front Syst Neurosci 2010; 4:25. PMID: 20631844. PMCID: PMC2903191. DOI: 10.3389/fnsys.2010.00025.
Abstract
As sensory systems deteriorate in aging or disease, the brain must relearn the appropriate weights to assign each modality during multisensory integration. Using blood-oxygen level dependent functional magnetic resonance imaging of human subjects, we tested a model for the neural mechanisms of sensory weighting, termed “weighted connections.” This model holds that the connection weights between early and late areas vary depending on the reliability of the modality, independent of the level of early sensory cortex activity. When subjects detected viewed and felt touches to the hand, a network of brain areas was active, including visual areas in lateral occipital cortex, somatosensory areas in inferior parietal lobe, and multisensory areas in the intraparietal sulcus (IPS). In agreement with the weighted connection model, the connection weight measured with structural equation modeling between somatosensory cortex and IPS increased for somatosensory-reliable stimuli, and the connection weight between visual cortex and IPS increased for visual-reliable stimuli. This double dissociation of connection strengths was similar to the pattern of behavioral responses during incongruent multisensory stimulation, suggesting that weighted connections may be a neural mechanism for behavioral reliability weighting.
Affiliation(s)
- Michael S Beauchamp
- Department of Neurobiology and Anatomy, University of Texas Health Science Center at Houston, Houston, TX, USA
45
Multisensory integration of sounds and vibrotactile stimuli in processing streams for "what" and "where". J Neurosci 2009; 29:10950-60. PMID: 19726653. DOI: 10.1523/jneurosci.0910-09.2009.
Abstract
The segregation between cortical pathways for the identification and localization of objects is thought of as a general organizational principle in the brain. Yet, little is known about the unimodal versus multimodal nature of these processing streams. The main purpose of the present study was to test whether the auditory and tactile dual pathways converged into specialized multisensory brain areas. We used functional magnetic resonance imaging (fMRI) to compare directly in the same subjects the brain activation related to localization and identification of comparable auditory and vibrotactile stimuli. Results indicate that the right inferior frontal gyrus (IFG) and both left and right insula were more activated during identification conditions than during localization in both touch and audition. The reverse dissociation was found for the left and right inferior parietal lobules (IPL), the left superior parietal lobule (SPL) and the right precuneus-SPL, which were all more activated during localization conditions in the two modalities. We propose that specialized areas in the right IFG and the left and right insula are multisensory operators for the processing of stimulus identity whereas parts of the left and right IPL and SPL are specialized for the processing of spatial attributes independently of sensory modality.
46
Tal N, Amedi A. Multisensory visual-tactile object related network in humans: insights gained using a novel crossmodal adaptation approach. Exp Brain Res 2009; 198:165-82. PMID: 19652959. PMCID: PMC2733194. DOI: 10.1007/s00221-009-1949-4.
Abstract
Neuroimaging techniques have provided ample evidence for multisensory integration in humans. However, it is not clear whether this integration occurs at the neuronal level or whether it reflects areal convergence without such integration. To examine this issue as regards visuo-tactile object integration we used the repetition suppression effect, also known as the fMRI-based adaptation paradigm (fMR-A). Under some assumptions, fMR-A can tag specific neuronal populations within an area and investigate their characteristics. This technique has been used extensively in unisensory studies. Here we applied it for the first time to study multisensory integration and identified a network of occipital (LOtv and calcarine sulcus), parietal (aIPS), and prefrontal (precentral sulcus and the insula) areas, all showing a clear crossmodal repetition suppression effect. These results provide a crucial first insight into the neuronal basis of visuo-haptic integration of objects in humans and highlight the power of using fMR-A to study multisensory integration with non-invasive neuroimaging techniques.
Affiliation(s)
- Noa Tal
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Hebrew University, Hadassah Medical School, 91220 Jerusalem, Israel
47
Konkle T, Wang Q, Hayward V, Moore CI. Motion aftereffects transfer between touch and vision. Curr Biol 2009; 19:745-50. PMID: 19361996. PMCID: PMC3398123. DOI: 10.1016/j.cub.2009.03.035.
Abstract
Current views on multisensory motion integration assume separate substrates where visual motion perceptually dominates tactile motion [1, 2]. However, recent neuroimaging findings demonstrate strong activation of visual motion processing areas by tactile stimuli [3-6], implying a potentially bidirectional relationship. To test the relationship between visual and tactile motion processing, we examined the transfer of motion aftereffects. In the well-known visual motion aftereffect, adapting to visual motion in one direction causes a subsequently presented stationary stimulus to be perceived as moving in the opposite direction [7, 8]. The existence of motion aftereffects in the tactile domain was debated [9-11], though robust tactile motion aftereffects have recently been demonstrated [12, 13]. Using a motion adaptation paradigm, we found that repeated exposure to visual motion in a given direction produced a tactile motion aftereffect, the illusion of motion in the opposite direction across the finger pad. We also observed that repeated exposure to tactile motion induces a visual motion aftereffect, biasing the perceived direction of counterphase gratings. These crossmodal aftereffects, operating both from vision to touch and from touch to vision, present strong behavioral evidence that the processing of visual and tactile motion relies on shared representations that dynamically impact modality-specific perception.
Affiliation(s)
- Talia Konkle
- McGovern Institute for Brain Research and Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, 77 Massachusetts Avenue, 46-2171, Cambridge, MA 02139, USA
- Qi Wang
- Department of Biomedical Engineering, Georgia Institute of Technology, 313 Ferst Drive, Atlanta, GA 30332-0535, USA
- Vincent Hayward
- Institute des Systemes Intelligents et de Robotique, Universite Pierre et Marie Curie, 4 place Jussieu, 75252 Paris, France
- Christopher I. Moore
- McGovern Institute for Brain Research and Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, 77 Massachusetts Avenue, 46-2171, Cambridge, MA 02139, USA
48
Carli G, Manzoni D, Santarcangelo EL. Hypnotizability-related integration of perception and action. Cogn Neuropsychol 2009; 25:1065-76. PMID: 18608323. DOI: 10.1080/02643290801913712.
Abstract
Hypnotizability is a cognitive trait able to modulate many behavioural/physiological processes and associated with peculiar functional characteristics of the frontal executive system. This review summarizes experimental results on hypnotizability-related differences in sensorimotor integration at a reflex and an integrated level (postural control) and suggests possible interpretations based on morpho-functional considerations. In particular, hypnotizability-related differences in spinal motoneurone excitability are described, and the role of attention and imagery in maintaining a stable upright stance when sensory information is reduced or altered and when attention is absorbed in cognitive tasks is discussed as a function of hypnotic susceptibility. The projections from prefrontal cortex to spinal motoneurones and the balance between the activation of the right and left cortical hemispheres are considered responsible for the hypnotizability-related modulation of reflex responses, while the differences in postural control between subjects with high (highs) and low (lows) hypnotic susceptibility are considered a possible consequence of the activity of the locus coeruleus, which is also involved in attention, and of the cerebellum, which might be responsible for different internal models of postural control. We suggest a highly pervasive role of hypnotic susceptibility in human behaviour through the modulation of the integration of perception and action, which could be relevant for neurorehabilitative treatments and for adaptation to special environments.
49
Lacey S, Tal N, Amedi A, Sathian K. A putative model of multisensory object representation. Brain Topogr 2009; 21:269-74. PMID: 19330441. PMCID: PMC3156680. DOI: 10.1007/s10548-009-0087-4.
Abstract
This review surveys the recent literature on visuo-haptic convergence in the perception of object form, with particular reference to the lateral occipital complex (LOC) and the intraparietal sulcus (IPS) and discusses how visual imagery or multisensory representations might underlie this convergence. Drawing on a recent distinction between object- and spatially-based visual imagery, we propose a putative model in which LOtv, a subregion of LOC, contains a modality-independent representation of geometric shape that can be accessed either bottom-up from direct sensory inputs or top-down from frontoparietal regions. We suggest that such access is modulated by object familiarity: spatial imagery may be more important for unfamiliar objects and involve IPS foci in facilitating somatosensory inputs to the LOC; by contrast, object imagery may be more critical for familiar objects, being reflected in prefrontal drive to the LOC.
Affiliation(s)
- Simon Lacey
- Department of Neurology, Emory University, Atlanta, GA, USA
- Noa Tal
- Physiology Department, Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- Amir Amedi
- Physiology Department, Faculty of Medicine, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- Cognitive Science Program, The Hebrew University of Jerusalem, Jerusalem 91220, Israel
- K. Sathian
- Department of Neurology, Emory University, Atlanta, GA, USA
- Department of Rehabilitation Medicine, Emory University, Atlanta, GA, USA
- Department of Psychology, Emory University, Atlanta, GA, USA
- Rehabilitation R&D Center of Excellence, Atlanta VAMC, Decatur, GA, USA
50
Stilla R, Sathian K. Selective visuo-haptic processing of shape and texture. Hum Brain Mapp 2008; 29:1123-38. PMID: 17924535. DOI: 10.1002/hbm.20456.
Abstract
Previous functional neuroimaging studies have described shape-selectivity for haptic stimuli in many cerebral cortical regions, of which some are also visually shape-selective. However, the literature is equivocal on the existence of haptic or visuo-haptic texture-selectivity. We report here on a human functional magnetic resonance imaging (fMRI) study in which shape and texture perception were contrasted using haptic stimuli presented to the right hand, and visual stimuli presented centrally. Bilateral selectivity for shape, with overlap between modalities, was found in a dorsal set of parietal areas: the postcentral sulcus and anterior, posterior and ventral parts of the intraparietal sulcus (IPS); as well as ventrally in the lateral occipital complex. The magnitude of visually- and haptically-evoked activity was significantly correlated across subjects in the left posterior IPS and right lateral occipital complex, suggesting that these areas specifically house representations of object shape. Haptic shape-selectivity was also found in the left postcentral gyrus, the left lingual gyrus, and a number of frontal cortical sites. Haptic texture-selectivity was found in ventral somatosensory areas: the parietal operculum and posterior insula bilaterally, as well as in the right medial occipital cortex, overlapping with a medial occipital cortical region, which was texture-selective for visual stimuli. The present report corroborates and elaborates previous suggestions of specialized visuo-haptic processing of texture and shape.
Affiliation(s)
- Randall Stilla
- Department of Neurology, Emory University, Atlanta, Georgia 30322, USA