1. Kusnir F, Pesin S, Landau AN. Hello from the other side: Robust contralateral interference in tactile detection. Atten Percept Psychophys 2024; 86:295-311. PMID: 37872432; PMCID: PMC10769913; DOI: 10.3758/s13414-023-02801-6.
Abstract
Touch is unique among the sensory modalities in that our tactile receptors are spread across the body surface and continuously receive different inputs at the same time. These inputs vary in type, properties, relevance according to current goals, and, of course, location on the body. Sometimes, they must be integrated, and other times set apart and distinguished. Here, we investigate how simultaneous stimulation to different body sites affects tactile cognition. Specifically, we characterized the impact of irrelevant tactile sensations on tactile change detection. To this end, we embedded detection targets amidst ongoing performance, akin to the conditions encountered in everyday life, where we are constantly confronted with new events within ongoing stimuli. In the set of experiments presented here, participants detected a brief intensity change (0.04 s) within an ongoing vibrotactile stimulus (1.6 s) that was always presented in a constantly attended location. The intensity change (i.e., the detection target) varied parametrically, from hardly detectable to easily detectable. In half of the trials, irrelevant ongoing stimulation was simultaneously presented to a site across the body midline, but participants were instructed to ignore it. In line with previous bimanual studies employing brief onset targets, we document robust interference on performance due to the irrelevant stimulation at each of the measured body sites (homologous and nonhomologous fingers, and the contralateral ankle). After describing this basic phenomenon, we further examine the conditions under which such interference occurs in three additional tasks. In each task, we homed in on a different aspect of the stimulation protocol (e.g., hand distance, the strength of the irrelevant stimulation, the detection target itself) in order to better understand the principles governing the observed interference effects.
Our findings suggest a minimal role for exogenous attentional capture in producing the observed interference effects (Exp. 2), and a principled distribution of attentional resources or sensory integration between body sides (Exps. 3, 4). In our last study (Exp. 4), we presented bilateral tactile targets of varying intensities to both the relevant and irrelevant stimulation sites. We then characterized the degree to which the irrelevant stimulation is also processed. Our results, namely that participants' perception of target intensity is always proportional to the combined bilateral signal, suggest that both body sites are equally weighted and processed despite clear instructions to attend only the target site. In light of this observation and participants' inability to use selection processes to guide their perception, we propose that bilateral tactile inputs are automatically combined, quite possibly early in the hierarchy of somatosensory processing.
Affiliation(s)
- Flor Kusnir, Departments of Psychology and Cognitive Science, The Hebrew University of Jerusalem, Jerusalem, Israel
- Slav Pesin, Departments of Psychology and Cognitive Science, The Hebrew University of Jerusalem, Jerusalem, Israel
- Ayelet N Landau, Departments of Psychology and Cognitive Science, The Hebrew University of Jerusalem, Jerusalem, Israel
2. Cuppini C, Magosso E, Monti M, Ursino M, Yau JM. A neurocomputational analysis of visual bias on bimanual tactile spatial perception during a crossmodal exposure. Front Neural Circuits 2022; 16:933455. PMID: 36439678; PMCID: PMC9684216; DOI: 10.3389/fncir.2022.933455.
Abstract
Vision and touch both support spatial information processing. These sensory systems also exhibit highly specific interactions in spatial perception, which may reflect multisensory representations that are learned through visuo-tactile (VT) experiences. Recently, Wani and colleagues reported that task-irrelevant visual cues bias tactile perception, in a brightness-dependent manner, on a task requiring participants to detect unimanual and bimanual cues. Importantly, tactile performance remained spatially biased after VT exposure, even when no visual cues were presented. These effects on bimanual touch conceivably reflect cross-modal learning, but the neural substrates that are changed by VT experience are unclear. We previously described a neural network capable of simulating VT spatial interactions. Here, we exploited this model to test different hypotheses regarding potential network-level changes that may underlie the VT learning effects. Simulation results indicated that VT learning effects are inconsistent with plasticity restricted to unisensory visual and tactile hand representations. Similarly, VT learning effects were also inconsistent with changes restricted to the strength of inter-hemispheric inhibitory interactions. Instead, we found that both the hand representations and the inter-hemispheric inhibitory interactions need to be plastic to fully recapitulate VT learning effects. Our results imply that crossmodal learning of bimanual spatial perception involves multiple changes distributed over a VT processing cortical network.
Affiliation(s)
- Cristiano Cuppini (corresponding author), Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Elisa Magosso, Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Melissa Monti, Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Mauro Ursino, Department of Electrical, Electronic, and Information Engineering “Guglielmo Marconi,” University of Bologna, Bologna, Italy
- Jeffrey M. Yau, Department of Neuroscience, Baylor College of Medicine, Houston, TX, United States
3. Arslanova I, Takamuku S, Gomi H, Haggard P. Multi-digit tactile perception I: motion integration benefits for tactile trajectories presented bimanually. J Neurophysiol 2022; 128:418-433. PMID: 35822710; PMCID: PMC9359661; DOI: 10.1152/jn.00022.2022.
Abstract
Interactions with objects involve simultaneous contact with multiple, not necessarily adjacent, skin regions. While advances have been made in understanding the capacity to selectively attend to a single tactile element among distracting stimulations, here, we examine how multiple stimulus elements are explicitly integrated into an overall tactile percept. Across four experiments, participants averaged the direction of two simultaneous tactile motion trajectories of varying discrepancy delivered to different fingerpads. Averaging performance differed between within- and between-hands conditions in terms of sensitivity and precision but was unaffected by somatotopic proximity between stimulated fingers. First, precision was greater in between-hand compared to within-hand conditions, demonstrating a bimanual perceptual advantage in multi-touch integration. Second, sensitivity to the average direction was influenced by the discrepancy between individual motion signals, but only for within-hand conditions. Overall, our experiments identify key factors that influence perception of simultaneous tactile events. In particular, we show that multi-touch integration is constrained by hand-specific rather than digit-specific mechanisms.
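The direction-averaging task described in this abstract can be made concrete with a circular mean, which avoids the wrap-around error of a plain arithmetic mean of angles. This is an illustrative sketch only, not the authors' analysis code:

```python
import math

def circular_mean(a_deg, b_deg):
    """Circular mean of two directions given in degrees; result in [0, 360)."""
    s = math.sin(math.radians(a_deg)) + math.sin(math.radians(b_deg))
    c = math.cos(math.radians(a_deg)) + math.cos(math.radians(b_deg))
    return math.degrees(math.atan2(s, c)) % 360.0

# Two trajectories straddling one direction average to the bisecting
# direction (here, 90 degrees).
mid = circular_mean(80, 100)

# Near the wrap-around point, a plain arithmetic mean of 350 and 10 gives
# 180, the opposite direction; the circular mean lands near 0 (mod 360).
wrap = circular_mean(350, 10)
```

The `atan2` of summed sines and cosines is the standard way to average directional data; discrepancy between the two signals can likewise be expressed as the shorter angular distance between them.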
Affiliation(s)
- Irena Arslanova, Institute of Cognitive Neuroscience, University College London, London, United Kingdom
- Shinya Takamuku, NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, Atsugi, Kanagawa, Japan
- Hiroaki Gomi, NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, Atsugi, Kanagawa, Japan
- Patrick Haggard, Institute of Cognitive Neuroscience, University College London, London, United Kingdom
4. Yokosaka T, Suzuishi Y, Kuroki S. Feeling Illusory Textures Through a Hole: Rotating Frame at Skin-Object Interface Modifies Perceived Tactile Texture. IEEE Trans Haptics 2022; 15:304-314. PMID: 34727039; DOI: 10.1109/toh.2021.3124138.
Abstract
Modulating tactile texture perception for the surface of real objects is a promising way to artificially present various tactile textures. Here, we propose a simple method of modulating tactile textures for various materials, which is named the rotating-frame method. In the method, one touches an arbitrary material's surface through a hole in a cardboard frame. When the frame is rotated between the hand and material, the tactile texture of the material is perceived as if it has turned into another material. We investigated the qualitative and quantitative characteristics of the illusory modulation created by the method in a series of psychophysical experiments. We found that the method altered the tactile textures of the surfaces of touched materials such as glass and carpet to seem softer, smoother, slipperier, and warmer than they actually are. The illusory texture change occurred robustly when the method was applied with different categories of materials. Our method paves the way for the development of simple techniques for texture augmentation that can be applied to a wide range of materials and do not disrupt stable direct contact between the hand and the materials.
5. Kilteni K, Ehrsson HH. Predictive attenuation of touch and tactile gating are distinct perceptual phenomena. iScience 2022; 25:104077. PMID: 35372807; PMCID: PMC8968059; DOI: 10.1016/j.isci.2022.104077.
Abstract
In recent decades, research on somatosensory perception has led to two important observations. First, self-generated touches that are predicted by voluntary movements become attenuated compared with externally generated touches of the same intensity (attenuation). Second, externally generated touches feel weaker and are more difficult to detect during movement than at rest (gating). At present, researchers often consider gating and attenuation the same suppression process; however, this assumption is unwarranted because, despite more than 40 years of research, no study has combined them in a single paradigm. We quantified how people perceive self-generated and externally generated touches during movement and rest. We show that whereas voluntary movement gates the precision of both self-generated and externally generated touch, the amplitude of self-generated touch is robustly attenuated compared with externally generated touch. Furthermore, attenuation and gating do not interact and are not correlated, and we conclude that they represent distinct perceptual phenomena.

Highlights:
- We tested the perception of self-generated and external touch during movement and rest
- The intensity of self-generated touch is reduced during movement and rest (attenuation)
- The precision of self-generated and external touch is reduced during movement (gating)
- Attenuation and gating neither interact nor correlate, and are distinct phenomena
Affiliation(s)
- Konstantina Kilteni (corresponding author), Department of Neuroscience, Karolinska Institutet, Solnavägen 9, 17165 Stockholm, Sweden
- H. Henrik Ehrsson, Department of Neuroscience, Karolinska Institutet, Solnavägen 9, 17165 Stockholm, Sweden
6. Yokosaka T, Kuroki S, Nishida S. Describing the Sensation of the 'Velvet Hand Illusion' in Terms of Common Materials. IEEE Trans Haptics 2021; 14:680-685. PMID: 33347414; DOI: 10.1109/toh.2020.3046376.
Abstract
When sandwiching two moving parallel metallic wires between both hands, one often experiences an unexpected tactile sensation known as the "velvet hand illusion" (VHI). Researchers have revealed the optimal conditions for inducing VHI, but the subjective nature of VHI remains obscure. In this article, we conducted a psychophysical experiment to investigate the quality and magnitude of the illusory sensation felt during VHI. Participants were asked to evaluate the tactile sensation of moving wires by giving tactile adjective and intensity ratings of the illusory sensation. In the same experiment, for the sake of comparison, participants also rated the sensation for various common materials one may encounter in daily life. We found that, as the intensity of the illusory sensation increased, the tactile sensation became softer, wetter, warmer, and more favorable. We also found that, when a strong illusion was reported, the sensation was similar to those for leather and fabrics rather than metallic wire, which suggests that the illusion indeed changes the perceived material category. These findings provide a better characterization of VHI as well as a better understanding of tactile texture perception.
7.
Abstract
Hearing aid and cochlear implant (CI) users often struggle to locate and segregate sounds. The dominant sound-localisation cues are time and intensity differences across the ears. A recent study showed that CI users locate sounds substantially better when these cues are provided through haptic stimulation on each wrist. However, the sensitivity of the wrists to these cues and the robustness of this sensitivity to aging is unknown. The current study showed that time difference sensitivity is much poorer across the wrists than across the ears and declines with age. In contrast, high sensitivity to across-wrist intensity differences was found that was robust to aging. This high sensitivity was observed across a range of stimulation intensities for both amplitude modulated and unmodulated sinusoids and matched across-ear intensity difference sensitivity for normal-hearing individuals. Furthermore, the usable dynamic range for haptic stimulation on the wrists was found to be around four times larger than for CIs. These findings suggest that high-precision haptic sound-localisation can be achieved, which could aid many hearing-impaired listeners. Furthermore, the finding that high-fidelity across-wrist intensity information can be transferred could be exploited in human-machine interfaces to enhance virtual reality and improve remote control of military, medical, or research robots.
8. Fletcher MD. Using haptic stimulation to enhance auditory perception in hearing-impaired listeners. Expert Rev Med Devices 2020; 18:63-74. PMID: 33372550; DOI: 10.1080/17434440.2021.1863782.
Abstract
INTRODUCTION: Hearing-assistive devices, such as hearing aids and cochlear implants, transform the lives of hearing-impaired people. However, users often struggle to locate and segregate sounds. This leads to impaired threat detection and an inability to understand speech in noisy environments. Recent evidence suggests that segregation and localization can be improved by providing missing sound-information through haptic stimulation.

AREAS COVERED: This article reviews the evidence that haptic stimulation can effectively provide sound information. It then discusses the research and development required for this approach to be implemented in a clinically viable device. This includes discussion of what sound information should be provided and how that information can be extracted and delivered.

EXPERT OPINION: Although this research area has only recently emerged, it builds on a significant body of work showing that sound information can be effectively transferred through haptic stimulation. Current evidence suggests that haptic stimulation is highly effective at providing missing sound-information to cochlear implant users. However, a great deal of work remains to implement this approach in an effective wearable device. If successful, such a device could offer an inexpensive, noninvasive means of improving educational, work, and social experiences for hearing-impaired individuals, including those without access to hearing-assistive devices.
Affiliation(s)
- Mark D Fletcher, University of Southampton Auditory Implant Service, Southampton, UK; Institute of Sound and Vibration Research, University of Southampton, Southampton, UK
9. Arslanova I, Wang K, Gomi H, Haggard P. Somatosensory evoked potentials that index lateral inhibition are modulated according to the mode of perceptual processing: comparing or combining multi-digit tactile motion. Cogn Neurosci 2020; 13:47-59. PMID: 33307992; DOI: 10.1080/17588928.2020.1839403.
Abstract
Many perceptual studies focus on the brain's capacity to discriminate between stimuli. However, our normal experience of the world also involves integrating multiple stimuli into a single perceptual event. Neural mechanisms such as lateral inhibition are believed to enhance local differences between sensory inputs from nearby regions of the receptor surface. However, this mechanism would seem dysfunctional when sensory inputs need to be combined rather than contrasted. Here, we investigated whether the brain can strategically regulate the strength of suppressive interactions that underlie lateral inhibition between finger representations in human somatosensory processing. To do this, we compared sensory processing between conditions that required either comparing or combining information. We delivered two simultaneous tactile motion trajectories to index and middle fingertips of the right hand. Participants had to either compare the directions of the two stimuli, or to combine them to form their average direction. To reveal preparatory tuning of somatosensory cortex, we used an established event-related potential design to measure the interaction between cortical representations evoked by digital nerve shocks immediately before each tactile stimulus. Consistent with previous studies, we found a clear suppression between cortical activations when participants were instructed to compare the tactile motion directions. Importantly, this suppression was significantly reduced when participants had to combine the same stimuli. These findings suggest that the brain can strategically switch between a comparative and a combinative mode of somatosensory processing, according to the perceptual goal, by preparatorily adjusting the strength of a process akin to lateral inhibition.
Affiliation(s)
- Irena Arslanova, Institute of Cognitive Neuroscience, University College London, London, UK
- Keying Wang, Institute of Cognitive Neuroscience, University College London, London, UK
- Hiroaki Gomi, NTT Communication Science Laboratories, NTT Corporation, Atsugi, Japan
- Patrick Haggard, Institute of Cognitive Neuroscience, University College London, London, UK
10. Togoli I, Marlair C, Collignon O, Arrighi R, Crollen V. Tactile numerosity is coded in external space. Cortex 2020; 134:43-51. PMID: 33249299; DOI: 10.1016/j.cortex.2020.10.008.
Abstract
Humans, and several non-human species, possess the ability to make approximate but reliable estimates of the number of objects around them. Like other perceptual features, numerosity perception is susceptible to adaptation: exposure to a high number of items causes underestimation of the numerosity of a subsequent set of items, and vice versa. Several studies have investigated adaptation in the auditory and visual modality, whereby stimuli are preferentially encoded in an external coordinate system. As tactile stimuli are primarily coded in an internal (body-centered) reference frame, here we ask whether tactile numerosity adaptation operates based on internal or external spatial coordinates as it occurs in vision or audition. Twenty participants performed an adaptation task with their right hand located either in the right (uncrossed) or left (crossed) hemispace, in order for the two hands to occupy either two completely different positions, or the same position in space, respectively. Tactile adaptor and test stimuli were passively delivered either to the same (adapted) or different (non-adapted) hands. Our results show a clear signature of tactile numerosity adaptation aftereffects with a pattern of over- and under-estimation according to the adaptation rate (low and high, respectively). In the uncrossed position, we observed stronger adaptation effects when adaptor and test stimuli were delivered to the "adapted" hand. However, when both hands were aligned in the same spatial position (crossed condition), the magnitude of adaptation was similar irrespective of which hand received adaptor and test stimuli. These results demonstrate that numerosity information is automatically coded in external coordinates even in the tactile modality, suggesting that such a spatial reference frame is an intrinsic property of numerosity processing irrespective of the sensory modality.
Affiliation(s)
- Irene Togoli, International School for Advanced Studies (SISSA), Trieste, Italy
- Cathy Marlair, Psychological Sciences Research Institute (IPSY) and Institute of NeuroScience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium
- Olivier Collignon, Psychological Sciences Research Institute (IPSY) and Institute of NeuroScience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium
- Roberto Arrighi, Department of Neuroscience, Psychology and Child Health, University of Florence, Florence, Italy
- Virginie Crollen, Psychological Sciences Research Institute (IPSY) and Institute of NeuroScience (IoNS), Université Catholique de Louvain, Louvain-la-Neuve, Belgium
11. Rahman MS, Barnes KA, Crommett LE, Tommerdahl M, Yau JM. Auditory and tactile frequency representations are co-embedded in modality-defined cortical sensory systems. Neuroimage 2020; 215:116837. PMID: 32289461; PMCID: PMC7292761; DOI: 10.1016/j.neuroimage.2020.116837.
Abstract
Sensory information is represented and elaborated in hierarchical cortical systems that are thought to be dedicated to individual sensory modalities. This traditional view of sensory cortex organization has been challenged by recent evidence of multimodal responses in primary and association sensory areas. Although it is indisputable that sensory areas respond to multiple modalities, it remains unclear whether these multimodal responses reflect selective information processing for particular stimulus features. Here, we used fMRI adaptation to identify brain regions that are sensitive to the temporal frequency information contained in auditory, tactile, and audiotactile stimulus sequences. A number of brain regions distributed over the parietal and temporal lobes exhibited frequency-selective temporal response modulation for both auditory and tactile stimulus events, as indexed by repetition suppression effects. A smaller set of regions responded to crossmodal adaptation sequences in a frequency-dependent manner. Despite an extensive overlap of multimodal frequency-selective responses across the parietal and temporal lobes, representational similarity analysis revealed a cortical "regional landscape" that clearly reflected distinct somatosensory and auditory processing systems that converged on modality-invariant areas. These structured relationships between brain regions were also evident in spontaneous signal fluctuation patterns measured at rest. Our results reveal that multimodal processing in human cortex can be feature-specific and that multimodal frequency representations are embedded in the intrinsically hierarchical organization of cortical sensory systems.
Affiliation(s)
- Md Shoaibur Rahman, Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Kelly Anne Barnes, Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA; Department of Behavioral and Social Sciences, San Jacinto College South, 13735 Beamer Rd, S13.269, Houston, TX 77089, USA
- Lexi E Crommett, Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Mark Tommerdahl, Department of Biomedical Engineering, University of North Carolina at Chapel Hill, CB No. 7575, Chapel Hill, NC 27599, USA
- Jeffrey M Yau, Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
12. Halfen EJ, Magnotti JF, Rahman MS, Yau JM. Principles of tactile search over the body. J Neurophysiol 2020; 123:1955-1968. PMID: 32233886; DOI: 10.1152/jn.00694.2019.
Abstract
Although we routinely experience complex tactile patterns over our entire body, how we selectively experience multisite touch over our bodies remains poorly understood. Here, we characterized tactile search behavior over the full body using a tactile analog of the classic visual search task. On each trial, participants judged whether a target stimulus (e.g., 10-Hz vibration) was present or absent anywhere on the body. When present, the target stimulus could occur alone or simultaneously with distractor stimuli (e.g., 30-Hz vibrations) on other body locations. We systematically varied the number and spatial configurations of the distractors as well as the target and distractor frequencies and measured the impact of these factors on tactile search response times. First, we found that response times were faster on target-present trials compared with target-absent trials. Second, response times increased with the number of stimulated sites, suggesting a serial search process. Third, search performance differed depending on stimulus frequencies. This frequency-dependent behavior may be related to perceptual grouping effects based on timing cues. We constructed linear models to explore how the locations of the target and distractor cues influenced tactile search behavior. Our modeling results reveal that, in isolation, cues on the index fingers make relatively greater contributions to search performance compared with stimulation experienced on other body sites. Additionally, costimulation of sites within the same limb or simply on the same body side preferentially influences search behavior. Our collective findings identify some principles of attentional search that are common to vision and touch, but others that highlight key differences that may be unique to body-based spatial perception.

NEW & NOTEWORTHY: Little is known about how we selectively experience multisite touch patterns over the body. Using a tactile analog of the classic visual target search paradigm, we show that tactile search behavior for flutter cues is generally consistent with a serial search process. Modeling results reveal the preferential contributions of index finger stimulation and two-site stimulus interactions involving ipsilateral patterns and within-limb patterns. Our results offer initial evidence for spatial and temporal principles underlying tactile search behavior over the body.
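The set-size effect reported above (response times increasing with the number of stimulated sites) is conventionally summarized by the slope of a line fit to response time against set size. The sketch below uses invented response-time values purely for illustration; it is not the authors' model or data:

```python
# Invented response times (ms) for 1-4 simultaneously stimulated sites;
# these numbers are illustrative only, not data from the paper.
set_sizes = [1, 2, 3, 4]
rts = [520.0, 610.0, 700.0, 790.0]

def fit_line(xs, ys):
    """Ordinary least-squares fit: ys ~ intercept + slope * xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

intercept, slope = fit_line(set_sizes, rts)
# A positive slope (the added time cost per stimulated site) is the classic
# signature of a serial, item-by-item search process; a near-zero slope
# would instead indicate parallel search.
```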
Affiliation(s)
- Elizabeth J Halfen, Departments of Neuroscience and Neurosurgery, Baylor College of Medicine, Houston, Texas
- John F Magnotti, Departments of Neuroscience and Neurosurgery, Baylor College of Medicine, Houston, Texas
- Md Shoaibur Rahman, Departments of Neuroscience and Neurosurgery, Baylor College of Medicine, Houston, Texas
- Jeffrey M Yau, Departments of Neuroscience and Neurosurgery, Baylor College of Medicine, Houston, Texas
13. Electro-Haptic Enhancement of Spatial Hearing in Cochlear Implant Users. Sci Rep 2020; 10:1621. PMID: 32005889; PMCID: PMC6994470; DOI: 10.1038/s41598-020-58503-8.
Abstract
Cochlear implants (CIs) have enabled hundreds of thousands of profoundly hearing-impaired people to perceive sounds by electrically stimulating the auditory nerve. However, CI users are often very poor at locating sounds, which leads to impaired sound segregation and threat detection. We provided missing spatial hearing cues through haptic stimulation to augment the electrical CI signal. We found that this "electro-haptic" stimulation dramatically improved sound localisation. Furthermore, participants were able to effectively integrate spatial information transmitted through these two senses, performing better with combined audio and haptic stimulation than with either alone. Our haptic signal was presented to the wrists and could readily be delivered by a low-cost wearable device. This approach could provide a non-invasive means of improving outcomes for the vast majority of CI users who have only one implant, without the expense and risk of a second implantation.