76. Kusnir F, Pesin S, Landau AN. Hello from the other side: Robust contralateral interference in tactile detection. Atten Percept Psychophys 2024;86:295-311. PMID: 37872432; PMCID: PMC10769913; DOI: 10.3758/s13414-023-02801-6.
Abstract
Touch is unique among the sensory modalities in that our tactile receptors are spread across the body surface and continuously receive different inputs at the same time. These inputs vary in type, properties, relevance according to current goals, and, of course, location on the body. Sometimes, they must be integrated, and other times set apart and distinguished. Here, we investigate how simultaneous stimulation to different body sites affects tactile cognition. Specifically, we characterized the impact of irrelevant tactile sensations on tactile change detection. To this end, we embedded detection targets amidst ongoing performance, akin to the conditions encountered in everyday life, where we are constantly confronted with new events within ongoing stimuli. In the set of experiments presented here, participants detected a brief intensity change (0.04 s) within an ongoing vibrotactile stimulus (1.6 s) that was always presented in a constantly attended location. The intensity change (i.e., the detection target) varied parametrically, from hardly detectable to easily detectable. In half of the trials, irrelevant ongoing stimulation was simultaneously presented to a site across the body midline, but participants were instructed to ignore it. In line with previous bimanual studies employing brief-onset targets, we document robust interference on performance due to the irrelevant stimulation at each of the measured body sites (homologous and nonhomologous fingers, and the contralateral ankle). After describing this basic phenomenon, we further examine the conditions under which such interference occurs in three additional tasks. In each task, we homed in on a different aspect of the stimulation protocol (e.g., hand distance, the strength of the irrelevant stimulation, the detection target itself) in order to better understand the principles governing the observed interference effects.
Our findings suggest a minimal role for exogenous attentional capture in producing the observed interference effects (Exp. 2), and a principled distribution of attentional resources or sensory integration between body sides (Exps. 3, 4). In our last study (Exp. 4), we presented bilateral tactile targets of varying intensities to both the relevant and irrelevant stimulation sites. We then characterized the degree to which the irrelevant stimulation is also processed. Our results-that participants' perception of target intensity is always proportional to the combined bilateral signal-suggest that both body sites are equally weighed and processed despite clear instructions to attend only the target site. In light of this observation and participants' inability to use selection processes to guide their perception, we propose that bilateral tactile inputs are automatically combined, quite possibly early in the hierarchy of somatosensory processing.
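The equal-weighting account suggested by Exp. 4 can be written as a one-line model. This is an illustrative sketch, not the authors' analysis code; the function name and weight parameter are ours:

```python
def perceived_intensity(attended, unattended, w_attended=0.5):
    """Weighted combination of bilateral tactile inputs.

    w_attended = 0.5 expresses the equal-weighting (automatic
    integration) account; w_attended = 1.0 would correspond to
    perfect selection of the attended site. Illustrative only.
    """
    return w_attended * attended + (1.0 - w_attended) * unattended

# Under equal weighting, a strong irrelevant input inflates the
# percept even when the attended target is weak.
integration = perceived_intensity(attended=0.2, unattended=0.8)  # 0.5
selection = perceived_intensity(attended=0.2, unattended=0.8,
                                w_attended=1.0)                  # 0.2
```

The finding that perceived intensity tracked the combined bilateral signal corresponds to the first case, despite instructions that should have produced the second.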

77. Hachisu T, Kajiura M, Takeshita T, Takei Y, Kobayashi T, Konyo M. Lever Mechanism for Diaphragm-Type Vibrators to Enhance Vibrotactile Intensity. IEEE Trans Haptics 2024;17:20-25. PMID: 38227399; DOI: 10.1109/toh.2024.3354253.
Abstract
Thin and light vibrators that leverage the inverse piezoelectric effect with a diaphragm mechanism are promising vibrotactile actuators owing to their form factors and high temporal and frequency response. However, generating perceptually sufficient displacement in the low-frequency domain is challenging. This study presents a lever mechanism mounted on a diaphragm vibrator to enhance the vibrotactile intensity of low-frequency vibrotactile stimuli. The lever mechanism is inspired by the tactile contact lens consisting of an array of cylinders held against the skin on a sheet that enhances micro-bump tactile detection. We built an experimental apparatus including our previously developed thin-film diaphragm-type vibrator, which reproduced the common characteristic of piezoelectric vibrators: near-threshold displacement (10 to 20 μm) at low frequency. Experiments demonstrated enhanced vibrotactile intensity at frequencies less than 100 Hz with the lever mechanism. Although the arrangement and material of the mechanism can be improved, our findings can help improve the expressiveness of diaphragm-type vibrators.

78. Lee J, Choi S. Multimodal Haptic Feedback for Virtual Collisions Combining Vibrotactile and Electrical Muscle Stimulation. IEEE Trans Haptics 2024;17:33-38. PMID: 38227400; DOI: 10.1109/toh.2024.3354268.
Abstract
In this paper, we explore the effects of multimodal haptic feedback combining vibrotactile and electrical muscle stimulation (EMS) on expressing virtual collisions. We first present a wearable multimodal haptic device capable of generating both mechanical vibration and EMS stimuli. The two types of haptic stimulus are combined into a haptic rendering method that conveys improved virtual collision sensations. This multimodal rendering method highlights the strengths of each modality while compensating for their mutual weaknesses. In a user study, we compared the subjective quality of the multimodal rendering method with that of two unimodal methods (vibration only and EMS only). Experimental results demonstrate that our multimodal feedback method can elicit more realistic, enjoyable, expressive, and preferable user experiences.

79. Boada MD, Gutierrez S, Eisenach JC. Effects of systemic oxytocin administration on ultraviolet B-induced nociceptive hypersensitivity and tactile hyposensitivity in mice. Mol Pain 2024;20:17448069241226553. PMID: 38172079; PMCID: PMC10846038; DOI: 10.1177/17448069241226553.
Abstract
Ultraviolet B (UVB) radiation induces cutaneous inflammation, leading to thermal and mechanical hypersensitivity. Here, we examine the mechanical properties and profile of tactile and nociceptive peripheral afferents functionally disrupted by this injury and the role of oxytocin (OXT) as a modulator of this disruption. We recorded intracellularly from L4 afferents innervating the irradiated area (5.1 J/cm2) in 4-6-week-old male mice (C57BL/6J) after administering OXT intraperitoneally (6 mg/kg). UVB radiation shifted the distribution of recorded neurons to a pattern observed after acute and chronic injuries and reduced the mechanical thresholds of A- and C-high-threshold mechanoreceptors while reducing tactile sensitivity. UVB radiation did not change somatic membrane electrical properties or fiber conduction velocity. Systemic OXT administration rapidly reversed these peripheral changes toward normal in both low- and high-threshold mechanoreceptors and shifted the recorded neuron distribution toward normal. OXT and V1aR receptors were present on the terminals of myelinated and unmyelinated afferents innervating the skin. We conclude that UVB radiation, similar to local tissue surgical injury, cancer metastasis, and peripheral nerve injury, alters the distribution of low- and high-threshold mechanoreceptor afferents and sensitizes nociceptors while desensitizing tactile units. Acute systemic OXT administration partially returns all of those effects to normal.

80. Lee J, Kim J, Kang J, Jo E, Park DC, Choi S. Telemetry-Based Haptic Rendering for Racing Game Experience Improvement. IEEE Trans Haptics 2024;17:72-79. PMID: 38265896; DOI: 10.1109/toh.2024.3357885.
Abstract
Many recent games, such as racing and flight games, open their game telemetry data to users by storing it in local memory. Such telemetry data can provide useful information for haptic rendering, and this advantage has been exploited by the industry. This approach applies to any application that exports telemetry data at run time. The haptic rendering module operates as a separate process that accesses the telemetry data in parallel with the application. It is simple, efficient, and modular while leaving the application intact. We examine the approach's viability for improving user experience by developing three telemetry-based haptic rendering algorithms for car racing games. They express the car engine response, collisions with external objects, and the road surface texture, respectively. After building a haptics-enabled driving platform, we conducted a user study comparing gaming experiences between our telemetry-based algorithms and conventional sound-to-tactile conversion algorithms. The results showed that the telemetry-based effects elicited better experiences than the sound-based effects.
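The separate-process architecture described above can be sketched in a few lines. Everything here is an assumption for illustration: the field names, the linear RPM-to-vibration mapping, and the `read_telemetry`/`send_to_actuator` callables stand in for the game's shared-memory reader and the device driver; none of it is the paper's actual rendering code.

```python
import time


def rpm_to_vibration(rpm, rpm_max=8000.0):
    """Map engine RPM from game telemetry to a vibrotactile command.

    The 20-200 Hz sweep and amplitude range are illustrative choices,
    not the paper's rendering parameters.
    """
    r = min(max(rpm / rpm_max, 0.0), 1.0)  # normalise to [0, 1]
    frequency_hz = 20.0 + 180.0 * r        # pitch rises with engine speed
    amplitude = 0.2 + 0.8 * r              # stronger effect at high revs
    return frequency_hz, amplitude


def render_loop(read_telemetry, send_to_actuator, period_s=0.01):
    """Polling loop run as a separate process alongside the game.

    read_telemetry returns a dict of telemetry fields (e.g. {"rpm": ...})
    or None once the game closes; the game itself is left untouched.
    """
    while True:
        sample = read_telemetry()
        if sample is None:                 # game closed: stop rendering
            break
        send_to_actuator(rpm_to_vibration(sample["rpm"]))
        time.sleep(period_s)               # ~100 Hz polling rate
```

Because the renderer only reads exported telemetry, it can be swapped or tuned without touching the game binary, which is the modularity advantage the abstract points to.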

81. Ryan L, Sun-Yan A, Laughton M, Peron S. Cortical circuitry mediating interareal touch signal amplification. Cell Rep 2023;42:113532. PMID: 38064338; PMCID: PMC10842872; DOI: 10.1016/j.celrep.2023.113532.
Abstract
Sensory cortical areas are organized into topographic maps representing the sensory epithelium. Interareal projections typically connect topographically matched subregions across areas. Because matched subregions process the same stimulus, their interaction is central to many computations. Here, we ask how topographically matched subregions of primary and secondary vibrissal somatosensory cortices (vS1 and vS2) interact during active touch. Volumetric calcium imaging in mice palpating an object with two whiskers revealed a sparse population of highly responsive, broadly tuned touch neurons especially pronounced in layer 2 of both areas. These rare neurons exhibited elevated synchrony and carried most touch-evoked activity in both directions. Lesioning the subregion of either area responding to the spared whiskers degraded touch responses in the unlesioned area, with whisker-specific vS1 lesions degrading whisker-specific vS2 touch responses. Thus, a sparse population of broadly tuned touch neurons dominates vS1-vS2 communication in both directions, and topographically matched vS1 and vS2 subregions recurrently amplify whisker touch activity.

82. Memeo M, Sandini G, Cocchi E, Brayda L. Blind people can actively manipulate virtual objects with a novel tactile device. Sci Rep 2023;13:22845. PMID: 38129483; PMCID: PMC10739710; DOI: 10.1038/s41598-023-49507-1.
Abstract
Frequently in rehabilitation, visually impaired persons are passive agents of exercises with fixed environmental constraints. In fact, a printed tactile map, i.e. a particular picture with a specific spatial arrangement, usually cannot be edited. Interaction with map content, instead, facilitates the learning of spatial skills because it exploits mental imagery, manipulation and strategic planning simultaneously. However, it has rarely been applied to maps, mainly because of technological limitations. This study aims to understand whether visually impaired people can autonomously build objects that are completely virtual. Specifically, we investigated whether a group of twelve blind persons, with a wide age range, could exploit mental imagery to interact with virtual content and actively manipulate it by means of a haptic device. The device is mouse-shaped and designed to jointly perceive, with one finger only, local tactile height and inclination cues of arbitrary scalar fields. Spatial information can be mentally constructed by integrating local tactile cues, given by the device, with global proprioceptive cues, given by hand and arm motion. The experiment consisted of a bi-manual task, in which one hand explored some basic virtual objects and the other hand acted on a keyboard to change the position of one object in real-time. The goal was to merge basic objects into more complex objects, like a puzzle. The experiment spanned different resolutions of the tactile information. We measured task accuracy, efficiency, usability and execution time. The average accuracy in solving the puzzle was 90.5%. Importantly, accuracy was linearly predicted by efficiency, measured as the number of moves needed to solve the task. Subjective parameters linked to usability and spatial resolutions did not predict accuracy; gender modulated the execution time, with men being faster than women.
Overall, we show that building purely virtual tactile objects is possible in the absence of vision and that the process is measurable and achievable with partial autonomy. Introducing virtual tactile graphics in rehabilitation protocols could facilitate the stimulation of mental imagery, a basic element for the ability to orient in space. The behavioural variable introduced in the current study can be calculated after each trial and therefore could be used to automatically measure and tailor protocols to specific user needs. In perspective, our experimental setup can inspire remote rehabilitation scenarios for visually impaired people.

83. Job X, Kilteni K. Action does not enhance but attenuates predicted touch. eLife 2023;12:e90912. PMID: 38099521; PMCID: PMC10723797; DOI: 10.7554/elife.90912.
Abstract
Dominant motor control theories propose that the brain predicts and attenuates the somatosensory consequences of actions, referred to as somatosensory attenuation. Support comes from psychophysical and neuroimaging studies showing that touch applied on a passive hand elicits attenuated perceptual and neural responses if it is actively generated by one's other hand, compared to an identical touch from an external origin. However, recent experimental findings have challenged this view by providing psychophysical evidence that the perceived intensity of touch on the passive hand is enhanced if the active hand does not receive touch simultaneously with the passive hand (somatosensory enhancement) and by further attributing attenuation to the double tactile stimulation of the hands upon contact. Here, we directly contrasted the hypotheses of the attenuation and enhancement models regarding how action influences somatosensory perception by manipulating whether the active hand contacts the passive hand. We further assessed somatosensory perception in the absence of any predictive cues in a condition that turned out to be essential for interpreting the experimental findings. In three pre-registered experiments, we demonstrate that action does not enhance the predicted touch (Experiment 1), that the previously reported 'enhancement' effects are driven by the reference condition used (Experiment 2), and that self-generated touch is robustly attenuated regardless of whether the two hands make contact (Experiment 3). Our results provide conclusive evidence that action does not enhance but attenuates predicted touch and prompt a reappraisal of recent experimental findings upon which theoretical frameworks proposing a perceptual enhancement by action prediction are based.

84. Kalyani A, Contier O, Klemm L, Azañon E, Schreiber S, Speck O, Reichert C, Kuehn E. Reduced dimension stimulus decoding and column-based modeling reveal architectural differences of primary somatosensory finger maps between younger and older adults. Neuroimage 2023;283:120430. PMID: 37923281; DOI: 10.1016/j.neuroimage.2023.120430.
Abstract
The primary somatosensory cortex (SI) contains fine-grained tactile representations of the body, arranged in an orderly fashion. The use of ultra-high resolution fMRI data to detect group differences, for example between younger and older adults' SI maps, is challenging, because group alignment often does not preserve the high spatial detail of the data. Here, we use robust shared response modeling (rSRM) that allows group analyses by mapping individual stimulus-driven responses to a lower dimensional shared feature space, to detect age-related differences in tactile representations between younger and older adults using 7T-fMRI data. Using this method, we show that finger representations are more precise in Brodmann area (BA) 3b and BA1 compared to BA2 and motor areas, and that this hierarchical processing is preserved across age groups. By combining rSRM with column-based decoding (C-SRM), we further show that the number of columns that optimally describes finger maps in SI is higher in younger compared to older adults in BA1, indicating a greater columnar size in older adults' SI. Taken together, we conclude that rSRM is suitable for finding fine-grained group differences in ultra-high resolution fMRI data, and we provide the first evidence that the columnar architecture in SI changes with increasing age.
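The core idea behind shared response modeling can be sketched numerically: each subject's voxel-by-time data are mapped through an orthonormal matrix into a common low-dimensional space. This is the plain deterministic (Procrustes-based) SRM; the robust variant used in the study adds more than this minimal sketch shows.

```python
import numpy as np


def fit_srm(subject_data, k, n_iter=10, seed=0):
    """Deterministic shared response model (minimal sketch).

    subject_data: list of (voxels_i x time) arrays sharing the time axis.
    Returns per-subject orthonormal maps W_i (voxels_i x k) and the
    shared response S (k x time), fit by alternating minimisation of
    sum_i ||X_i - W_i S||^2 with W_i^T W_i = I.
    """
    rng = np.random.default_rng(seed)
    t = subject_data[0].shape[1]
    S = rng.standard_normal((k, t))        # random initial shared response
    Ws = [None] * len(subject_data)
    for _ in range(n_iter):
        # Orthogonal Procrustes step: best orthonormal W_i for fixed S.
        for i, X in enumerate(subject_data):
            U, _, Vt = np.linalg.svd(X @ S.T, full_matrices=False)
            Ws[i] = U @ Vt                 # voxels_i x k, columns orthonormal
        # Shared response: average of subjects' back-projected data.
        S = np.mean([W.T @ X for W, X in zip(Ws, subject_data)], axis=0)
    return Ws, S
```

Group comparisons then operate on the k-dimensional shared space rather than on voxel-aligned maps, which is how the method sidesteps the spatial-detail loss of conventional group alignment.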

85. Montanari R, Alegre-Cortés J, Alonso-Andrés A, Cabrera-Moreno J, Navarro I, García-Frigola C, Sáez M, Reig R. Callosal inputs generate side-invariant receptive fields in the barrel cortex. Sci Adv 2023;9:eadi3728. PMID: 38019920; PMCID: PMC10686559; DOI: 10.1126/sciadv.adi3728.
Abstract
Barrel cortex integrates contra- and ipsilateral whiskers' inputs. While contralateral inputs depend on the thalamocortical innervation, ipsilateral ones are thought to rely on callosal axons. These are more abundant in the barrel cortex region bordering S2 and containing the row A-whiskers representation, the row lying nearest to the facial midline. Here, we ask what role this callosal axonal arrangement plays in ipsilateral tactile signaling. We found that novel object exploration with ipsilateral whiskers confines c-Fos expression within the highly callosal subregion. Targeting this area with in vivo patch-clamp recordings revealed neurons with uniquely strong ipsilateral responses dependent on the corpus callosum, as assessed by tetrodotoxin silencing and by optogenetic activation of the contralateral hemisphere. Still, in this area, stimulation of contra- or ipsilateral row A-whiskers evoked an indistinguishable response in some neurons, mostly located in layers 5/6, indicating their involvement in the midline representation of the whiskers' sensory space.

86. Villena-Gonzalez M. Caresses, whispers and affective faces: A theoretical framework for a multimodal interoceptive mechanism underlying ASMR and affective touch: An evolutionary and developmental perspective for understanding ASMR and affective touch as complementary processes within affiliative interactions. Bioessays 2023;45:e2300095. PMID: 37800564; DOI: 10.1002/bies.202300095.
Abstract
Autonomous sensory meridian response (ASMR) and affective touch (AT) are two phenomena that have been independently investigated from separate lines of research. In this article, I provide a unified theoretical framework for understanding and studying them as complementary processes. I highlight their shared biological basis and positive effects on emotional and psychophysiological regulation. Drawing from evolutionary and developmental theories, I propose that ASMR results from the development of biological mechanisms associated with early affiliative behaviour and self-regulation, similar to AT. I also propose a multimodal interoceptive mechanism underlying both phenomena, suggesting that different sensory systems could specifically respond to affective stimulation (caresses, whispers and affective faces), where the integration of those inputs occurs in the brain's interoceptive hubs, allowing physiological regulation. The implications of this proposal are discussed with a view to future research that jointly examines ASMR and AT, and their potential impact on improving emotional well-being and mental health.

87. Cataldo A, Frier W, Haggard P. Quantifying spatial acuity of frequency resolved midair ultrasound vibrotactile stimuli. Sci Rep 2023;13:21149. PMID: 38036579; PMCID: PMC10689848; DOI: 10.1038/s41598-023-48037-0.
Abstract
Spatial acuity is a fundamental property of any sensory system. In the case of the somatosensory system, the two-point discrimination (2PD) test has long been used to investigate tactile spatial resolution. However, the somatosensory system comprises three main mechanoreceptive channels: the slowly adapting channel (SA) responds to steady pressure, the rapidly adapting channel (RA) responds to low-frequency vibration, and the Pacinian channel (PC) responds to high-frequency vibration. The use of mechanical stimuli in the classical 2PD test means that previous studies on tactile acuity have primarily focussed on the pressure-sensitive channel alone, while neglecting other submodalities. Here, we used a novel ultrasound stimulation technique to systematically investigate the spatial resolution of the two main vibrotactile channels. Contrary to the textbook view of poor spatial resolution for PC-like stimuli, across four experiments we found that high-frequency vibration produced surprisingly good spatial acuity. This effect remained after controlling for interchannel differences in stimulus detectability and perceived intensity. Laser Doppler vibrometry experiments confirmed that the acuity of the PC channel was not simply an artifact of the skin's resonance to high-frequency mechanical stimulation. Thus, PC receptors may transmit substantial spatial information, despite their sparse distribution, deep location, and large receptive fields.

88. Stark LR, Shiraishi K, Sommerfeld T. Stationary Haptic Stimuli Do Not Produce Ocular Accommodation in Most Individuals. Multisens Res 2023;37:25-45. PMID: 38018137; DOI: 10.1163/22134808-bja10115.
Abstract
This study aimed to determine the extent to which haptic stimuli can influence ocular accommodation, either alone or in combination with vision. Accommodation was measured objectively in 15 young adults as they read stationary targets containing Braille letters. These cards were presented at four distances in the range 20-50 cm. In the Touch condition, the participant read by touch with their dominant hand in a dark room. Afterward, they estimated card distance with their non-dominant hand. In the Vision condition, they read by sight binocularly without touch in a lighted room. In the Touch with Vision condition, they read by sight binocularly and with touch in a lighted room. Sensory modality had a significant overall effect on the slope of the accommodative stimulus-response function. The slope in the Touch condition was not significantly different from zero, even though depth perception from touch was accurate. Nevertheless, one atypical participant had a moderate accommodative slope in the Touch condition. The accommodative slope in the Touch condition was significantly poorer than in the Vision condition. The accommodative slopes in the Vision condition and Touch with Vision condition were not significantly different. For most individuals, haptic stimuli for stationary objects do not influence the accommodation response, alone or in combination with vision. These haptic stimuli provide accurate distance perception, thus questioning the general validity of Heath's model of proximal accommodation as driven by perceived distance. Instead, proximally induced accommodation relies on visual rather than touch stimuli.
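The slope statistic this study rests on is an ordinary least-squares fit of accommodative response against dioptric stimulus demand. A minimal sketch; the numeric values below are invented for illustration, not the study's data:

```python
import numpy as np


def accommodative_slope(stimulus_d, response_d):
    """Least-squares slope of the accommodative stimulus-response
    function (both axes in dioptres). A slope near 1 indicates
    accurate accommodation; a slope near 0, no response to the
    changing stimulus."""
    slope, _intercept = np.polyfit(stimulus_d, response_d, 1)
    return slope


# Card distances 20-50 cm expressed as dioptric demand (1 / distance in m).
stim = np.array([1 / 0.50, 1 / 0.40, 1 / 0.30, 1 / 0.20])  # 2.0-5.0 D

vision_resp = np.array([2.1, 2.8, 3.5, 4.9])  # illustrative: slope near 1
touch_resp = np.array([1.5, 1.4, 1.6, 1.5])   # illustrative: slope near 0
```

A flat (near-zero) Touch-condition slope alongside a near-unity Vision-condition slope is exactly the pattern the abstract reports for most participants.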

89. Orioli G, Parisi I, van Velzen JL, Bremner AJ. Visual objects approaching the body modulate subsequent somatosensory processing at 4 months of age. Sci Rep 2023;13:19300. PMID: 37989781; PMCID: PMC10663495; DOI: 10.1038/s41598-023-45897-4.
Abstract
We asked whether, in the first year of life, the infant brain can support the dynamic crossmodal interactions between vision and somatosensation that are required to represent peripersonal space. Infants aged 4 (n = 20, 9 female) and 8 (n = 20, 10 female) months were presented with a visual object that moved towards their body or receded away from it. This was presented in the bottom half of the screen and not fixated upon by the infants, who were instead focusing on an attention getter at the top of the screen. The visual moving object then disappeared and was followed by a vibrotactile stimulus occurring later in time and in a different location in space (on their hands). The 4-month-olds' somatosensory evoked potentials (SEPs) were enhanced when tactile stimuli were preceded by unattended approaching visual motion, demonstrating that the dynamic visual-somatosensory cortical interactions underpinning representations of the body and peripersonal space begin early in the first year of life. Within the 8-month-olds' sample, SEPs were increasingly enhanced by (unexpected) tactile stimuli following receding visual motion as age in days increased, demonstrating changes in the neural underpinnings of the representations of peripersonal space across the first year of life.

90. Kuc A, Skorokhodov I, Semirechenko A, Khayrullina G, Maksimenko V, Varlamov A, Gordleeva S, Hramov A. Oscillatory Responses to Tactile Stimuli of Different Intensity. Sensors (Basel) 2023;23:9286. PMID: 38005672; PMCID: PMC10675731; DOI: 10.3390/s23229286.
Abstract
Tactile perception encompasses several submodalities that are realized with distinct sensory subsystems. The processing of those submodalities and their interactions remains understudied. We developed a paradigm consisting of three types of touch tuned in terms of their force and velocity for different submodalities: discriminative touch (haptics), affective touch (C-tactile touch), and knismesis (alerting tickle). Touch was delivered with a high-precision robotic rotary touch stimulation device. A total of 39 healthy individuals participated in the study. EEG cluster analysis revealed a decrease in the alpha and beta range (mu rhythm), as well as an increase in theta and delta, most pronounced for the most salient and fastest type of stimulation. The participants confirmed that slower stimuli targeted at affective touch low-threshold receptors were the most pleasant, and that less intense stimuli aimed at knismesis were indeed the most ticklish, but those sensations did not form an EEG cluster, probably implying that their processing involves deeper brain structures that are less accessible with EEG.

91. Bellard A, Trotter PD, McGlone FL, Cazzato V. Role of medial prefrontal cortex and primary somatosensory cortex in self and other-directed vicarious social touch: a TMS study. Soc Cogn Affect Neurosci 2023;18:nsad060. PMID: 37837378; PMCID: PMC10640852; DOI: 10.1093/scan/nsad060.
Abstract
Conflicting evidence points to the contribution of several key nodes of the 'social brain' to the processing of both discriminatory and affective qualities of interpersonal touch. Whether the primary somatosensory cortex (S1) and the medial prefrontal cortex (mPFC), two brain areas vital for tactile mirroring and affective mentalizing, play a functional role in shared representations of C-tactile (CT) targeted affective touch is still a matter of debate. Here, we used offline continuous theta-burst transcranial magnetic stimulation (cTBS) to mPFC, S1 and vertex (control) prior to participants providing ratings of vicarious touch pleasantness for self and others delivered across several body sites at CT-targeted velocities. We found that S1-cTBS led to a significant increase in touch ratings to the self, with this effect being positively associated with levels of interoceptive awareness. Conversely, mPFC-cTBS reduced pleasantness ratings for touch to another person. These effects were not specific for CT-optimal (slow) stroking velocities, but rather they applied to all types of social touch. Overall, our findings challenge the causal role of the S1 and mPFC in vicarious affective touch and suggest that self- vs other-directed vicarious touch responses might crucially depend on the specific involvement of key social networks in gentle tactile interactions.

92. Sutter C, Fabre M, Massi F, Blouin J, Mouchnino L. When mechanical engineering inspired from physiology improves postural-related somatosensory processes. Sci Rep 2023;13:19495. PMID: 37945691; PMCID: PMC10636053; DOI: 10.1038/s41598-023-45381-z.
Abstract
Despite numerous studies uncovering the neural signature of tactile processing, tactile afferent inputs relating to the contact surface have not been studied so far. Foot tactile receptors, being the first stimulated by the relative movement between the foot skin and the moving support beneath it, play an important role in the sensorimotor transformation that gives rise to a postural reaction. A biomimetic surface, i.e., one complying with the characteristics of skin dermatoglyphs and tactile receptors, should facilitate these cortical processes. Participants (n = 15) stood either on a biomimetic surface or on one of two control surfaces while a sudden acceleration of the supporting surface was triggered (experiment 1). A somatosensory evoked potential (SEP) of larger intensity and shorter latency was evoked by motion of the biomimetic surface. This result, together with the associated decrease of theta activity (5-7 Hz) over the posterior parietal cortex, suggests that increasing the amount of sensory input processing could make the balance task less challenging when standing on a biomimetic surface. This key point was confirmed by a second experiment (n = 21) in which a cognitive task was added, thereby decreasing the attentional resources devoted to the balance motor task. Greater efficiency of the postural reaction was observed while standing on the biomimetic surface than on the control surfaces.
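The surface comparison above boils down to extracting SEP peak amplitude and latency from trial-averaged epochs. A minimal numpy sketch on synthetic data (the search window, sampling rate, and effect sizes are illustrative assumptions, not values from the study):

```python
import numpy as np

def sep_peak(epochs, times, window=(0.03, 0.12)):
    """Average trials and return peak amplitude and latency of the
    somatosensory evoked potential (SEP) within a search window (s)."""
    evoked = epochs.mean(axis=0)                       # trial average
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.argmax(np.abs(evoked[mask]))              # largest deflection
    return np.abs(evoked[mask])[idx], times[mask][idx]

# Synthetic example: a 60-ms peak, larger for the "biomimetic" condition
times = np.linspace(0.0, 0.3, 301)
wave = np.exp(-((times - 0.06) ** 2) / (2 * 0.01 ** 2))
rng = np.random.default_rng(0)
biomimetic = 5.0 * wave + rng.normal(0, 0.1, (40, times.size))
control = 3.0 * wave + rng.normal(0, 0.1, (40, times.size))

amp_b, lat_b = sep_peak(biomimetic, times)
amp_c, lat_c = sep_peak(control, times)
```

With these synthetic effect sizes, the biomimetic condition yields the larger-amplitude response at roughly the same latency.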
|
93
|
Lee Masson H, Isik L. Rapid Processing of Observed Touch through Social Perceptual Brain Regions: An EEG-fMRI Fusion Study. J Neurosci 2023; 43:7700-7711. [PMID: 37871963 PMCID: PMC10634570 DOI: 10.1523/jneurosci.0995-23.2023]
Abstract
Seeing social touch triggers a strong social-affective response that involves multiple brain networks, including visual, social perceptual, and somatosensory systems. Previous studies have identified the specific functional role of each system, but little is known about the speed and directionality of the information flow. Is this information extracted via the social perceptual system, or through simulation in somatosensory cortex? To address this, we examined the spatiotemporal neural processing of observed touch. Twenty-one human participants (seven males) watched 500-ms video clips showing social and nonsocial touch during electroencephalogram (EEG) recording. Visual and social-affective features were rapidly extracted in the brain, beginning at 90 and 150 ms after video onset, respectively. Combining the EEG data with functional magnetic resonance imaging (fMRI) data from our prior study with the same stimuli reveals that neural information arises first in early visual cortex (EVC), then in the temporoparietal junction and posterior superior temporal sulcus (TPJ/pSTS), and finally in somatosensory cortex. EVC and TPJ/pSTS uniquely explain EEG neural patterns, while somatosensory cortex does not contribute to EEG patterns alone, suggesting that social-affective information may flow from TPJ/pSTS to somatosensory cortex. Together, these findings show that social touch is processed quickly, within the timeframe of feedforward visual processes, and that the social-affective meaning of touch is first extracted by a social perceptual pathway. Such rapid processing of social touch may be vital to its effective use during social interaction.

SIGNIFICANCE STATEMENT Seeing physical contact between people evokes a strong social-emotional response. Previous research has identified the brain systems responsible for this response, but little is known about how quickly and in what direction the information flows.
We demonstrated that the brain processes the social-emotional meaning of observed touch quickly, starting as early as 150 ms after the stimulus onset. By combining electroencephalogram (EEG) data with functional magnetic resonance imaging (fMRI) data, we show for the first time that the social-affective meaning of touch is first extracted by a social perceptual pathway and followed by the later involvement of somatosensory simulation. This rapid processing of touch through the social perceptual route may play a pivotal role in effective usage of touch in social communication and interaction.
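The EEG-fMRI fusion described above is typically implemented as representational similarity analysis: a time-resolved EEG dissimilarity structure is correlated with each ROI's fMRI dissimilarity structure. A toy numpy sketch of that logic (condition counts, feature sizes, and the synthetic "late emergence" of the ROI geometry are assumptions for illustration, not the authors' pipeline):

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix, condensed upper triangle:
    1 - Pearson r between condition patterns (rows)."""
    c = np.corrcoef(patterns)
    iu = np.triu_indices(len(patterns), k=1)
    return 1.0 - c[iu]

def fusion_timecourse(eeg_by_time, fmri_patterns):
    """Correlate a time-resolved EEG RDM with one ROI's fMRI RDM."""
    target = rdm(fmri_patterns)
    return np.array([np.corrcoef(rdm(e), target)[0, 1] for e in eeg_by_time])

# Synthetic data: 8 conditions, 100 voxels, 20 EEG timepoints of 50 features.
# The ROI's representational geometry only emerges in the late timepoints.
rng = np.random.default_rng(1)
fmri = rng.normal(size=(8, 100))
eeg = [rng.normal(size=(8, 50)) for _ in range(10)]            # early: noise
eeg += [fmri[:, :50] + 0.05 * rng.normal(size=(8, 50)) for _ in range(10)]
r = fusion_timecourse(eeg, fmri)
```

The fusion timecourse is near zero early and rises once the EEG patterns share the ROI's geometry, mirroring how an ROI's "onset" is read off such curves.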
|
94
|
Zhong Y, Yao L, Wang Y. Enhanced Motor Imagery Decoding by Calibration Model-Assisted With Tactile ERD. IEEE Trans Neural Syst Rehabil Eng 2023; 31:4295-4305. [PMID: 37883287 DOI: 10.1109/tnsre.2023.3327788]
Abstract
OBJECTIVE In this study, we propose a tactile-assisted calibration method for a motor imagery (MI) based brain-computer interface (BCI) system. METHOD In the proposed calibration, tactile stimulation was applied to the wrist to assist the subjects in the MI task, termed the SA-MI task. Classifier training in the SA-MI Calibration was then performed using the SA-MI data, while the Conventional Calibration employed the MI data. After the classifiers were trained, performance was evaluated on a common MI dataset. RESULTS Our study demonstrated that the SA-MI Calibration significantly improved performance compared with the Conventional Calibration, with decoding accuracies of 78.3% vs. 71.3%. Moreover, the average calibration time could be reduced by 40%. This benefit of the SA-MI Calibration was further validated by an independent control group, which showed no improvement when tactile stimulation was not applied during the calibration phase. Further analysis showed that, compared with MI, SA-MI induced greater motor-related cortical activation and a higher R² value in the alpha-beta frequency band. CONCLUSION Indeed, the SA-MI Calibration significantly improved performance and reduced the calibration time compared with the Conventional Calibration. SIGNIFICANCE The proposed tactile stimulation-assisted MI calibration method holds great potential for a faster and more accurate system setup at the beginning of BCI usage.
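The calibrate-then-evaluate pipeline can be illustrated with the simplest MI decoder: a Fisher LDA on a single lateralized ERD band-power feature. This is a generic sketch on synthetic features, not the authors' classifier or data (the C3 minus C4 feature and all effect sizes are assumptions):

```python
import numpy as np

def fit_lda(f_left, f_right):
    """Fisher LDA on a scalar feature (here imagined as C3 minus C4 log
    band power, the lateralized ERD signature separating left/right MI)."""
    m_l, m_r = f_left.mean(), f_right.mean()
    s = (f_left.var() + f_right.var()) / 2         # pooled variance
    w = (m_l - m_r) / s                            # Fisher direction (scalar)
    b = -w * (m_l + m_r) / 2
    return lambda f: np.where(w * f + b > 0, "left", "right")

# Synthetic calibration features: left-hand MI raises the C3-C4 difference
rng = np.random.default_rng(2)
clf = fit_lda(rng.normal(1.0, 0.5, 60), rng.normal(-1.0, 0.5, 60))

# Evaluate on held-out synthetic trials, as the calibrated classifier
# would be evaluated on a common MI dataset
acc = (np.mean(clf(rng.normal(1.0, 0.5, 40)) == "left")
       + np.mean(clf(rng.normal(-1.0, 0.5, 40)) == "right")) / 2
```

The paper's point is that a stronger, more consistent ERD during calibration (here, a larger class separation) yields a better-placed boundary from fewer trials.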
|
95
|
Duan S, Wei X, Zhao F, Yang H, Wang Y, Chen P, Hong J, Xiang S, Luo M, Shi Q, Shen G, Wu J. Bioinspired Young's Modulus-Hierarchical E-Skin with Decoupling Multimodality and Neuromorphic Encoding Outputs to Biosystems. Adv Sci (Weinh) 2023; 10:e2304121. [PMID: 37679093 PMCID: PMC10625104 DOI: 10.1002/advs.202304121]
Abstract
As key interfaces for the disabled, optimal prosthetics should elicit natural sensations of skin touch or proprioception by unambiguously delivering the multimodal signals acquired by the prosthetics to the nervous system, which remains challenging. Here, a bioinspired temperature-pressure electronic skin with decoupling capability (TPD e-skin), inspired by the high-low modulus hierarchical structure of human skin, is developed to restore such functionality. Owing to the bionic dual-state amplifying microstructure and contact resistance modulation, the MXene TPD e-skin exhibits high sensitivity over a wide pressure range and excellent temperature insensitivity (91.2% reduction). Additionally, the high-low modulus structural configuration renders the thermistor insensitive to pressure. Furthermore, a neural model is proposed to neurally code the temperature-pressure signals into three types of nerve-acceptable frequency signals, corresponding to thermoreceptors, slow-adapting receptors, and fast-adapting receptors. Four operational states in the time domain are also distinguished after the neural coding in the frequency domain. In addition, a brain-like machine learning-based fusion process for the frequency signals is constructed to analyze the frequency pattern and achieve object recognition with a high accuracy of 98.7%. The TPD neural system offers promising potential for advanced prosthetic devices with the capability of multimodality-decoupling sensing and deep neural integration.
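The three-channel neuromorphic encoding (thermoreceptor, slow-adapting, fast-adapting) can be caricatured as rate coding: one channel tracks temperature, one the sustained pressure, and one the pressure transient. A toy sketch with made-up gains (the linear mappings are illustrative assumptions, not the paper's neural model):

```python
import numpy as np

def encode(pressure, temperature, dt=0.01):
    """Map decoupled pressure/temperature signals onto three receptor-like
    firing-rate channels (Hz): thermoreceptor, slow-adapting (SA), and
    fast-adapting (FA). Gains are illustrative, not from the paper."""
    thermo = 2.0 * np.asarray(temperature)             # rate tracks temperature
    sa = 50.0 * np.asarray(pressure)                   # sustained pressure rate
    fa = 500.0 * np.abs(np.gradient(pressure, dt))     # transient (|dP/dt|) rate
    return thermo, sa, fa

# One press-release cycle at constant skin temperature
t = np.arange(0.0, 1.0, 0.01)
pressure = np.clip(np.sin(2 * np.pi * t), 0, None)
temperature = np.full_like(t, 30.0)
thermo, sa, fa = encode(pressure, temperature)
```

Note how the channels decouple: the FA channel fires at contact onset and release, the SA channel at sustained contact, and the thermoreceptor channel is untouched by pressure, which is the "four operational states" idea in miniature.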
|
96
|
Ciancone-Chama AG, Bonaldo V, Biasini E, Bozzi Y, Balasco L. Gene Expression Profiling in Trigeminal Ganglia from Cntnap2-/- and Shank3b-/- Mouse Models of Autism Spectrum Disorder. Neuroscience 2023; 531:75-85. [PMID: 37699442 DOI: 10.1016/j.neuroscience.2023.08.028]
Abstract
Sensory difficulties represent a crucial issue in the lives of autistic individuals. The Diagnostic and Statistical Manual of Mental Disorders describes both hyper- and hypo-responsiveness to sensory stimulation as a criterion for the diagnosis of autism spectrum disorder (ASD). Among the sensory domains affected in ASD, altered responses to tactile stimulation are the most commonly reported sensory deficits. Although tactile abnormalities have been reported in monogenic cohorts of patients and in genetic mouse models of ASD, the underlying mechanisms are still unknown. Traditionally, autism research has focused on the central nervous system as the target for inferring the neurobiological bases of such tactile abnormalities. Nonetheless, the peripheral nervous system represents the initial site of processing of sensory information and a potential site of dysfunction in the sensory cascade. Here we investigated gene expression dysregulation in the trigeminal ganglion (which directly receives tactile information from the whiskers) in two genetic models of syndromic autism (Shank3b and Cntnap2 mutant mice) at both adult and juvenile ages. We found several neuronal and non-neuronal markers involved in inhibitory, excitatory, neuroinflammatory, and sensory neurotransmission to be differentially regulated within the trigeminal ganglia of both adult and juvenile Shank3b and Cntnap2 mutant mice. These results may help disentangle the multifaceted complexity of sensory abnormalities in autism and open avenues for the development of peripherally targeted treatments for the tactile sensory deficits exhibited in ASD.
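Differential regulation of markers between mutant and wild-type ganglia is commonly quantified with per-gene fold changes and t statistics. A generic numpy sketch on synthetic expression values (group sizes, values, and the "upregulated gene" are illustrative, not the study's pipeline):

```python
import numpy as np

def diff_expr(wt, ko):
    """Per-gene log2 fold change and Welch t statistic (knockout vs
    wild type); rows are animals, columns are genes."""
    log2fc = np.log2(ko.mean(axis=0) / wt.mean(axis=0))
    se = np.sqrt(ko.var(axis=0, ddof=1) / len(ko)
                 + wt.var(axis=0, ddof=1) / len(wt))
    t = (ko.mean(axis=0) - wt.mean(axis=0)) / se
    return log2fc, t

# Synthetic expression for 3 genes x 6 animals; gene 0 is upregulated in KO
rng = np.random.default_rng(3)
wt = rng.normal([100, 200, 50], 5, size=(6, 3))
ko = rng.normal([200, 200, 50], 5, size=(6, 3))
log2fc, t = diff_expr(wt, ko)
```

In practice such statistics are computed genome-wide and corrected for multiple comparisons before a marker is called differentially regulated.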
|
97
|
Weber M, Marshall A, Timircan R, McGlone F, Watt SJ, Onyekwelu O, Booth L, Jesudason E, Lees V, Valyear KF. Touch localization after nerve repair in the hand: insights from a new measurement tool. J Neurophysiol 2023; 130:1126-1141. [PMID: 37728568 PMCID: PMC10994642 DOI: 10.1152/jn.00271.2023]
Abstract
Errors of touch localization after hand nerve injuries are common, and their measurement is important for evaluating functional recovery. Available empirical accounts have significant methodological limitations, however, and a quantitatively rigorous and detailed description of touch localization in nerve injury is lacking. Here, we develop a new method of measuring touch localization and evaluate its value for use in nerve injury. Eighteen patients with transection injuries to the median/ulnar nerves and 33 healthy controls were examined. The hand was blocked from the participant's view and points were marked on the volar surface using an ultraviolet (UV) pen. These points served as targets for touch stimulation. Two photographs were taken, one with and one without UV lighting, rendering the targets seen and unseen, respectively. The experimenter used the photograph with visible targets to register their locations, and participants reported the felt position of each stimulation on the photograph with unseen targets. The error of localization and its directional components were measured, separately from misreferrals (errors made across digits, or from a digit to the palm). Nerve injury was found to significantly increase the error of localization. These effects were specific to the territory of the repaired nerve and showed considerable variability at the individual level, with some patients showing no evidence of impairment. A few patients also made abnormally high numbers of misreferrals, and the pattern of misreferrals in patients differed from that observed in healthy controls.

NEW & NOTEWORTHY We provide a more rigorous and comprehensive account of touch localization in nerve injury than previously available.
Our results show that touch localization is significantly impaired following median/ulnar nerve transection injuries and that these impairments are specific to the territory of the repaired nerve(s), vary considerably between patients, and can involve frequent errors spanning between digits.
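The localization error and its directional components reduce to vector arithmetic between registered target coordinates and reported positions on the photograph. A minimal sketch (the coordinates and axis interpretation are hypothetical; the actual tool works in the photograph's coordinate frame):

```python
import numpy as np

def localization_errors(targets, reports):
    """Absolute localization error plus its two directional components,
    computed from (x, y) target and report coordinates in mm."""
    d = np.asarray(reports, float) - np.asarray(targets, float)
    error = np.hypot(d[:, 0], d[:, 1])   # Euclidean distance per trial
    return error, d[:, 0], d[:, 1]       # error, x component, y component

# Hypothetical coordinates: three stimulation targets and felt positions
targets = np.array([[10.0, 20.0], [15.0, 35.0], [12.0, 50.0]])
reports = np.array([[13.0, 24.0], [15.0, 30.0], [12.0, 50.0]])
err, dx, dy = localization_errors(targets, reports)
```

Misreferrals would be handled separately, by checking whether the report falls on a different digit (or the palm) before computing within-region error.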
|
98
|
Guilleminot P, Graef C, Butters E, Reichenbach T. Audiotactile Stimulation Can Improve Syllable Discrimination through Multisensory Integration in the Theta Frequency Band. J Cogn Neurosci 2023; 35:1760-1772. [PMID: 37677062 DOI: 10.1162/jocn_a_02045]
Abstract
Syllables are an essential building block of speech. We recently showed that tactile stimuli linked to the perceptual centers of syllables in continuous speech can improve speech comprehension. The rate of syllables lies in the theta frequency range, between 4 and 8 Hz, and the behavioral effect appears linked to multisensory integration in this frequency band. Because this neural activity may be oscillatory, we hypothesized that a behavioral effect may occur not only while, but also after, this activity has been evoked or entrained through vibrotactile pulses. Here, we show that audiotactile integration in the perception of single syllables, on both the neural and the behavioral level, is consistent with this hypothesis. We first stimulated participants with a series of vibrotactile pulses and then presented them with a syllable in background noise. We show that, at a delay of 200 msec after the last vibrotactile pulse, audiotactile integration still occurred in the theta band and syllable discrimination was enhanced. Moreover, the dependence of both the neural multisensory integration and the behavioral discrimination on the delay of the audio signal with respect to the last tactile pulse was consistent with a damped oscillation. In addition, the multisensory gain is correlated with the syllable discrimination score. Our results therefore demonstrate the role of the theta band in audiotactile integration and provide evidence that these effects may involve oscillatory activity that persists after the tactile stimulation.
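The damped-oscillation account can be made concrete: model the gain at audio delay t as a decaying cosine, and recover the oscillation frequency from sampled data. A numpy-only sketch using projection onto damped sine/cosine templates (the 6 Hz frequency, decay constant, and delay grid are assumptions for illustration, though the recovered frequency lands in the theta band discussed in the study):

```python
import numpy as np

def damped_osc(t, amp, freq, tau, phase):
    """Multisensory gain as a damped oscillation of the audio delay t (s)."""
    return amp * np.exp(-t / tau) * np.cos(2 * np.pi * freq * t + phase)

def fit_frequency(t, y, freqs, tau=0.3):
    """Coarse grid search: project the data onto a damped sine/cosine
    pair at each candidate frequency (phase and amplitude absorbed)."""
    env = np.exp(-t / tau)
    def power(f):
        c = env * np.cos(2 * np.pi * f * t)
        s = env * np.sin(2 * np.pi * f * t)
        return (y @ c) ** 2 + (y @ s) ** 2
    return max(freqs, key=power)

# Synthetic discrimination gains at delays up to 600 ms, oscillating at 6 Hz
rng = np.random.default_rng(4)
t = np.arange(0.0, 0.61, 0.02)
y = damped_osc(t, 1.0, 6.0, 0.3, 0.5) + 0.02 * rng.normal(size=t.size)
f_hat = fit_frequency(t, y, np.arange(3.0, 9.01, 0.5))
```

A full analysis would fit amplitude, decay, and phase jointly (e.g. by nonlinear least squares); the grid search here isolates the one claim tested, that the gain oscillates at a theta-band frequency.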
|
99
|
Sanfeliu-Cerdán N, Català-Castro F, Mateos B, Garcia-Cabau C, Ribera M, Ruider I, Porta-de-la-Riva M, Canals-Calderón A, Wieser S, Salvatella X, Krieg M. A MEC-2/stomatin condensate liquid-to-solid phase transition controls neuronal mechanotransduction during touch sensing. Nat Cell Biol 2023; 25:1590-1599. [PMID: 37857834 PMCID: PMC10635833 DOI: 10.1038/s41556-023-01247-0]
Abstract
A growing body of work suggests that the material properties of biomolecular condensates arising from liquid-liquid phase separation change with time. How this aging process is controlled, and whether condensates with distinct material properties can have different biological functions, is currently unknown. Using Caenorhabditis elegans as a model, we show that MEC-2/stomatin undergoes a rigidity phase transition from fluid-like to solid-like condensates that facilitate transport and mechanotransduction, respectively. This switch is triggered by the interaction between the SH3 domain of UNC-89 (titin/obscurin) and MEC-2. We suggest that this rigidity phase transition plays a physiological role in frequency-dependent force transmission in mechanosensitive neurons during body wall touch. Our data demonstrate a function for the liquid and solid phases of MEC-2/stomatin condensates in facilitating transport or mechanotransduction, and a previously unidentified role for titin homologues in neurons.
|
100
|
Zhou J, Fu C, Fang J, Shang K, Pu X, Zhang Y, Jiang Z, Lu X, He C, Jia L, Yao Y, Qian L, Yang T. Prosthetic finger for fingertip tactile sensing via flexible chromatic optical waveguides. Mater Horiz 2023; 10:4940-4951. [PMID: 37609940 DOI: 10.1039/d3mh00921a]
Abstract
Building prosthetics indistinguishable from human limbs that accurately receive and transmit sensory information to users not only promises to radically improve the lives of amputees, but also shows potential in a range of robotic applications. Currently, a mainstream approach is to embed electrical or optical sensors with force/thermal sensing functions on the surface or inside of prosthetic fingers. Compared with electrical sensing technologies, tactile sensors based on stretchable optical waveguides have the advantages of easy fabrication, chemical safety, environmental stability, and compatibility with prosthetic structural materials. However, research has so far focused mainly on the perception of finger joint motion or external presses, and optical sensors with fingertip tactile capabilities (such as texture, hardness, and slip detection) are still lacking. Here we report a 3D-printed prosthetic finger with flexible chromatic optical waveguides implanted at the fingertip. The finger achieves distributed displacement/force sensing and exhibits high sensitivity, fast response, and good stability. The finger can be used to conduct active sensory experiments, with detection parameters including object contour, hardness, slip direction and speed, and temperature. Finally, exploratory research on identifying and manipulating objects was carried out with this finger. The developed prosthetic finger can artificially recreate touch perception and realize complex functions such as note-writing analysis and braille recognition.
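A chromatic waveguide encodes where and how hard the fingertip is touched in the per-color optical loss. The decoding step might look like the following toy model (the zone-to-channel mapping, the linear loss model, and the gain are entirely hypothetical, not the paper's calibration):

```python
import numpy as np

# Hypothetical calibration: each sensing zone along the fingertip waveguide
# attenuates one color channel, and attenuation grows with contact force.
ZONE_CHANNEL = {"tip": 0, "pad": 1, "side": 2}   # R, G, B (assumed mapping)

def decode(rgb_out, rgb_in=np.array([1.0, 1.0, 1.0]), gain=0.2):
    """Infer which zone is pressed, and with what force (N), from the
    per-channel optical loss. Linear toy model: loss = gain * force."""
    loss = 1.0 - rgb_out / rgb_in          # fractional loss per color channel
    ch = int(np.argmax(loss))              # most-attenuated channel wins
    zone = {v: k for k, v in ZONE_CHANNEL.items()}[ch]
    return zone, loss[ch] / gain

# A 10% dip in the green channel maps to a 0.5 N press on the "pad" zone
zone, force = decode(np.array([1.0, 0.9, 1.0]))
```

A real calibration would be nonlinear and would resolve simultaneous contacts, but the core idea, reading contact location out of spectral attenuation ratios, is the same.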
|