1
Bulusu V, Lazar L. Crossmodal associations between naturally occurring tactile and sound textures. Perception 2024;53:219-239. PMID: 38304994. DOI: 10.1177/03010066231224557.
Abstract
This study investigates crossmodal associations between naturally occurring sound textures and tactile textures. Previous research has demonstrated associations between low-level sensory features of sound and touch, as well as higher-level, cognitively mediated associations involving language, emotions, and metaphors. However, stimuli such as textures, which occur in both modalities, have received less attention. We conducted two experiments, a free association task and a two-alternative forced-choice task, using everyday tactile textures and sound textures selected from natural sound categories. The results revealed consistent crossmodal associations between the textures of the two modalities. Participants tended to associate more of the sound textures (e.g., wood shavings and sandpaper) with tactile surfaces rated as harder, rougher, and intermediate on the sticky-slippery scale. While some participants based the auditory-tactile association on sensory features, others made associations based on semantic relationships, co-occurrence in nature, or emotional mediation. Interestingly, the statistical features of the sound textures (mean, variance, kurtosis, power, autocorrelation, and correlation) did not correlate significantly with the crossmodal associations, indicating a higher-level association. This study provides insights into auditory-tactile associations by highlighting the role of sensory and emotional (or cognitive) factors in prompting them.
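The sound-texture statistics tested above (mean, variance, kurtosis, power, and autocorrelation) are straightforward to compute from a signal; the sketch below is a minimal illustration of these features on a toy envelope, not the study's analysis pipeline (function names and the test signal are invented):

```python
import numpy as np

def texture_statistics(envelope: np.ndarray, max_lag: int = 3) -> dict:
    """Marginal and temporal statistics of a 1-D sound-texture envelope."""
    x = envelope - envelope.mean()          # zero-mean copy
    var = x.var()
    return {
        "mean": float(envelope.mean()),
        "variance": float(var),
        # Excess kurtosis: 0 for a Gaussian-distributed signal
        "kurtosis": float((x**4).mean() / var**2 - 3.0),
        "power": float((envelope**2).mean()),
        # Normalised autocorrelation at lags 1..max_lag
        "autocorrelation": [
            float((x[:-k] * x[k:]).mean() / var) for k in range(1, max_lag + 1)
        ],
    }

env = np.abs(np.sin(np.linspace(0.0, 20.0 * np.pi, 1000)))  # toy envelope
stats = texture_statistics(env)
```

Comparing such feature vectors against association strength (e.g., by rank correlation) is the kind of analysis that, in the study, revealed no significant relationship.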
Affiliation(s)
- Leslee Lazar
- Indian Institute of Technology Gandhinagar, India
2
Bourdier A, Abriat A, Jiang T. Impacts of sensory multimodality congruence and familiarity with short use on cosmetic product evaluation. Int J Cosmet Sci 2023;45:592-603. PMID: 37073417. DOI: 10.1111/ics.12863.
Abstract
Cross-modal association between sensory modalities is a natural phenomenon in the perception of our environment. In cosmetic evaluation, touch and smell are the two major sensory modalities involved in whole-product perception. In this study, we investigate whether a specific cosmetic texture is preferentially associated with a specific fragrance, i.e., congruence between texture and fragrance. In addition, we investigate whether one week's use of a fragrance-texture congruent or non-congruent product can influence users' whole-product appreciation and well-being. We conducted a four-test experiment with 29 participants: first in the laboratory, evaluating six fragrances and four textures individually with free description (test 1), the same stimuli with cross-modal descriptors (test 2), and 10 fragrance-texture combined products (test 3); and then at home, evaluating two fragrance-texture combined products, one congruent and one non-congruent (test 4). Results showed that (1) for a given texture type, specific olfactory notes are necessary to produce a congruent cross-modal product pairing; (2) sensory-modality-congruent products produce the highest hedonic response; and (3) real-life use of, or familiarisation with, a product can influence not only the degree of cross-modal congruence but also overall product appreciation.
Affiliation(s)
- Alice Bourdier
- The Smell and Taste Lab, Geneva, Switzerland
- Centre de Recherche en Neurosciences de Lyon CRNL, Burgundy University, Lyon, France
- Anne Abriat
- The Smell and Taste Lab, Geneva, Switzerland
- Tao Jiang
- Centre de Recherche en Neurosciences de Lyon CRNL, Burgundy University, Lyon, France
3
Montoya S, Badde S. Only visible flicker helps flutter: Tactile-visual integration breaks in the absence of visual awareness. Cognition 2023;238:105528. PMID: 37354787. DOI: 10.1016/j.cognition.2023.105528.
Abstract
Combining information from multiple senses enhances our perception of the world. Whether we need to be aware of all stimuli to benefit from multisensory integration, however, is still under investigation. Here, we tested whether tactile frequency perception benefits from the presence of congruent visual flicker even if the flicker is so rapid that it is perceptually fused into a steady light and therefore invisible. Our participants completed a tactile frequency discrimination task given either unisensory tactile or congruent tactile-visual stimulation. Tactile and tactile-visual test frequencies ranged from far below to far above participants' flicker fusion thresholds (determined separately). For frequencies distinctly below their flicker fusion threshold, participants performed significantly better given tactile-visual stimulation than when presented with tactile stimuli alone. Yet, for frequencies above their flicker fusion threshold, participants' tactile frequency perception did not benefit from the presence of congruent but likely fused, and thus invisible, visual flicker. The results matched the predictions of an ideal-observer model in which tactile-visual integration is conditional on awareness of both stimuli. In contrast, it was impossible to reproduce the observed results with a model that assumed tactile-visual integration proceeds irrespective of stimulus awareness. In sum, the benefits of congruent visual stimulation for tactile flutter frequency perception depend on the visibility of the visual flicker, suggesting that multisensory integration requires awareness.
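The two observer models compared above can be sketched in a few lines: inverse-variance (reliability-weighted) cue combination that is gated on visual awareness, versus unconditional integration. This is a toy illustration of the modelling logic under invented noise parameters, not the authors' implementation:

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Reliability-weighted (inverse-variance) combination of two cues."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b), 1.0 / (w_a + w_b)

def aware_observer(tactile_hz: float, visual_hz: float, flicker_visible: bool,
                   tactile_var: float = 4.0, visual_var: float = 1.0):
    """Integration conditional on awareness: above the flicker fusion
    threshold the perceptually fused (invisible) flicker is ignored."""
    if flicker_visible:
        return fuse(tactile_hz, tactile_var, visual_hz, visual_var)
    return tactile_hz, tactile_var  # fall back to the unisensory estimate

# Below the fusion threshold, the combined estimate is more precise...
est_low, var_low = aware_observer(18.0, 20.0, flicker_visible=True)
# ...above it, precision reverts to the tactile-only level.
est_high, var_high = aware_observer(18.0, 20.0, flicker_visible=False)
```

An unconditional-integration model would call `fuse` in both branches, predicting a visual benefit even above the fusion threshold, which is the pattern the data ruled out.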
Affiliation(s)
- Sofia Montoya
- Department of Psychology, Tufts University, 490 Boston Avenue, 02155 Medford, MA, USA
- Stephanie Badde
- Department of Psychology, Tufts University, 490 Boston Avenue, 02155 Medford, MA, USA
4
Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022;16:1010211. PMID: 36330342. PMCID: PMC9622781. DOI: 10.3389/fnins.2022.1010211.
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
5
Sathian K, Lacey S. Cross-modal interactions of the tactile system. Curr Dir Psychol Sci 2022;31:411-418. PMID: 36408466. PMCID: PMC9674209. DOI: 10.1177/09637214221101877.
Abstract
The sensory systems responsible for perceptions of touch, vision, hearing, etc. have traditionally been regarded as mostly separate, only converging at late stages of processing. Contrary to this dogma, recent work has shown that interactions between the senses are robust and abundant. Touch and vision are both commonly used to obtain information about a number of object properties, and share perceptual and neural representations in many domains. Additionally, visuotactile interactions are implicated in the sense of body ownership, as revealed by powerful illusions that can be evoked by manipulating these interactions. Touch and hearing both rely in part on temporal frequency information, leading to a number of audiotactile interactions reflecting a good deal of perceptual and neural overlap. The focus in sensory neuroscience and psychophysics is now on characterizing the multisensory interactions that lead to our panoply of perceptual experiences.
Affiliation(s)
- K. Sathian
- Department of Neurology, Penn State Health Milton S. Hershey Medical Center
- Department of Neural & Behavioral Sciences, Penn State College of Medicine
- Department of Psychology, Penn State College of Liberal Arts
- Simon Lacey
- Department of Neurology, Penn State Health Milton S. Hershey Medical Center
- Department of Neural & Behavioral Sciences, Penn State College of Medicine
6
Sharma D, Ng KKW, Birznieks I, Vickery RM. Auditory clicks elicit equivalent temporal frequency perception to tactile pulses: A cross-modal psychophysical study. Front Neurosci 2022;16:1006185. PMID: 36161171. PMCID: PMC9500524. DOI: 10.3389/fnins.2022.1006185.
Abstract
Both hearing and touch are sensitive to the frequency of mechanical oscillations: sound waves and tactile vibrations, respectively. The mounting evidence of parallels in temporal frequency processing between the two sensory systems led us to directly address the question of perceptual frequency equivalence between touch and hearing using stimuli with simple and more complex temporal features. In a cross-modal psychophysical paradigm, subjects compared the perceived frequency of pulsatile mechanical vibrations to that elicited by pulsatile acoustic (click) trains, and vice versa. Non-invasive pulsatile stimulation designed to excite a fixed population of afferents was used to induce the desired temporal spike trains at frequencies spanning flutter up to vibratory hum (>50 Hz). The cross-modal perceived frequency for regular test pulse trains of either modality closely matched the presented physical stimulus frequency up to 100 Hz. We then tested whether the recently discovered "burst gap" temporal code for frequency, which is shared by the two senses, yields equivalent cross-modal frequency perception. When subjects compared trains comprising pairs of pulses (bursts) in one modality against regular trains in the other, the cross-sensory equivalent perceptual frequency best corresponded to the silent interval between successive bursts in both auditory and tactile test stimuli. These findings suggest that identical acoustic and vibrotactile pulse trains, regardless of pattern, elicit equivalent frequencies, and they imply analogous temporal frequency computation strategies in both modalities. This perceptual correspondence raises the possibility of employing cross-modal comparison as a robust standard to overcome prevailing methodological limitations in psychophysical investigations, and it strongly encourages cross-modal approaches for transmitting sensory information, such as translating pitch into a similar pattern of vibration on the skin.
Affiliation(s)
- Deepak Sharma
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Kevin K. W. Ng
- Center for Social and Affective Neuroscience, Department of Biomedical and Clinical Sciences, Linköping University, Linköping, Sweden
- Ingvars Birznieks
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Bionics and Bio-Robotics, Tyree Foundation Institute of Health Engineering, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Richard M. Vickery
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Bionics and Bio-Robotics, Tyree Foundation Institute of Health Engineering, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
7
The burst gap is a peripheral temporal code for pitch perception that is shared across audition and touch. Sci Rep 2022;12:11014. PMID: 35773321. PMCID: PMC9246943. DOI: 10.1038/s41598-022-15269-5.
Abstract
When tactile afferents were manipulated to fire in periodic bursts of spikes, we discovered that the perceived pitch corresponded to the inter-burst interval (burst gap) in a spike train, rather than the spike rate or burst periodicity as previously thought. Given that tactile frequency mechanisms have many analogies to audition, and indications that temporal frequency channels are linked across the two modalities, we investigated whether there is burst gap temporal encoding in the auditory system. To link this putative neural code to perception, human subjects (n = 13, 6 females) assessed pitch elicited by trains of temporally-structured acoustic pulses in psychophysical experiments. Each pulse was designed to excite a fixed population of cochlear neurons, precluding place of excitation cues, and to elicit desired temporal spike trains in activated afferents. We tested periodicities up to 150 Hz using a variety of burst patterns and found striking deviations from periodicity-predicted pitch. Like the tactile system, the duration of the silent gap between successive bursts of neural activity best predicted perceived pitch, emphasising the role of peripheral temporal coding in shaping pitch. This suggests that temporal patterning of stimulus pulses in cochlear implant users might improve pitch perception.
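Under the burst-gap account described above, perceived pitch tracks the silent interval between successive bursts rather than the burst periodicity or the overall spike rate. A minimal sketch of how the two predictions diverge, with illustrative timing values (the parameter names are ours, not the paper's):

```python
def pitch_predictions(burst_period_s: float, pulses_per_burst: int,
                      intra_burst_interval_s: float) -> dict:
    """Compare periodicity-based and burst-gap-based pitch predictions
    for a train of pulse bursts repeating every burst_period_s seconds."""
    burst_duration = (pulses_per_burst - 1) * intra_burst_interval_s
    gap = burst_period_s - burst_duration   # silent interval between bursts
    return {
        "periodicity_hz": 1.0 / burst_period_s,  # classic rate prediction
        "burst_gap_hz": 1.0 / gap,               # gap-based prediction
    }

# Paired pulses (bursts of 2) repeating at 50 Hz with 5 ms within-burst
# spacing: the gap code predicts a higher pitch than the 50 Hz periodicity.
p = pitch_predictions(burst_period_s=0.020, pulses_per_burst=2,
                      intra_burst_interval_s=0.005)
```

For regular trains (one pulse per burst) the two predictions coincide, which is why burst patterns are needed to tell the codes apart.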
8
Cléry JC, Hori Y, Schaeffer DJ, Gati JS, Pruszynski JA, Everling S. Whole brain mapping of somatosensory responses in awake marmosets investigated with ultra-high-field fMRI. J Neurophysiol 2020;124:1900-1913. PMID: 33112698. DOI: 10.1152/jn.00480.2020.
Abstract
The common marmoset (Callithrix jacchus) is a small-bodied New World primate that is becoming an important model for studying brain function. Despite several studies exploring the somatosensory system of marmosets, all results have come from anesthetized animals using invasive techniques and postmortem analyses. Here, we demonstrate the feasibility of obtaining high-quality, reproducible somatosensory maps in awake marmosets with functional magnetic resonance imaging (fMRI). We acquired fMRI sequences in four animals while they received tactile stimulation (via air puffs) delivered to the face, arm, or leg. We found a topographic body representation with the leg representation in the most medial part, the face representation in the most lateral part, and the arm representation between them within areas 3a, 3b, and 1/2. A similar leg-to-face sequence from caudal to rostral sites was identified in areas S2 and PV. By generating functional connectivity maps from seeds defined in the primary and secondary somatosensory regions, we identified two clusters of tactile representation within the posterior and midcingulate cortex. However, unlike in humans and macaques, no clear somatotopic maps were observed there. At the subcortical level, we found a somatotopic body representation in the thalamus and, for the first time in marmosets, in the putamen. These maps have organizations similar to those previously found in Old World macaque monkeys and humans, suggesting that these subcortical somatotopic organizations were already established before Old World and New World primates diverged. Our results constitute the first whole-brain mapping of somatosensory responses acquired noninvasively in awake marmosets. NEW & NOTEWORTHY: We used somatosensory stimulation combined with fMRI in awake marmosets to reveal the topographic body representation in areas S1 and S2, the thalamus, and the putamen. We showed the existence of a body representation organization within the thalamus and the cingulate cortex by computing functional connectivity maps from seeds defined in S1/S2, using resting-state fMRI data. This noninvasive approach will be essential for chronic studies by guiding invasive recording and manipulation techniques.
Affiliation(s)
- Justine C Cléry
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Yuki Hori
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- David J Schaeffer
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Joseph S Gati
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Department of Medical Biophysics, The University of Western Ontario, London, Ontario, Canada
- J Andrew Pruszynski
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Department of Physiology and Pharmacology, The University of Western Ontario, London, Ontario, Canada
- Stefan Everling
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Department of Physiology and Pharmacology, The University of Western Ontario, London, Ontario, Canada
9
Rahman MS, Barnes KA, Crommett LE, Tommerdahl M, Yau JM. Auditory and tactile frequency representations are co-embedded in modality-defined cortical sensory systems. Neuroimage 2020;215:116837. PMID: 32289461. PMCID: PMC7292761. DOI: 10.1016/j.neuroimage.2020.116837.
Abstract
Sensory information is represented and elaborated in hierarchical cortical systems that are thought to be dedicated to individual sensory modalities. This traditional view of sensory cortex organization has been challenged by recent evidence of multimodal responses in primary and association sensory areas. Although it is indisputable that sensory areas respond to multiple modalities, it remains unclear whether these multimodal responses reflect selective information processing for particular stimulus features. Here, we used fMRI adaptation to identify brain regions that are sensitive to the temporal frequency information contained in auditory, tactile, and audiotactile stimulus sequences. A number of brain regions distributed over the parietal and temporal lobes exhibited frequency-selective temporal response modulation for both auditory and tactile stimulus events, as indexed by repetition suppression effects. A smaller set of regions responded to crossmodal adaptation sequences in a frequency-dependent manner. Despite an extensive overlap of multimodal frequency-selective responses across the parietal and temporal lobes, representational similarity analysis revealed a cortical "regional landscape" that clearly reflected distinct somatosensory and auditory processing systems that converged on modality-invariant areas. These structured relationships between brain regions were also evident in spontaneous signal fluctuation patterns measured at rest. Our results reveal that multimodal processing in human cortex can be feature-specific and that multimodal frequency representations are embedded in the intrinsically hierarchical organization of cortical sensory systems.
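The representational similarity analysis used above boils down to correlating region-wise dissimilarity matrices of response patterns; below is a minimal, self-contained sketch on synthetic data (the region names, array shapes, and noise level are illustrative only, not from the study):

```python
import numpy as np

def rdm(patterns: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between response patterns (rows = conditions, columns = voxels)."""
    return 1.0 - np.corrcoef(patterns)

def rdm_similarity(p: np.ndarray, q: np.ndarray) -> float:
    """Second-order similarity: correlate the upper triangles of two RDMs."""
    iu = np.triu_indices(p.shape[0], k=1)
    return float(np.corrcoef(rdm(p)[iu], rdm(q)[iu])[0, 1])

rng = np.random.default_rng(0)
n_conditions, n_voxels = 6, 50
region_a = rng.standard_normal((n_conditions, n_voxels))
region_b = region_a + 0.1 * rng.standard_normal((n_conditions, n_voxels))
region_c = rng.standard_normal((n_conditions, n_voxels))

sim_ab = rdm_similarity(region_a, region_b)  # shared geometry: larger
sim_ac = rdm_similarity(region_a, region_c)  # unrelated geometry: smaller
```

Clustering regions by such second-order similarities is what yields the kind of "regional landscape" of sensory systems described in the abstract.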
Affiliation(s)
- Md Shoaibur Rahman
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, 77030, USA
- Kelly Anne Barnes
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, 77030, USA
- Department of Behavioral and Social Sciences, San Jacinto College - South, 13735 Beamer Rd, S13.269, Houston, TX, 77089, USA
- Lexi E Crommett
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, 77030, USA
- Mark Tommerdahl
- Department of Biomedical Engineering, University of North Carolina at Chapel Hill, CB No. 7575, Chapel Hill, NC, 27599, USA
- Jeffrey M Yau
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, 77030, USA