1
Uemura M, Katagiri Y, Imai E, Kawahara Y, Otani Y, Ichinose T, Kondo K, Kowa H. Dorsal Anterior Cingulate Cortex Coordinates Contextual Mental Imagery for Single-Beat Manipulation during Rhythmic Sensorimotor Synchronization. Brain Sci 2024; 14:757. [PMID: 39199452] [PMCID: PMC11352649] [DOI: 10.3390/brainsci14080757]
Abstract
Flexible pulse-by-pulse regulation of sensorimotor synchronization is crucial for voluntarily showing rhythmic behaviors synchronously with external cueing; however, the underpinning neurophysiological mechanisms remain unclear. We hypothesized that the dorsal anterior cingulate cortex (dACC) plays a key role by coordinating both proactive and reactive motor outcomes based on contextual mental imagery. To test our hypothesis, a missing-oddball task in finger-tapping paradigms was conducted in 33 healthy young volunteers. The dynamic properties of the dACC were evaluated by event-related deep-brain activity (ER-DBA), supported by event-related potential (ERP) analysis and behavioral evaluation based on signal detection theory. We found that ER-DBA activation/deactivation reflected a strategic choice of motor control modality in accordance with mental imagery. Reverse ERP traces, as omission responses, confirmed that the imagery was contextual. We found that mental imagery was updated only by environmental changes via perceptual evidence and response-based abductive reasoning. Moreover, stable on-pulse tapping was achievable by maintaining proactive control while creating an imagery of syncopated rhythms from simple beat trains, whereas accuracy was degraded with frequent erroneous tapping for missing pulses. We conclude that the dACC voluntarily regulates rhythmic sensorimotor synchronization by utilizing contextual mental imagery based on experience and by creating novel rhythms.
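The behavioral evaluation based on signal detection theory mentioned in this abstract reduces, in its simplest form, to computing a sensitivity index from hit and false-alarm rates. A minimal sketch, with invented counts rather than data from the study:

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate).
    A small log-linear correction keeps the z-transform finite
    when a rate would otherwise be exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for one participant detecting missing pulses:
print(d_prime(hits=42, misses=8, false_alarms=5, correct_rejections=45))
```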
Affiliation(s)
- Maho Uemura
- Department of Rehabilitation Science, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan
- School of Music, Mukogawa Women’s University, Nishinomiya 663-8558, Japan
- Yoshitada Katagiri
- Department of Bioengineering, School of Engineering, The University of Tokyo, Tokyo 113-8655, Japan
- Emiko Imai
- Department of Biophysics, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan
- Yasuhiro Kawahara
- Department of Human Life and Health Sciences, Division of Arts and Sciences, The Open University of Japan, Chiba 261-8586, Japan
- Yoshitaka Otani
- Department of Rehabilitation Science, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan
- Faculty of Rehabilitation, Kobe International University, Kobe 658-0032, Japan
- Tomoko Ichinose
- School of Music, Mukogawa Women’s University, Nishinomiya 663-8558, Japan
- Hisatomo Kowa
- Department of Rehabilitation Science, Kobe University Graduate School of Health Sciences, Kobe 654-0142, Japan
2
Fu X, Smulders FTY, Riecke L. Touch Helps Hearing: Evidence From Continuous Audio-Tactile Stimulation. Ear Hear 2024:00003446-990000000-00318. [PMID: 39046790] [DOI: 10.1097/aud.0000000000001566]
Abstract
OBJECTIVES Identifying target sounds in challenging environments is crucial in daily life, and this ability can be enhanced by nonauditory stimuli, for example, through lip-reading in an ongoing conversation. However, how tactile stimuli affect auditory processing is still relatively unclear. Recent studies have shown that brief tactile stimuli can reliably facilitate auditory perception, while studies using longer-lasting audio-tactile stimulation yielded conflicting results. This study aimed to investigate the impact of ongoing pulsating tactile stimulation on basic auditory processing. DESIGN In experiment 1, the electroencephalogram (EEG) was recorded while 24 participants performed a loudness-discrimination task on a 4-Hz modulated tone-in-noise and received either in-phase, anti-phase, or no 4-Hz electrotactile stimulation above the median nerve. In experiment 2, another 24 participants were presented with the same tactile stimulation as before, but performed a tone-in-noise detection task while their selective auditory attention was manipulated. RESULTS We found that in-phase tactile stimulation enhanced EEG responses to the tone, whereas anti-phase tactile stimulation suppressed these responses. No corresponding tactile effects on loudness-discrimination performance were observed in experiment 1. Using a yes/no paradigm in experiment 2, we found that in-phase tactile stimulation, but not anti-phase tactile stimulation, improved detection thresholds. Selective attention also improved thresholds but did not modulate the observed benefit from in-phase tactile stimulation. CONCLUSIONS Our study highlights that ongoing in-phase tactile input can enhance basic auditory processing as reflected in scalp EEG and detection thresholds. This might have implications for the development of hearing enhancement technologies and interventions.
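As a rough illustration of the stimulation logic (not the authors' stimulus code; the carrier frequency, duration, and sampling rate below are arbitrary assumptions), a 4-Hz amplitude-modulated tone and a tactile envelope at the same rate can be generated either phase-aligned or shifted by half a modulation cycle:

```python
import numpy as np

fs = 44100          # sampling rate (Hz); assumed
dur = 2.0           # stimulus duration (s); assumed
f_mod = 4.0         # modulation rate shared across modalities
f_car = 1000.0      # tone carrier frequency; arbitrary assumption

t = np.arange(int(fs * dur)) / fs

# 4-Hz amplitude-modulated tone (full modulation depth)
audio_env = 0.5 * (1 + np.sin(2 * np.pi * f_mod * t))
tone = audio_env * np.sin(2 * np.pi * f_car * t)

# Tactile envelope at the same rate: phase-aligned, or shifted by pi
tactile_in_phase = audio_env
tactile_anti_phase = 0.5 * (1 + np.sin(2 * np.pi * f_mod * t + np.pi))
```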
Affiliation(s)
- Xueying Fu
- Faculty of Psychology and Neuroscience, Department of Cognitive Neuroscience, Maastricht University, Maastricht, the Netherlands
3
Fu X, Riecke L. Effects of continuous tactile stimulation on auditory-evoked cortical responses depend on the audio-tactile phase. Neuroimage 2023; 274:120140. [PMID: 37120042] [DOI: 10.1016/j.neuroimage.2023.120140]
Abstract
Auditory perception can benefit from stimuli in non-auditory sensory modalities, as for example in lip-reading. Compared with such visual influences, tactile influences are still poorly understood. It has been shown that single tactile pulses can enhance the perception of auditory stimuli depending on their relative timing, but whether and how such brief auditory enhancements can be stretched in time with more sustained, phase-specific periodic tactile stimulation is still unclear. To address this question, we presented tactile stimulation that fluctuated coherently and continuously at 4 Hz with an auditory noise (either in-phase or anti-phase) and assessed its effect on the cortical processing and perception of an auditory signal embedded in that noise. Scalp-electroencephalography recordings revealed an enhancing effect of in-phase tactile stimulation on cortical responses phase-locked to the noise and a suppressive effect of anti-phase tactile stimulation on responses evoked by the auditory signal. Although these effects appeared to follow well-known principles of multisensory integration of discrete audio-tactile events, they were not accompanied by corresponding effects on behavioral measures of auditory signal perception. Our results indicate that continuous periodic tactile stimulation can enhance cortical processing of acoustically-induced fluctuations and mask cortical responses to an ongoing auditory signal. They further suggest that such sustained cortical effects can be insufficient for inducing sustained bottom-up auditory benefits.
Affiliation(s)
- Xueying Fu
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
- Lars Riecke
- Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
4
Bernard C, Kronland-Martinet R, Fery M, Ystad S, Thoret E. Tactile perception of auditory roughness. JASA Express Lett 2022; 2:123201. [PMID: 36586960] [DOI: 10.1121/10.0016603]
Abstract
Auditory roughness resulting from fast temporal beatings is often studied by summing two pure tones with close frequencies. Interestingly, the tactile counterpart of auditory roughness can be provided through touch with vibrotactile actuators. However, whether auditory roughness could also be perceived through touch and whether it exhibits similar characteristics are unclear. Here, auditory roughness perception and its tactile counterpart were evaluated using pairs of pure tone stimuli. Results revealed similar roughness curves in both modalities, suggesting similar sensory processing. This study attests to the relevance of such a paradigm for investigating auditory and tactile roughness in a multisensory fashion.
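The two-tone paradigm described here amounts to summing two sinusoids with a small frequency difference, which produces beating at the difference frequency; the same waveform can then drive a vibrotactile actuator. A sketch with illustrative frequencies only (a 1000/1070-Hz pair beats at 70 Hz):

```python
import numpy as np

fs = 44100                  # sampling rate (Hz); assumed
t = np.arange(fs) / fs      # 1-s stimulus

f1, f2 = 1000.0, 1070.0     # illustrative pair; beat rate = f2 - f1 = 70 Hz
pair = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
pair /= np.abs(pair).max()  # normalize before sending to audio or actuator
```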
Affiliation(s)
- Corentin Bernard
- Aix-Marseille University, CNRS, UMR7061 PRISM, Marseille 13009, France
- Madeline Fery
- Aix-Marseille University, CNRS, UMR7061 PRISM, Marseille 13009, France
- Sølvi Ystad
- Aix-Marseille University, CNRS, UMR7061 PRISM, Marseille 13009, France
- Etienne Thoret
- Aix-Marseille University, CNRS, UMR7061 PRISM, Marseille 13009, France
5
Aker SC, Innes-Brown H, Faulkner KF, Vatti M, Marozeau J. Effect of audio-tactile congruence on vibrotactile music enhancement. J Acoust Soc Am 2022; 152:3396. [PMID: 36586853] [DOI: 10.1121/10.0016444]
Abstract
Music listening experiences can be enhanced with tactile vibrations. However, it is not known which parameters of the tactile vibration must be congruent with the music to enhance it. Devices that aim to enhance music with tactile vibrations often require coding an acoustic signal into a congruent vibrotactile signal. Therefore, understanding which of these audio-tactile congruences are important is crucial. Participants were presented with a simple sine wave melody through supra-aural headphones and a haptic actuator held between the thumb and forefinger. Incongruent versions of the stimuli were made by randomizing physical parameters of the tactile stimulus independently of the auditory stimulus. Participants were instructed to rate the congruent stimuli against the incongruent versions based on preference. It was found that making the intensity of the tactile stimulus incongruent with the intensity of the auditory stimulus, as well as misaligning the two modalities in time, had the largest negative effect on ratings for the melody used. Future vibrotactile music enhancement devices can use time alignment and intensity congruence as a baseline coding strategy against which improved strategies can be tested.
Affiliation(s)
- Scott C Aker
- Music and Cochlear Implant Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 2800, Denmark
- Jeremy Marozeau
- Music and Cochlear Implant Lab, Department of Health Technology, Technical University of Denmark, Kongens Lyngby, 2800, Denmark
6
Audiotactile integration in the Pacinian corpuscle's maximum sensitivity frequency range. Atten Percept Psychophys 2020; 82:3250-3257. [PMID: 32342343] [DOI: 10.3758/s13414-020-02025-y]
Abstract
It has been shown that vibrotactile stimuli elicit sound perception either on their own or by enhancing otherwise inaudible sounds. For taking advantage of this phenomenon in the design of vibrotactile interfaces, it is important to identify its properties with respect to the excitation frequency. The aim of this work is to further substantiate previous research results that indicate a prevalence of this phenomenon at a specific range of frequencies (200-390 Hz), which roughly pertains to the Pacinian corpuscle's maximum sensitivity range. Thirteen young adults participated in the study, which included comparisons between sound-plus-vibration and sound-only signals. Masking background noise and no-touch control experiments were included to further support the outcome. The results validate the hypothesis that vibrotactile excitation at the index fingertip can enhance otherwise inaudible tones in this specific range of frequencies.
7
Won HI, Altinsoy ME. Effect of Auditory Feedback on Tactile Intensity Perception in a Touchscreen Application. IEEE Trans Haptics 2020; 13:343-353. [PMID: 31634144] [DOI: 10.1109/toh.2019.2947553]
Abstract
This article presents the effect of auditory feedback on tactile intensity perception, which may be of interest to haptic or audiotactile interaction engineers. The experimental setup consisted of a touchscreen, an electrodynamic shaker, and a closed-back headphone for a subject to interact with the touchscreen and to feel audiotactile feedback. In the experiment, participants were asked to judge perceived tactile intensity, using the magnitude estimation method, in the absence and presence of simultaneous auditory feedback. All data collected from the subjects were analyzed statistically, and then the effect of auditory feedback was investigated focusing on the following aspects: whether the presence of auditory feedback changes perceived tactile intensity, whether the frequency component of auditory feedback affects tactile intensity perception, and whether the coincidence of tactile and auditory frequencies influences tactile intensity perception. In addition, changes in Stevens's exponent were analyzed to discuss how tactile intensity perception varies due to the auditory feedback. Finally, an equal intensity contour, in the domain of sensation level and frequency of tactile stimulation, was drawn. It can be applied to adjust the level of tactile stimuli for haptic feedback designers to provide a constant perceived tactile intensity considering the presence of auditory feedback.
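Magnitude-estimation data of this kind are conventionally summarized by Stevens's power law, psi = k * I^n, with the exponent n estimated by linear regression in log-log coordinates. A generic fitting sketch with made-up numbers, not data from the article:

```python
import numpy as np

# Hypothetical stimulus intensities (arbitrary units) and mean
# magnitude estimates; illustrative values only.
intensity = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
estimate = np.array([2.1, 3.0, 4.4, 6.2, 9.1])

# Stevens's law psi = k * I**n is linear in log-log coordinates:
# log(psi) = log(k) + n * log(I)
n, log_k = np.polyfit(np.log(intensity), np.log(estimate), 1)
print(f"exponent n = {n:.2f}, scale k = {np.exp(log_k):.2f}")
```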
8
Pérez-Bellido A, Anne Barnes K, Crommett LE, Yau JM. Auditory Frequency Representations in Human Somatosensory Cortex. Cereb Cortex 2018; 28:3908-3921. [PMID: 29045579] [DOI: 10.1093/cercor/bhx255]
Abstract
Recent studies have challenged the traditional notion of modality-dedicated cortical systems by showing that audition and touch evoke responses in the same sensory brain regions. While much of this work has focused on somatosensory responses in auditory regions, fewer studies have investigated sound responses and representations in somatosensory regions. In this functional magnetic resonance imaging (fMRI) study, we measured BOLD signal changes in participants performing an auditory frequency discrimination task and characterized activation patterns related to stimulus frequency using both univariate and multivariate analysis approaches. Outside of bilateral temporal lobe regions, we observed robust and frequency-specific responses to auditory stimulation in classically defined somatosensory areas. Moreover, using representational similarity analysis to define the relationships between multi-voxel activation patterns for all sound pairs, we found clear similarity patterns for auditory responses in the parietal lobe that correlated significantly with perceptual similarity judgments. Our results demonstrate that auditory frequency representations can be distributed over brain regions traditionally considered to be dedicated to somatosensation. The broad distribution of auditory and tactile responses over parietal and temporal regions reveals a number of candidate brain areas that could support general temporal frequency processing and mediate the extensive and robust perceptual interactions between audition and touch.
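Representational similarity analysis of the kind described compares the dissimilarity structure of multi-voxel activation patterns with perceptual dissimilarity judgments. A toy sketch under assumed dimensions (8 sound frequencies, 50 voxels), with random arrays standing in for real data:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
patterns = rng.normal(size=(8, 50))   # 8 conditions x 50 voxels (toy data)
judgments = rng.normal(size=(8, 2))   # placeholder perceptual coordinates

# Neural RDM: 1 - Pearson correlation between condition patterns
neural_rdm = pdist(patterns, metric="correlation")
# Perceptual RDM: pairwise distances between judgment coordinates
perceptual_rdm = pdist(judgments)

# RSA statistic: rank correlation between the two condensed RDMs
rho, p = spearmanr(neural_rdm, perceptual_rdm)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```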
Affiliation(s)
- Alexis Pérez-Bellido
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
- Kelly Anne Barnes
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
- Lexi E Crommett
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
- Jeffrey M Yau
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
9
Riecke L, Snipes S, van Bree S, Kaas A, Hausfeld L. Audio-tactile enhancement of cortical speech-envelope tracking. Neuroimage 2019; 202:116134. [DOI: 10.1016/j.neuroimage.2019.116134]
10
Abstract
Speech research during recent years has moved progressively away from its traditional focus on audition toward a more multisensory approach. In addition to audition and vision, many somatosenses including proprioception, pressure, vibration and aerotactile sensation are all highly relevant modalities for experiencing and/or conveying speech. In this article, we review both long-standing cross-modal effects stemming from decades of audiovisual speech research as well as new findings related to somatosensory effects. Cross-modal effects in speech perception to date are found to be constrained by temporal congruence and signal relevance, but appear to be unconstrained by spatial congruence. Far from taking place in a one-, two- or even three-dimensional space, the literature reveals that speech occupies a highly multidimensional sensory space. We argue that future research in cross-modal effects should expand to consider each of these modalities both separately and in combination with other modalities in speech.
Affiliation(s)
- Megan Keough
- Interdisciplinary Speech Research Lab, Department of Linguistics, University of British Columbia, Vancouver, British Columbia V6T 1Z4, Canada
- Donald Derrick
- New Zealand Institute of Brain and Behaviour, University of Canterbury, Christchurch 8140, New Zealand
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Bryan Gick
- Interdisciplinary Speech Research Lab, Department of Linguistics, University of British Columbia, Vancouver, British Columbia V6T 1Z4, Canada
- Haskins Laboratories, Yale University, New Haven, CT 06511, USA
11
Ghai S, Schmitz G, Hwang TH, Effenberg AO. Auditory Proprioceptive Integration: Effects of Real-Time Kinematic Auditory Feedback on Knee Proprioception. Front Neurosci 2018; 12:142. [PMID: 29568259] [PMCID: PMC5852112] [DOI: 10.3389/fnins.2018.00142]
Abstract
The purpose of the study was to assess the influence of real-time auditory feedback on knee proprioception. Thirty healthy participants were randomly allocated to a control group (n = 15) and experimental group I (n = 15). The participants performed an active knee-repositioning task using their dominant leg, with/without additional real-time auditory feedback where the frequency was mapped in a convergent manner to two different target angles (40 and 75°). Statistical analysis revealed significant enhancement in knee repositioning accuracy for the constant and absolute error with real-time auditory feedback, within and across the groups. Besides this convergent condition, we established a second, divergent condition. Here, a step-wise transposition of frequency was performed to explore whether a systematic tuning between auditory and proprioceptive repositioning exists. No significant effects were identified in this divergent auditory feedback condition. An additional experimental group II (n = 20) was further included. Here, we investigated the influence of a larger magnitude and directional change of step-wise transposition of the frequency. In a first step, the results confirmed the findings of experiment I. Moreover, significant effects on knee auditory-proprioceptive repositioning were evident when divergent auditory feedback was applied. During the step-wise transposition, participants showed systematic modulation of knee movements in the opposite direction of the transposition. We confirm that knee repositioning accuracy can be enhanced with concurrent application of real-time auditory feedback and that knee repositioning can be modulated in a goal-directed manner with step-wise transposition of frequency. Clinical implications are discussed with respect to joint position sense in rehabilitation settings.
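As a hypothetical sketch of how a kinematic parameter can be sonified in real time (the mapping function, reference frequency, and slope below are assumptions for illustration, not the authors' implementation), knee angle can be mapped to tone frequency so that pitch converges on a reference value at the target angle:

```python
def angle_to_frequency(angle_deg, target_deg, f_target=440.0, slope=4.0):
    """Map knee angle (degrees) to tone frequency (Hz) so the pitch
    reaches f_target exactly at the target angle; hypothetical mapping."""
    return f_target + slope * (angle_deg - target_deg)

# Feedback while approaching a 40-degree target angle:
for angle in (20.0, 30.0, 38.0, 40.0):
    print(f"{angle:.0f} deg -> {angle_to_frequency(angle, 40.0):.0f} Hz")
```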
Affiliation(s)
- Shashank Ghai
- Institute of Sports Science, Leibniz University Hannover, Hannover, Germany
12
Timora J, Budd T. Steady-State EEG and Psychophysical Measures of Multisensory Integration to Cross-Modally Synchronous and Asynchronous Acoustic and Vibrotactile Amplitude Modulation Rate. Multisens Res 2018; 31:391-418. [DOI: 10.1163/22134808-00002549]
Abstract
According to the temporal principle of multisensory integration, cross-modal synchronisation of stimulus onset facilitates multisensory integration. This is typically observed as a greater response to multisensory stimulation relative to the sum of the constituent unisensory responses (i.e., superadditivity). The aim of the present study was to examine whether the temporal principle extends to the cross-modal synchrony of amplitude-modulation (AM) rate. It is well established that psychophysical sensitivity to AM stimulation is strongly influenced by AM rate, where the optimum rate differs according to sensory modality. This rate-dependent sensitivity is also apparent from EEG steady-state response (SSR) activity, which becomes entrained to the stimulation rate and is thought to reflect neural processing of the temporal characteristics of AM stimulation. In this study we investigated whether cross-modal congruence of AM rate reveals both psychophysical and EEG evidence of enhanced multisensory integration. To achieve this, EEG SSR and psychophysical sensitivity to simultaneous acoustic and/or vibrotactile AM stimuli were measured at cross-modally congruent and incongruent AM rates. While the results provided no evidence of superadditive multisensory SSR activity or psychophysical sensitivity, the complex pattern of results did reveal a consistent correspondence between SSR activity and psychophysical sensitivity to AM stimulation. This indicates that entrained EEG activity may provide a direct measure of cortical activity underlying multisensory integration. Consistent with the temporal principle of multisensory integration, increased vibrotactile SSR responses and psychophysical sensitivity were found for cross-modally congruent relative to incongruent AM rates. However, no corresponding increase in auditory SSR or psychophysical sensitivity was observed for cross-modally congruent AM rates. This complex pattern of results can be understood in terms of the likely influence of the principle of inverse effectiveness, where the temporal principle of multisensory integration was only evident in the context of reduced perceptual sensitivity for the vibrotactile but not the auditory modality.
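The entrained SSR is typically quantified as the spectral amplitude at the stimulation rate, and superadditivity is then assessed by comparing the multisensory amplitude against the sum of the unisensory ones. A minimal sketch on a synthetic epoch (sampling rate, epoch length, and component amplitude are assumed values):

```python
import numpy as np

fs = 250      # EEG sampling rate (Hz); assumed
dur = 4.0     # epoch length (s), giving 0.25-Hz frequency resolution
f_am = 21.0   # amplitude-modulation rate of interest

t = np.arange(int(fs * dur)) / fs
# Synthetic epoch: an entrained 21-Hz component buried in noise
eeg = 0.8 * np.sin(2 * np.pi * f_am * t) + np.random.randn(t.size)

amps = np.abs(np.fft.rfft(eeg)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
ssr_amp = amps[np.argmin(np.abs(freqs - f_am))]

# A superadditivity test would compare this value for the multisensory
# condition with the sum of the two unisensory SSR amplitudes.
print(f"SSR amplitude at {f_am:.0f} Hz: {ssr_amp:.2f}")
```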
Affiliation(s)
- Justin R. Timora
- Brain Imaging Lab, School of Psychology, University of Newcastle, Ourimbah, NSW, Australia
- Timothy W. Budd
- Brain Imaging Lab, School of Psychology, University of Newcastle, Ourimbah, NSW, Australia
13
Fontana F, Papetti S, Järveläinen H, Avanzini F. Detection of keyboard vibrations and effects on perceived piano quality. J Acoust Soc Am 2017; 142:2953. [PMID: 29195444] [DOI: 10.1121/1.5009659]
Abstract
Two experiments were conducted on an upright and a grand piano, each either producing string vibrations or remaining silent after the initial keypress, while pianists listened to the feedback from a synthesizer through insulating headphones. In a quality experiment, participants unaware of the silent mode were asked to play freely and then rate the instrument according to a set of attributes and general preference. Participants preferred the vibrating over the silent setup, and preference ratings were associated with auditory attributes of richness and naturalness in the low and middle ranges. Another experiment on the same setup measured the detection of vibrations at the keyboard while pianists played notes and chords of varying dynamics and duration. Sensitivity to string vibrations was highest in the lowest register and gradually decreased up to note D5. After the percussive transient, the tactile stimuli exhibited spectral peaks of acceleration whose perceptibility was demonstrated by tests conducted in active touch conditions. The two experiments confirm that piano performers perceive vibratory cues of strings, mediated by spectral and spatial summations occurring in the Pacinian system in their fingertips, and suggest that such cues play a role in the evaluation of the quality of the musical instrument.
Affiliation(s)
- Federico Fontana
- Department of Computer Science, Mathematics and Physics, Università di Udine, 206 via delle Scienze, Udine 33100, Italy
- Stefano Papetti
- Institute for Computer Music and Sound Technology, Zürcher Hochschule der Künste, 96 Pfingstweidstrasse, Zurich 8048, Switzerland
- Hanna Järveläinen
- Institute for Computer Music and Sound Technology, Zürcher Hochschule der Künste, 96 Pfingstweidstrasse, Zurich 8048, Switzerland
- Federico Avanzini
- Department of Information Engineering, Università di Padova, 6/A Via Gradenigo, Padova 35121, Italy
14
Roy C, Lagarde J, Dotov D, Dalla Bella S. Walking to a multisensory beat. Brain Cogn 2017; 113:172-183. [PMID: 28257971] [DOI: 10.1016/j.bandc.2017.02.002]
Abstract
Living in a complex and multisensory environment demands constant interaction between perception and action. In everyday life it is common to efficiently combine simultaneous signals from different modalities. There is evidence of a multisensory benefit in a variety of laboratory tasks (temporal judgement, reaction time tasks). It is less clear whether this effect extends to ecological tasks, such as walking. Furthermore, benefits of multimodal stimulation are linked to temporal properties such as the temporal window of integration and temporal recalibration. These properties have been examined in tasks involving single, non-repeating stimulus presentations. Here we investigate the same temporal properties in the context of a rhythmic task, namely audio-tactile stimulation during walking. The effect of audio-tactile rhythmic cues on gait variability and the ability to synchronize to the cues was studied in young adults. Participants walked with rhythmic cues presented at different stimulus-onset asynchronies. We observed a multisensory benefit by comparing audio-tactile to unimodal stimulation. Moreover, both the temporal window of integration and temporal recalibration mediated the response to multimodal stimulation. In sum, rhythmic behaviours obey the same principles as temporal discrimination and detection behaviours and thus can also benefit from multimodal stimulation.
Affiliation(s)
- Charlotte Roy
- EuroMov Laboratory, Montpellier University, 700 Avenue du Pic Saint Loup, 34090 Montpellier, France
- Julien Lagarde
- EuroMov Laboratory, Montpellier University, 700 Avenue du Pic Saint Loup, 34090 Montpellier, France
- Dobromir Dotov
- Instituto de Neurobiología, Juriquilla, Universidad Nacional Autónoma de México, Mexico
- Simone Dalla Bella
- EuroMov Laboratory, Montpellier University, 700 Avenue du Pic Saint Loup, 34090 Montpellier, France; Institut Universitaire de France, Paris, France; International Laboratory for Brain, Music, and Sound Research (BRAMS), Montreal, Canada; Department of Cognitive Psychology, WSFiZ, Warsaw, Poland
15
Crommett LE, Pérez-Bellido A, Yau JM. Auditory adaptation improves tactile frequency perception. J Neurophysiol 2017; 117:1352-1362. [PMID: 28077668] [PMCID: PMC5350269] [DOI: 10.1152/jn.00783.2016]
Abstract
Our ability to process temporal frequency information by touch underlies our capacity to perceive and discriminate surface textures. Auditory signals, which also provide extensive temporal frequency information, can systematically alter the perception of vibrations on the hand. How auditory signals shape tactile processing is unclear; perceptual interactions between contemporaneous sounds and vibrations are consistent with multiple neural mechanisms. Here we used a crossmodal adaptation paradigm, which separated auditory and tactile stimulation in time, to test the hypothesis that tactile frequency perception depends on neural circuits that also process auditory frequency. We reasoned that auditory adaptation effects would transfer to touch only if signals from both senses converge on common representations. We found that auditory adaptation can improve tactile frequency discrimination thresholds. This occurred only when adaptor and test frequencies overlapped. In contrast, auditory adaptation did not influence tactile intensity judgments. Thus auditory adaptation enhances touch in a frequency- and feature-specific manner. A simple network model in which tactile frequency information is decoded from sensory neurons that are susceptible to auditory adaptation recapitulates these behavioral results. Our results imply that the neural circuits supporting tactile frequency perception also process auditory signals. This finding is consistent with the notion of supramodal operators performing canonical operations, like temporal frequency processing, regardless of input modality.NEW & NOTEWORTHY Auditory signals can influence the tactile perception of temporal frequency. Multiple neural mechanisms could account for the perceptual interactions between contemporaneous auditory and tactile signals. Using a crossmodal adaptation paradigm, we found that auditory adaptation causes frequency- and feature-specific improvements in tactile perception. This crossmodal transfer of aftereffects between audition and touch implies that tactile frequency perception relies on neural circuits that also process auditory frequency.
Affiliation(s)
- Lexi E Crommett
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas
- Jeffrey M Yau
- Department of Neuroscience, Baylor College of Medicine, Houston, Texas
16
Roy C, Dalla Bella S, Lagarde J. To bridge or not to bridge the multisensory time gap: bimanual coordination to sound and touch with temporal lags. Exp Brain Res 2016; 235:135-151. [DOI: 10.1007/s00221-016-4776-4]
17
Pannunzi M, Pérez-Bellido A, Pereda-Baños A, López-Moliner J, Deco G, Soto-Faraco S. Deconstructing multisensory enhancement in detection. J Neurophysiol 2015; 113:1800-1818. [PMID: 25520431] [DOI: 10.1152/jn.00341.2014]
Abstract
The mechanisms responsible for the integration of sensory information from different modalities have become a topic of intense interest in psychophysics and neuroscience. Many authors now claim that early, sensory-based cross-modal convergence improves performance in detection tasks. An important strand of supporting evidence for this claim is based on statistical models such as the Pythagorean model or the probabilistic summation model. These models establish statistical benchmarks representing the best predicted performance under the assumption that there are no interactions between the two sensory paths. Following this logic, when observed detection performances surpass the predictions of these models, it is often inferred that such improvement indicates cross-modal convergence. We present a theoretical analysis scrutinizing some of these models and the statistical criteria most frequently used to infer early cross-modal interactions during detection tasks. Our current analysis shows how some common misinterpretations of these models lead to their inadequate use and, in turn, to contradictory results and misleading conclusions. To further illustrate the latter point, we introduce a model that accounts for detection performances in multimodal detection tasks but for which surpassing of the Pythagorean or probabilistic summation benchmark can be explained without resorting to early cross-modal interactions. Finally, we report three experiments that put our theoretical interpretation to the test and further propose how to adequately measure multimodal interactions in audiotactile detection tasks.
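The two statistical benchmarks under scrutiny are commonly written as d'_AT = sqrt(d'_A^2 + d'_T^2) (the Pythagorean model) and P_AT = 1 - (1 - P_A)(1 - P_T) (probability summation). A sketch of both predictions under assumed unisensory performances (values are illustrative, not from the experiments):

```python
import numpy as np

def pythagorean_dprime(d_a, d_t):
    """Bimodal d' predicted when the two channels contribute
    independently, with no early cross-modal interaction."""
    return np.hypot(d_a, d_t)

def probability_summation(p_a, p_t):
    """Bimodal detection probability predicted when detection succeeds
    whenever either independent channel detects the signal."""
    return 1 - (1 - p_a) * (1 - p_t)

print(pythagorean_dprime(1.0, 1.0))      # -> 1.41...
print(probability_summation(0.6, 0.6))   # -> 0.84
```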
Affiliation(s)
- Joan López-Moliner
- Universitat de Barcelona, Barcelona, Spain; Institute for Brain, Cognition and Behaviour (IR3C), Barcelona, Spain
- Gustavo Deco
- Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Salvador Soto-Faraco
- Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
18
Tonetto LM, Klanovicz CP, Spence C. Modifying action sounds influences people's emotional responses and bodily sensations. Iperception 2014; 5:153-163. [PMID: 25469221] [PMCID: PMC4249985] [DOI: 10.1068/i0653]
Abstract
We report an experiment designed to investigate the effect of modifying the sound of high-heeled shoes on women's self-reported valence, arousal, and dominance scores, as well as any changes to a variety of measures of bodily sensation. We also assessed whether self-evaluated personality traits and the enjoyment associated with wearing heels were correlated with these effects. Forty-eight women walked down a “virtual runway” while listening to four interaction sounds (leather- and polypropylene-soled high-heeled shoes contacting ceramic flooring or carpet). Analysis of the questionnaires that the participants completed indicated that the type of sonic interaction impacted valence, arousal, and dominance scores, as well as the evaluated bodily sensations. There were also correlations between these scores and both self-evaluated personality traits and the reported enjoyment associated with wearing high heels. These results demonstrate the effect that the sound of a woman's physical interaction with the environment can have, especially when her contact with the ground while walking makes a louder sound. More generally, these results demonstrate that the manipulation of product extrinsic sounds can modify people's evaluation of their emotional outcomes (valence, arousal, and dominance), as well as their bodily sensations.
Affiliation(s)
- Leandro Miletto Tonetto
- Universidade do Vale do Rio dos Sinos, Rua Luis Manoel Gonzaga, 744, Porto Alegre, CEP 90480-200, Brazil
- Cristiano Porto Klanovicz
- Universidade do Vale do Rio dos Sinos, Rua Luis Manoel Gonzaga, 744, Porto Alegre, CEP 90480-200, Brazil
- Charles Spence
- Crossmodal Research Laboratory, Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford, OX1 3UD, England
19
Abstract
Spatial ventriloquism refers to the phenomenon that a visual stimulus such as a flash can attract the perceived location of a spatially discordant but temporally synchronous sound. An analogous example of mutual attraction between audition and vision has been found in the temporal domain, where temporal aspects of a visual event, such as its onset, frequency, or duration, can be biased by a slightly asynchronous sound. In this review, we examine various manifestations of spatial and temporal attraction between the senses (both direct effects and aftereffects), and we discuss important constraints on the occurrence of these effects. Factors that potentially modulate ventriloquism-such as attention, synesthetic correspondence, and other cognitive factors-are described. We trace theories and models of spatial and temporal ventriloquism, from the traditional unity assumption and modality appropriateness hypothesis to more recent Bayesian and neural network approaches. Finally, we summarize recent evidence probing the underlying neural mechanisms of spatial and temporal ventriloquism.
20
Desloge JG, Reed CM, Braida LD, Perez ZD, Delhorne LA, Villabona TJ. Auditory and tactile gap discrimination by observers with normal and impaired hearing. J Acoust Soc Am 2014; 135:838-850. [PMID: 25234892] [PMCID: PMC3985970] [DOI: 10.1121/1.4861246]
Abstract
Temporal processing ability for the senses of hearing and touch was examined through the measurement of gap-duration discrimination thresholds (GDDTs) employing the same low-frequency sinusoidal stimuli in both modalities. GDDTs were measured in three groups of observers (normal-hearing, hearing-impaired, and normal-hearing with simulated hearing loss) covering an age range of 21-69 yr. GDDTs for a baseline gap of 6 ms were measured for four different combinations of 100-ms leading and trailing markers (250-250, 250-400, 400-250, and 400-400 Hz). Auditory measurements were obtained for monaural presentation over headphones and tactile measurements were obtained using sinusoidal vibrations presented to the left middle finger. The auditory GDDTs of the hearing-impaired listeners, which were larger than those of the normal-hearing observers, were well-reproduced in the listeners with simulated loss. The magnitude of the GDDT was generally independent of modality and showed effects of age in both modalities. The use of different-frequency compared to same-frequency markers led to a greater deterioration in auditory GDDTs compared to tactile GDDTs and may reflect differences in bandwidth properties between the two sensory systems.
Affiliation(s)
- Joseph G Desloge
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Charlotte M Reed
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Louis D Braida
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Zachary D Perez
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Lorraine A Delhorne
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
- Timothy J Villabona
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
21
Dissociation of psychophysical and EEG steady-state response measures of cross-modal temporal correspondence for amplitude modulated acoustic and vibrotactile stimulation. Int J Psychophysiol 2013; 89:433-443. [PMID: 23770083] [DOI: 10.1016/j.ijpsycho.2013.06.006]
Abstract
Research examining multisensory integration suggests that the correspondence of stimulus characteristics across modalities (cross-modal correspondence) can have a dramatic influence on both neurophysiological and perceptual responses to multimodal stimulation. The current study extends prior research by examining the cross-modal correspondence of amplitude modulation rate for simultaneous acoustic and vibrotactile stimulation using EEG and perceptual measures of sensitivity to amplitude modulation. To achieve this, psychophysical thresholds and steady-state responses (SSRs) were measured for acoustic and vibrotactile amplitude modulated (AM) stimulation for 21 and 40 Hz AM rates as a function of the cross-modal correspondence. The study design included three primary conditions to determine whether the changes in the SSR and psychophysical thresholds were due to the cross-modal temporal correspondence of amplitude modulated stimuli: NONE (AM in one modality only), SAME (the same AM rate for each modality) and DIFF (different AM rates for each modality). The results of the psychophysical analysis showed that AM detection thresholds for the simultaneous AM conditions (i.e., SAME and DIFF) were significantly higher (i.e., lower sensitivity) than AM detection thresholds for the stimulation of a single modality (i.e., NONE). SSR results showed significant effects of SAME and DIFF conditions on SSR activity. The different pattern of results for perceptual and SSR measures of cross-modal correspondence of AM rate indicates a dissociation between entrained cortical activity (i.e., SSR) and perception.
22
Multisensory representation of frequency across audition and touch: high density electrical mapping reveals early sensory-perceptual coupling. J Neurosci 2012; 32:15338-15344. [PMID: 23115172] [DOI: 10.1523/jneurosci.1796-12.2012]
Abstract
The frequency of environmental vibrations is sampled by two of the major sensory systems, audition and touch, notwithstanding that these signals are transduced through very different physical media and entirely separate sensory epithelia. Psychophysical studies have shown that manipulating frequency in audition or touch can have a significant cross-sensory impact on perceived frequency in the other sensory system, pointing to intimate links between these senses during computation of frequency. In this regard, the frequency of a vibratory event can be thought of as a multisensory perceptual construct. In turn, electrophysiological studies point to temporally early multisensory interactions that occur in hierarchically early sensory regions where convergent inputs from the auditory and somatosensory systems are to be found. A key question pertains to the level of processing at which the multisensory integration of featural information, such as frequency, occurs. Do the sensory systems calculate frequency independently before this information is combined, or is this feature calculated in an integrated fashion during preattentive sensory processing? The well characterized mismatch negativity, an electrophysiological response that indexes preattentive detection of a change within the context of a regular pattern of stimulation, served as our dependent measure. High-density electrophysiological recordings were made in humans while they were presented with separate blocks of somatosensory, auditory, and audio-somatosensory "standards" and "deviants," where the deviant differed in frequency. Multisensory effects were identified beginning at ∼200 ms, with the multisensory mismatch negativity (MMN) significantly different from the sum of the unisensory MMNs. This provides compelling evidence for preattentive coupling between the somatosensory and auditory channels in the cortical representation of frequency.
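The additivity test described here compares the multisensory MMN (the deviant-minus-standard difference wave) against the sum of the two unisensory MMNs. A schematic sketch with random arrays standing in for epoched ERP data (trial and sample counts are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples = 100, 300   # assumed epoch dimensions (toy data)

def mmn(deviant_epochs, standard_epochs):
    """MMN as the deviant-minus-standard difference of trial-averaged ERPs."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Placeholder epochs for each condition: (trials x samples)
aud_std, aud_dev = rng.normal(size=(2, n_trials, n_samples))
tac_std, tac_dev = rng.normal(size=(2, n_trials, n_samples))
at_std, at_dev = rng.normal(size=(2, n_trials, n_samples))

# Multisensory MMN vs. summed unisensory MMNs; a reliable nonzero
# difference indicates a cross-modal interaction.
interaction = mmn(at_dev, at_std) - (mmn(aud_dev, aud_std) + mmn(tac_dev, tac_std))
```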
23
Abstract
Traditionally, only speech communicates emotions via mobile phone. However, in daily communication the sense of touch mediates emotional information during conversation. The present aim was to study whether tactile stimulation affects emotional ratings of speech when measured with scales of pleasantness, arousal, approachability, and dominance. In Experiment 1, participants rated speech-only and speech-tactile stimuli. The tactile signal mimicked the amplitude changes of the speech. In Experiment 2, the aim was to study whether the way the tactile signal was produced affected the ratings. The tactile signal either mimicked the amplitude changes of the speech sample in question or the amplitude changes of another speech sample. Also, concurrent static vibration was included. The results showed that the speech-tactile stimuli were rated as more arousing and dominant than the speech-only stimuli. The speech-only stimuli were rated as more approachable than the speech-tactile stimuli, but only in Experiment 1. Variations in tactile stimulation also affected the ratings. When the tactile stimulation was static vibration, the speech-tactile stimuli were rated as more arousing than when the concurrent tactile stimulation mimicked speech samples. The results suggest that tactile stimulation offers new ways of modulating and enriching the interpretation of speech.
24
Yau JM, Weber AI, Bensmaia SJ. Separate mechanisms for audio-tactile pitch and loudness interactions. Front Psychol 2010; 1:160. [PMID: 21887147] [PMCID: PMC3157934] [DOI: 10.3389/fpsyg.2010.00160]
Abstract
A major goal in perceptual neuroscience is to understand how signals from different sensory modalities are combined to produce stable and coherent representations. We previously investigated interactions between audition and touch, motivated by the fact that both modalities are sensitive to environmental oscillations. In our earlier study, we characterized the effect of auditory distractors on tactile frequency and intensity perception. Here, we describe the converse experiments examining the effect of tactile distractors on auditory processing. Because the two studies employ the same psychophysical paradigm, we combined their results for a comprehensive view of how auditory and tactile signals interact and how these interactions depend on the perceptual task. Together, our results show that temporal frequency representations are perceptually linked regardless of the attended modality. In contrast, audio-tactile loudness interactions depend on the attended modality: Tactile distractors influence judgments of auditory intensity, but judgments of tactile intensity are impervious to auditory distraction. Lastly, we show that audio-tactile loudness interactions depend critically on stimulus timing, while pitch interactions do not. These results reveal that auditory and tactile inputs are combined differently depending on the perceptual task. That distinct rules govern the integration of auditory and tactile signals in pitch and loudness perception implies that the two are mediated by separate neural mechanisms. These findings underscore the complexity and specificity of multisensory interactions.
Affiliation(s)
- Jeffrey M Yau
- Department of Neurology, Division of Cognitive Neuroscience, Johns Hopkins University School of Medicine, Baltimore, MD, USA
25
Wilson EC, Reed CM, Braida LD. Integration of auditory and vibrotactile stimuli: effects of frequency. J Acoust Soc Am 2010; 127:3044-3059. [PMID: 21117754] [PMCID: PMC2882664] [DOI: 10.1121/1.3365318]
Abstract
Perceptual integration of vibrotactile and auditory sinusoidal tone pulses was studied in detection experiments as a function of stimulation frequency. Vibrotactile stimuli were delivered through a single channel vibrator to the left middle fingertip. Auditory stimuli were presented diotically through headphones in a background of 50 dB sound pressure level broadband noise. Detection performance for combined auditory-tactile presentations was measured using stimulus levels that yielded 63% to 77% correct unimodal performance. In Experiment 1, the vibrotactile stimulus was 250 Hz and the auditory stimulus varied between 125 and 2000 Hz. In Experiment 2, the auditory stimulus was 250 Hz and the tactile stimulus varied between 50 and 400 Hz. In Experiment 3, the auditory and tactile stimuli were always equal in frequency and ranged from 50 to 400 Hz. The highest rates of detection for the combined-modality stimulus were obtained when stimulating frequencies in the two modalities were equal or closely spaced (and within the Pacinian range). Combined-modality detection for closely spaced frequencies was generally consistent with an algebraic sum model of perceptual integration; wider-frequency spacings were generally better fit by a Pythagorean sum model. Thus, perceptual integration of auditory and tactile stimuli at near-threshold levels appears to depend both on absolute frequency and relative frequency of stimulation within each modality.
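In d' terms, the two integration models compared here predict d'_AT = d'_A + d'_T (algebraic sum) versus d'_AT = sqrt(d'_A^2 + d'_T^2) (Pythagorean sum). A quick numeric comparison under assumed unimodal performance; the ~70%-correct values and the 2AFC conversion are illustrative assumptions, not the paper's procedure:

```python
from scipy.stats import norm

# Unimodal detection near 70% correct; in a 2AFC task d' = sqrt(2) * z(Pc).
pc_a, pc_t = 0.70, 0.70
d_a = 2 ** 0.5 * norm.ppf(pc_a)
d_t = 2 ** 0.5 * norm.ppf(pc_t)

models = {
    "algebraic sum": d_a + d_t,
    "Pythagorean sum": (d_a ** 2 + d_t ** 2) ** 0.5,
}
for name, d in models.items():
    pc = norm.cdf(d / 2 ** 0.5)   # predicted combined percent correct
    print(f"{name}: d' = {d:.2f}, predicted Pc = {pc:.2f}")
```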
Affiliation(s)
- E Courtenay Wilson
- Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139
26
Wilson EC, Braida LD, Reed CM. Perceptual interactions in the loudness of combined auditory and vibrotactile stimuli. J Acoust Soc Am 2010; 127:3038-3043. [PMID: 21117753] [PMCID: PMC2882663] [DOI: 10.1121/1.3377116]
Abstract
The loudness of auditory (A), tactile (T), and auditory-tactile (A+T) stimuli was measured at supra-threshold levels. Auditory stimuli were pure tones presented binaurally through headphones; tactile stimuli were sinusoids delivered through a single-channel vibrator to the left middle fingertip. All stimuli were presented together with a broadband auditory noise. The A and T stimuli were presented at levels that were matched in loudness to that of the 200-Hz auditory tone at 25 dB sensation level. The 200-Hz auditory tone was then matched in loudness to various combinations of auditory and tactile stimuli (A+T), and purely auditory stimuli (A+A). The results indicate that the matched intensity of the 200-Hz auditory tone is less when the A+T and A+A stimuli are close together in frequency than when they are separated by an octave or more. This suggests that A+T integration may operate in a manner similar to that found in auditory critical band studies, further supporting a strong frequency relationship between the auditory and somatosensory systems.
Affiliation(s)
- E Courtenay Wilson
- Research Laboratory of Electronics, Massachusetts Institute of Technology, and Harvard-MIT Division of Health Sciences and Technology, Speech and Hearing Bioscience and Technology Program, Cambridge, Massachusetts 02139, USA