1. Kang H, Auksztulewicz R, Chan CH, Cappotto D, Rajendran VG, Schnupp JWH. Cross-modal implicit learning of random time patterns. Hear Res 2023; 438:108857. PMID: 37639922. DOI: 10.1016/j.heares.2023.108857.
Abstract
Perception is sensitive to statistical regularities in the environment, including temporal characteristics of sensory inputs. Interestingly, implicit learning of temporal patterns in one modality can also improve their processing in another modality. However, it is unclear how cross-modal learning transfer affects neural responses to sensory stimuli. Here, we recorded neural activity of human volunteers using electroencephalography (EEG), while participants were exposed to brief sequences of randomly timed auditory or visual pulses. Some trials consisted of a repetition of the temporal pattern within the sequence, and subjects were tasked with detecting these trials. Unknown to the participants, some trials reappeared throughout the experiment across both modalities (Transfer) or only within a modality (Control), enabling implicit learning in one modality and its transfer. Using a novel method of analysis of single-trial EEG responses, we showed that learning temporal structures within and across modalities is reflected in neural learning curves. These putative neural correlates of learning transfer were similar both when temporal information learned in audition was transferred to visual stimuli and vice versa. The modality-specific mechanisms for learning of temporal information and general mechanisms which mediate learning transfer across modalities had distinct physiological signatures: temporal learning within modalities relied on modality-specific brain regions while learning transfer affected beta-band activity in frontal regions.
Affiliation(s)
- HiJee Kang: Department of Neuroscience, City University of Hong Kong, Hong Kong S.A.R.; Department of Biomedical Engineering, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Ryszard Auksztulewicz: Department of Neuroscience, City University of Hong Kong, Hong Kong S.A.R.; Center for Cognitive Neuroscience Berlin, Free University Berlin, Berlin, Germany
- Chi Hong Chan: Department of Neuroscience, City University of Hong Kong, Hong Kong S.A.R.
- Drew Cappotto: Department of Neuroscience, City University of Hong Kong, Hong Kong S.A.R.; UCL Ear Institute, University College London, London, United Kingdom
- Vani G Rajendran: Department of Neuroscience, City University of Hong Kong, Hong Kong S.A.R.; Department of Cognitive Neuroscience, Instituto de Neurobiología, Universidad Nacional Autónoma de México, Querétaro, Mexico
- Jan W H Schnupp: Department of Neuroscience, City University of Hong Kong, Hong Kong S.A.R.
2. Li Y, Ye B, Bao Y. The same phase creates a unique visual rhythm unifying moving elements in time. Psych J 2023; 12:500-506. PMID: 36916772. DOI: 10.1002/pchj.636.
Abstract
Attention can be selectively tuned to particular features, spatial locations, or objects. The deployment of attention can be guided by features such as color or orientation. What might such guiding features be for visual stimuli under dynamic rhythmic conditions? We asked specifically which parameters attract attention when perceiving a visual rhythm. We used a visual search paradigm in which a dynamic search display consisted of vertically "bouncing balls" with regular rhythms. The search target was defined by a unique visual rhythm (i.e., with either a shorter or longer period) among rhythmic distractors sharing an identical period. We systematically modulated the amplitudes and phases of the distractor balls. The results showed that the crucial factor was phase, not amplitude. If the phase is violated, the target suddenly "pops out" as an "oddball," yielding an efficient parallel search. The findings indicate the essential role of phase, in conjunction with amplitude and period, in visual rhythm perception. Furthermore, moving objects containing a higher frequency component proved more salient.
Affiliation(s)
- Yao Li: School of Psychological and Cognitive Sciences, Peking University, Beijing, China; Peking-Tsinghua Center for Life Sciences, Peking University, Beijing, China
- Biyi Ye: School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Yan Bao: School of Psychological and Cognitive Sciences, Peking University, Beijing, China; Institute of Medical Psychology, Ludwig Maximilian University Munich, Munich, Germany; Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
3. Breitinger E, Dundon NM, Pokorny L, Wunram HL, Roessner V, Bender S. Contingent negative variation to tactile stimuli - differences in anticipatory and preparatory processes between participants with and without blindness. Cereb Cortex 2023; 33:7582-7594. PMID: 36977633. DOI: 10.1093/cercor/bhad062.
Abstract
People who are blind demonstrate remarkable abilities within the spared senses and compensatory enhancement of cognitive skills, underscored by substantial plastic reorganization in relevant neural areas. However, little is known about whether people with blindness form top-down models of the world on short timescales more efficiently to guide goal-oriented behavior. This electroencephalography study investigates this hypothesis at the neurophysiological level, focusing on contingent negative variation (CNV) as a marker of anticipatory and preparatory processes prior to expected events. In total, 20 participants with blindness and 27 sighted participants completed a classic CNV task and a memory CNV task, both containing tactile stimuli to exploit the expertise of the former group. Although the reaction times in the classic CNV task did not differ between groups, participants who are blind reached higher performance rates in the memory task. This superior performance co-occurred with a distinct neurophysiological profile, relative to controls: greater late CNV amplitudes over central areas, suggesting enhanced stimulus expectancy and motor preparation prior to key events. Controls, in contrast, recruited more frontal sites, consistent with inefficient sensory-aligned control. We conclude that in more demanding cognitive contexts exploiting the spared senses, people with blindness efficiently generate task-relevant internal models to facilitate behavior.
Affiliation(s)
- Eva Breitinger: Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
- Neil M Dundon: Department of Child and Adolescent Psychiatry, Psychotherapy, and Psychosomatics, University of Freiburg, Germany; Department of Psychological and Brain Sciences, University of California, Santa Barbara, CA
- Lena Pokorny: Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
- Heidrun L Wunram: Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany; Department of Pediatrics, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
- Veit Roessner: Department of Child and Adolescent Psychiatry and Psychotherapy, Technische Universität Dresden, Faculty of Medicine, University Hospital C. G. Carus, Germany
- Stephan Bender: Department of Child and Adolescent Psychiatry, Psychosomatics, and Psychotherapy, University of Cologne, Faculty of Medicine and University Hospital Cologne, Germany
4. Feng Z, Zhu S, Duan J, Lu Y, Li L. Cross-modality effect in implicit learning of temporal sequence. Curr Psychol 2023. DOI: 10.1007/s12144-022-04228-y.
5. The Role of Auditory and Visual Components in Reading Training: No Additional Effect of Synchronized Visual Cue in a Rhythm-Based Intervention for Dyslexia. Appl Sci (Basel) 2022. DOI: 10.3390/app12073360.
Abstract
Based on the transfer effects of music training on the phonological and reading abilities of children with dyslexia, a computerized rhythmic intervention—the Rhythmic Reading Training (RRT)—was developed, in which reading exercises are combined with a rhythmic synchronization task. This rehabilitation program was previously tested in multiple controlled clinical trials, which confirmed its effectiveness in improving the reading skills of children and adolescents with dyslexia. To assess the specific contribution of the visual component of the training, namely the presence of a visual cue supporting rhythmic synchronization, a controlled experimental study was conducted. Fifty-eight students with dyslexia aged 8 to 13 years were assigned to three conditions: (a) RRT auditory and visual condition, in which a visual cue was synchronized with the rhythmic stimulation; (b) RRT auditory-only condition, in which the visual cue was excluded; (c) no intervention. Comparisons of the participants' performance before, after, and 3 months after the end of the intervention period revealed significant immediate and long-term effects of both RRT conditions on reading, rapid naming, phonological, rhythmic, and attentional abilities. No significant differences were found between the visual and auditory conditions, indicating no additional contribution of the visual component to the improvements induced by the RRT. Clinical Trial ID: NCT04995991.
6. Holmes E, Parr T, Griffiths TD, Friston KJ. Active inference, selective attention, and the cocktail party problem. Neurosci Biobehav Rev 2021; 131:1288-1304. PMID: 34687699. DOI: 10.1016/j.neubiorev.2021.09.038.
Abstract
In this paper, we introduce a new generative model for an active inference account of preparatory and selective attention, in the context of a classic 'cocktail party' paradigm. In this setup, pairs of words are presented simultaneously to the left and right ears and an instructive spatial cue directs attention to the left or right. We use this generative model to test competing hypotheses about the way that human listeners direct preparatory and selective attention. We show that assigning low precision to words at attended, relative to unattended, locations can explain why a listener reports words from a competing sentence. Under this model, temporal changes in sensory precision were not needed to account for faster reaction times with longer cue-target intervals, but were necessary to explain ramping effects on event-related potentials (ERPs), resembling the contingent negative variation (CNV), during the preparatory interval. These simulations reveal that different processes are likely to underlie the improvement in reaction times and the ramping of ERPs that are associated with spatial cueing.
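The precision-weighting mechanism invoked here can be illustrated with a minimal, generic sketch (synthetic numbers, not the paper's generative model): observations from channels assigned low precision contribute little to the combined estimate, so their content is easily overridden.

```python
def precision_weighted_estimate(observations, precisions):
    """Combine noisy channel estimates, weighting each by its assigned
    precision (inverse variance); low-precision channels barely count."""
    weighted = sum(o * p for o, p in zip(observations, precisions))
    return weighted / sum(precisions)

# Two channels report conflicting evidence (coded -1 vs +1). Assigning
# high precision (9.0) to the first makes it dominate the estimate:
estimate = precision_weighted_estimate([-1.0, +1.0], [9.0, 1.0])
print(estimate)  # -0.8
```

In the paper's simulations this weighting sits inside a full generative model over prior and sensory beliefs; the sketch only conveys why a channel assigned low precision loses the competition for report.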
Affiliation(s)
- Emma Holmes: Department of Speech Hearing and Phonetic Sciences, UCL, London, WC1N 1PF, UK; Wellcome Centre for Human Neuroimaging, UCL, London, WC1N 3AR, UK
- Thomas Parr: Wellcome Centre for Human Neuroimaging, UCL, London, WC1N 3AR, UK
- Timothy D Griffiths: Wellcome Centre for Human Neuroimaging, UCL, London, WC1N 3AR, UK; Biosciences Institute, Newcastle University, Newcastle upon Tyne, NE2 4HH, UK
- Karl J Friston: Wellcome Centre for Human Neuroimaging, UCL, London, WC1N 3AR, UK
7. The influence of auditory rhythms on the speed of inferred motion. Atten Percept Psychophys 2021; 84:2360-2383. PMID: 34435321. DOI: 10.3758/s13414-021-02364-4.
Abstract
The present research explored the influence of isochronous auditory rhythms on the timing of movement-related prediction in two experiments. In both experiments, participants observed a moving disc that was visible for a predetermined period before disappearing behind a small, medium, or large occluded area for the remainder of its movement. In Experiment 1, the disc was visible for 1 s. During this period, participants were exposed to either a fast or slow auditory rhythm, or they heard nothing. They were instructed to press a key to indicate when they believed the moving disc had reached a specified location on the other side of the occluded area. The procedure measured the (signed) error in participants' estimate of the time it would take for a moving object to contact a stationary one. The principal results of Experiment 1 were main effects of the rate of the auditory rhythm and of the size of the occlusion on participants' judgments. In Experiment 2, the period of visibility was varied with size of the occlusion area to keep the total movement time constant for all three levels of occlusion. The results replicated the main effect of rhythm found in Experiment 1 and showed a small, significant interaction, but indicated no main effect of occlusion size. Overall, the results indicate that exposure to fast isochronous auditory rhythms during an interval of inferred motion can influence the imagined rate of such motion and suggest a possible role of an internal rhythmicity in the maintenance of temporally accurate dynamic mental representations.
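The prediction-motion logic of this task reduces to simple kinematics; the sketch below uses arbitrary illustrative values, not the study's actual display parameters:

```python
def predicted_contact_time(visible_time, visible_distance, occluded_distance):
    """Assuming constant speed, the time to reach the contact point equals
    the occluded distance divided by the speed inferred while the disc
    was visible."""
    speed = visible_distance / visible_time  # display units per second
    return occluded_distance / speed         # seconds after disappearance

# A disc covering 10 units during its 1-s visible phase, then entering a
# 25-unit occluded area, should reach the far side 2.5 s later:
print(predicted_contact_time(1.0, 10.0, 25.0))  # 2.5
```

The signed error measured in the experiments corresponds to the deviation of participants' key presses from this veridical arrival time.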
8. The interplay between multisensory integration and perceptual decision making. Neuroimage 2020; 222:116970. PMID: 32454204. DOI: 10.1016/j.neuroimage.2020.116970.
Abstract
Facing perceptual uncertainty, the brain combines information from different senses to make optimal perceptual decisions and to guide behavior. However, decision making has been investigated mostly in unimodal contexts. Thus, how the brain integrates multisensory information during decision making is still unclear. Two opposing, but not mutually exclusive, scenarios are plausible: either the brain thoroughly combines the signals from different modalities before starting to build a supramodal decision, or unimodal signals are integrated during decision formation. To answer this question, we devised a paradigm mimicking naturalistic situations where human participants were exposed to continuous cacophonous audiovisual inputs containing an unpredictable signal cue in one or two modalities and had to perform a signal detection task or a cue categorization task. First, model-based analyses of behavioral data indicated that multisensory integration takes place alongside perceptual decision making. Next, using supervised machine learning on concurrently recorded EEG, we identified neural signatures of two processing stages: sensory encoding and decision formation. Generalization analyses across experimental conditions and time revealed that multisensory cues were processed faster during both stages. We further established that acceleration of neural dynamics during sensory encoding and decision formation was directly linked to multisensory integration. Our results were consistent across both signal detection and categorization tasks. Taken together, the results revealed a continuous dynamic interplay between multisensory integration and decision making processes (mixed scenario), with integration of multimodal information taking place both during sensory encoding as well as decision formation.
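The "generalization across time" analysis mentioned here is commonly implemented by training a decoder at each time point and testing it at every other. Below is a toy sketch on synthetic data, using a nearest-class-mean rule rather than the study's supervised learners, and, for brevity, no cross-validation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 40, 8, 20
labels = np.repeat([0, 1], n_trials // 2)

# Synthetic "EEG": a class-dependent signal appears from time index 10 on.
X = rng.normal(size=(n_trials, n_channels, n_times))
X[labels == 1, :, 10:] += 1.0

def decode(train_t, test_t):
    """Train a nearest-class-mean decoder at train_t, test it at test_t."""
    means = [X[labels == c, :, train_t].mean(axis=0) for c in (0, 1)]
    d0 = np.linalg.norm(X[:, :, test_t] - means[0], axis=1)
    d1 = np.linalg.norm(X[:, :, test_t] - means[1], axis=1)
    pred = (d1 < d0).astype(int)
    return (pred == labels).mean()

# Generalization matrix: rows = training time, columns = testing time.
gen = np.array([[decode(tr, te) for te in range(n_times)]
                for tr in range(n_times)])
```

In the resulting matrix, above-chance accuracy in off-diagonal cells indicates that a neural code learned at one latency generalizes to another, which is how shifts in processing speed across conditions can be read out.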
9. Wilsch A, Mercier MR, Obleser J, Schroeder CE, Haegens S. Spatial Attention and Temporal Expectation Exert Differential Effects on Visual and Auditory Discrimination. J Cogn Neurosci 2020; 32:1562-1576. PMID: 32319865. DOI: 10.1162/jocn_a_01567.
Abstract
Anticipation of an impending stimulus shapes the state of the sensory systems, optimizing neural and behavioral responses. Here, we studied the role of brain oscillations in mediating spatial and temporal anticipations. Because spatial attention and temporal expectation are often associated with visual and auditory processing, respectively, we directly contrasted the visual and auditory modalities and asked whether these anticipatory mechanisms are similar in both domains. We recorded the magnetoencephalogram in healthy human participants performing an auditory and visual target discrimination task, in which cross-modal cues provided both temporal and spatial information with regard to upcoming stimulus presentation. Motivated by prior findings, we were specifically interested in delta (1-3 Hz) and alpha (8-13 Hz) band oscillatory state in anticipation of target presentation and their impact on task performance. Our findings support the view that spatial attention has a stronger effect in the visual domain, whereas temporal expectation effects are more prominent in the auditory domain. For the spatial attention manipulation, we found a typical pattern of alpha lateralization in the visual system, which correlated with response speed. Providing a rhythmic temporal cue led to increased postcue synchronization of low-frequency rhythms, although this effect was more broadband in nature, suggesting a general phase reset rather than frequency-specific neural entrainment. In addition, we observed delta-band synchronization with a frontal topography, which correlated with performance, especially in the auditory task. Combined, these findings suggest that spatial and temporal anticipations operate via a top-down modulation of the power and phase of low-frequency oscillations, respectively.
Affiliation(s)
- Manuel R Mercier: University of Toulouse Paul Sabatier; Aix Marseille University, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
- Jonas Obleser: University of Lübeck; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Charles E Schroeder: Columbia University College of Physicians and Surgeons; Nathan Kline Institute, Orangeburg, NY
- Saskia Haegens: Columbia University College of Physicians and Surgeons; Radboud University Nijmegen
10. Kang H, Lancelin D, Pressnitzer D. Memory for Random Time Patterns in Audition, Touch, and Vision. Neuroscience 2018; 389:118-132. PMID: 29577997. DOI: 10.1016/j.neuroscience.2018.03.017.
Abstract
Perception deals with temporal sequences of events, like series of phonemes for audition, dynamic changes in pressure for touch textures, or moving objects for vision. Memory processes are thus needed to make sense of the temporal patterning of sensory information. Recently, we have shown that auditory temporal patterns could be learned rapidly and incidentally with repeated exposure [Kang et al., 2017]. Here, we tested whether rapid incidental learning of temporal patterns was specific to audition, or if it was a more general property of sensory systems. We used the same behavioral task in three modalities: audition, touch, and vision, for stimuli having identical temporal statistics. Participants were presented with sequences of acoustic pulses for audition, motion pulses to the fingertips for touch, or light pulses for vision. Pulses were randomly and irregularly spaced, with all inter-pulse intervals in the sub-second range and all constrained to be longer than the temporal acuity in any modality. This led to pulse sequences with an average inter-pulse interval of 166 ms, a minimum inter-pulse interval of 60 ms, and a total duration of 1.2 s. Results showed that, if a random temporal pattern re-occurred at random times during an experimental block, it was rapidly learned, whatever the sensory modality. Moreover, patterns first learned in the auditory modality displayed transfer of learning to either touch or vision. This suggests that sensory systems may be exquisitely tuned to incidentally learn re-occurring temporal patterns, with possible cross-talk between the senses.
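Timing statistics like those quoted above (mean inter-pulse interval around 166 ms, minimum 60 ms, total duration 1.2 s) can be reproduced with a simple rejection-sampling sketch. This is an illustrative generator, not the authors' stimulus code, and the pulse count of 8 is an assumption inferred from those figures rather than stated in the abstract:

```python
import random

def random_pulse_times(n_pulses=8, duration=1.2, min_ipi=0.06, seed=None):
    """Draw pulse onsets uniformly within `duration` seconds, resampling
    until every inter-pulse interval is at least `min_ipi` seconds."""
    rng = random.Random(seed)
    while True:
        onsets = sorted(rng.uniform(0.0, duration) for _ in range(n_pulses))
        ipis = [b - a for a, b in zip(onsets, onsets[1:])]
        if all(ipi >= min_ipi for ipi in ipis):
            return onsets, ipis

onsets, ipis = random_pulse_times(seed=42)
# All intervals are sub-second, and none is shorter than 60 ms.
```

A re-occurring "pattern" trial then simply replays a stored `onsets` list, while fresh trials call the generator again.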
Affiliation(s)
- HiJee Kang: Laboratoire des Systèmes Perceptifs, Département d'études cognitives, École Normale Supérieure, PSL Research University, CNRS, 29 rue d'Ulm, 75005 Paris, France
- Denis Lancelin: Laboratoire des Systèmes Perceptifs, Département d'études cognitives, École Normale Supérieure, PSL Research University, CNRS, 29 rue d'Ulm, 75005 Paris, France
- Daniel Pressnitzer: Laboratoire des Systèmes Perceptifs, Département d'études cognitives, École Normale Supérieure, PSL Research University, CNRS, 29 rue d'Ulm, 75005 Paris, France
11. Benau EM, DeLoretta LC, Moelter ST. The time is "right:" Electrophysiology reveals right parietal electrode dominance in time perception. Brain Cogn 2018; 123:92-102. PMID: 29550507. DOI: 10.1016/j.bandc.2018.03.008.
Abstract
In the present study, healthy undergraduates were asked to identify whether a visual stimulus appeared on screen for the same duration as a memorized target (2 s) while event-related potentials (ERPs) were recorded. Trials consisted of very short (1.25 s), short (1.6 s), target (2 s), long (2.5 s), or very long (3.125 s) durations, and a yes or no response was required on each trial. We examined behavioral responses as signal detection (d') and response bias via a Generalized Accuracy Coefficient (GAC). We examined the mean amplitude, as well as the change in amplitude, of the initial Contingent Negative Variation (iCNV), the overall CNV (oCNV), and the P350 (a P300-like component that follows stimulus extinction) in paired, lateralized posterior electrodes. Negative GAC scores revealed a bias toward identifying shorter trials, rather than longer ones, as the target. The slope and amplitudes of the iCNV and oCNV were consistently greater at right parietal electrodes. Also at right parietal electrodes, the iCNV correlated with d' scores, while greater P350 amplitudes in the short condition correlated with more negative GAC scores. The results indicate right-hemisphere dominance in temporal processing for durations exceeding 1 s. The P350 warrants further study.
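For readers unfamiliar with the sensitivity measure used here, d' is computed from hit and false-alarm rates via the inverse normal CDF. This is the standard signal-detection formula, not the authors' analysis code:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# A participant responding "target" on 84% of target trials but 16% of
# non-target trials separates the two distributions by roughly 2 s.d.:
print(round(d_prime(0.84, 0.16), 2))  # 1.99
```

Response bias measures such as the GAC are computed separately; d' alone captures discriminability independent of the tendency to say "target".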
Affiliation(s)
- Erik M Benau: Department of Behavioral and Social Sciences, University of the Sciences, Philadelphia, PA, USA
- Laura C DeLoretta: Department of Behavioral and Social Sciences, University of the Sciences, Philadelphia, PA, USA
- Stephen T Moelter: Department of Behavioral and Social Sciences, University of the Sciences, Philadelphia, PA, USA
12. Peripheral hearing loss reduces the ability of children to direct selective attention during multi-talker listening. Hear Res 2017; 350:160-172. PMID: 28505526. DOI: 10.1016/j.heares.2017.05.005.
Abstract
Restoring normal hearing requires knowledge of how peripheral and central auditory processes are affected by hearing loss. Previous research has focussed primarily on peripheral changes following sensorineural hearing loss, whereas consequences for central auditory processing have received less attention. We examined the ability of hearing-impaired children to direct auditory attention to a voice of interest (based on the talker's spatial location or gender) in the presence of a common form of background noise: the voices of competing talkers (i.e. during multi-talker, or "Cocktail Party" listening). We measured brain activity using electro-encephalography (EEG) when children prepared to direct attention to the spatial location or gender of an upcoming target talker who spoke in a mixture of three talkers. Compared to normally-hearing children, hearing-impaired children showed significantly less evidence of preparatory brain activity when required to direct spatial attention. This finding is consistent with the idea that hearing-impaired children have a reduced ability to prepare spatial attention for an upcoming talker. Moreover, preparatory brain activity was not restored when hearing-impaired children listened with their acoustic hearing aids. An implication of these findings is that steps to improve auditory attention alongside acoustic hearing aids may be required to improve the ability of hearing-impaired children to understand speech in the presence of competing talkers.
13. Silva S, Castro SL. Moving Stimuli Facilitate Synchronization But Not Temporal Perception. Front Psychol 2016; 7:1798. PMID: 27909419. PMCID: PMC5112270. DOI: 10.3389/fpsyg.2016.01798.
Abstract
Recent studies have shown that a moving visual stimulus (e.g., a bouncing ball) facilitates synchronization compared to a static stimulus (e.g., a flashing light), and that it can even be as effective as an auditory beep. We asked a group of participants to perform different tasks with four stimulus types: beeps, siren-like sounds, visual flashes (static) and bouncing balls. First, participants performed synchronization with isochronous sequences (stimulus-guided synchronization), followed by a continuation phase in which the stimulus was internally generated (imagery-guided synchronization). Then they performed a perception task, in which they judged whether the final part of a temporal sequence was compatible with the previous beat structure (stimulus-guided perception). Similar to synchronization, an imagery-guided variant was added, in which sequences contained a gap in between (imagery-guided perception). Balls outperformed flashes and matched beeps (powerful ball effect) in stimulus-guided synchronization but not in perception (stimulus- or imagery-guided). In imagery-guided synchronization, performance accuracy decreased for beeps and balls, but not for flashes and sirens. Our findings suggest that the advantages of moving visual stimuli over static ones are grounded in action rather than perception, and they support the hypothesis that the sensorimotor coupling mechanisms for auditory (beeps) and moving visual stimuli (bouncing balls) overlap.
Affiliation(s)
- Susana Silva: Neurocognition and Language Research Group, Center for Psychology at University of Porto, Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
- São Luís Castro: Neurocognition and Language Research Group, Center for Psychology at University of Porto, Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
14. Araneda R, Renier L, Ebner-Karestinos D, Dricot L, De Volder AG. Hearing, feeling or seeing a beat recruits a supramodal network in the auditory dorsal stream. Eur J Neurosci 2016; 45:1439-1450. PMID: 27471102. DOI: 10.1111/ejn.13349.
Abstract
Hearing a beat recruits a wide neural network that involves the auditory cortex and motor planning regions. Perceiving a beat can potentially be achieved via vision or even touch, but it is currently not clear whether a common neural network underlies beat processing. Here, we used functional magnetic resonance imaging (fMRI) to test to what extent the neural network involved in beat processing is supramodal, that is, the same across sensory modalities. Brain activity changes in 27 healthy volunteers were monitored while they attended to the same rhythmic sequences (with and without a beat) in audition, vision, and the vibrotactile modality. We found a common neural network for beat detection in the three modalities that involved parts of the auditory dorsal pathway. Within this network, only the putamen and the supplementary motor area (SMA) showed specificity to the beat, while brain activity in the putamen covaried with beat detection speed. These results highlight the implication of the auditory dorsal stream in beat detection, confirm the important role played by the putamen, and indicate that the neural network for beat detection is mostly supramodal. This constitutes a new example of convergence of the same functional attributes into one centralized representation in the brain.
Affiliation(s)
- Rodrigo Araneda: Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200 Brussels, Belgium
- Laurent Renier: Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200 Brussels, Belgium
- Laurence Dricot: Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200 Brussels, Belgium
- Anne G De Volder: Université catholique de Louvain, 54 Avenue Hippocrate UCL B1.54.09, 1200 Brussels, Belgium