1. Nguyen T, Lagacé-Cusiac R, Everling JC, Henry MJ, Grahn JA. Audiovisual integration of rhythm in musicians and dancers. Atten Percept Psychophys 2024; 86:1400-1416. [PMID: 38557941] [DOI: 10.3758/s13414-024-02874-x]
Abstract
Music training is associated with better beat processing in the auditory modality. However, it is unknown how rhythmic training that emphasizes visual rhythms, such as dance training, might affect beat processing, or whether training effects in general are modality specific. Here we examined how music and dance training interacted with modality during audiovisual integration and synchronization to auditory and visual isochronous sequences. In two experiments, musicians, dancers, and controls completed an audiovisual integration task and an audiovisual target-distractor synchronization task using dynamic visual stimuli (a bouncing figure). The groups performed similarly on the audiovisual integration tasks (Experiments 1 and 2). However, in the finger-tapping synchronization task (Experiment 1), musicians were more influenced by auditory distractors when synchronizing to visual sequences, whereas dancers were more influenced by visual distractors when synchronizing to auditory sequences. When participants synchronized with whole-body movements instead of finger-tapping (Experiment 2), all groups were more influenced by the visual distractor than by the auditory distractor. Taken together, this study highlights how training is associated with audiovisual processing, and how different types of visual rhythmic stimuli and different movements alter beat perception and production outcome measures. Implications for the modality appropriateness hypothesis are discussed.
Affiliation(s)
- Tram Nguyen: Brain and Mind Institute and Department of Psychology, University of Western Ontario, London, Ontario, Canada
- Rebekka Lagacé-Cusiac: Brain and Mind Institute and Department of Psychology, University of Western Ontario, London, Ontario, Canada
- J Celina Everling: Brain and Mind Institute and Department of Psychology, University of Western Ontario, London, Ontario, Canada
- Molly J Henry: Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany; Department of Psychology, Toronto Metropolitan University, Toronto, Ontario, Canada
- Jessica A Grahn: Brain and Mind Institute and Department of Psychology, University of Western Ontario, London, Ontario, Canada
2. Exposure to multisensory and visual static or moving stimuli enhances processing of nonoptimal visual rhythms. Atten Percept Psychophys 2022; 84:2655-2669. [PMID: 36241841] [PMCID: PMC9630188] [DOI: 10.3758/s13414-022-02569-1]
Abstract
Research has shown that visual moving and multisensory stimuli can efficiently mediate rhythmic information. It is possible, therefore, that the previously reported auditory dominance in rhythm perception is due to the use of nonoptimal visual stimuli. Yet it remains unknown whether exposure to multisensory or visual-moving rhythms would benefit the processing of rhythms consisting of nonoptimal static visual stimuli. Using a perceptual learning paradigm, we tested whether the visual component of the multisensory training pair can affect processing of metric simple, two-integer-ratio nonoptimal visual rhythms. Participants were trained with static (AVstat), moving-inanimate (AVinan), or moving-animate (AVan) visual stimuli along with auditory tones and a regular beat. In the pre- and posttraining tasks, participants responded whether two static-visual rhythms differed or not. Results showed improved posttraining performance for all training groups irrespective of the type of visual stimulation. To assess whether this benefit was auditory driven, we introduced visual-only training with a moving (Vinan) or static (Vstat) stimulus and a regular beat. Comparisons between Vinan and Vstat showed that, even in the absence of auditory information, training with visual-only moving or static stimuli resulted in enhanced posttraining performance. Overall, our findings suggest that audiovisual and visual static or moving training can benefit processing of nonoptimal visual rhythms.
3. Fiveash A, Burger B, Canette LH, Bedoin N, Tillmann B. When Visual Cues Do Not Help the Beat: Evidence for a Detrimental Effect of Moving Point-Light Figures on Rhythmic Priming. Front Psychol 2022; 13:807987. [PMID: 35185727] [PMCID: PMC8855071] [DOI: 10.3389/fpsyg.2022.807987]
Abstract
Rhythm perception involves strong auditory-motor connections that can be enhanced with movement. However, it is unclear whether just seeing someone moving to a rhythm can enhance auditory-motor coupling, resulting in stronger entrainment. Rhythmic priming studies show that presenting regular rhythms before naturally spoken sentences can enhance grammaticality judgments compared to irregular rhythms or other baseline conditions. The current study investigated whether introducing a point-light figure moving in time with regular rhythms could enhance the rhythmic priming effect. Three experiments revealed that the addition of a visual cue did not benefit rhythmic priming in comparison to auditory conditions with a static image. In Experiment 1 (27 7–8-year-old children), grammaticality judgments were poorer after audio-visual regular rhythms (with a bouncing point-light figure) compared to auditory-only regular rhythms. In Experiments 2 (31 adults) and 3 (31 different adults), there was no difference in grammaticality judgments after audio-visual regular rhythms compared to auditory-only irregular rhythms for either a bouncing point-light figure (Experiment 2) or a swaying point-light figure (Experiment 3). Comparison of the observed performance with previous data suggested that the audio-visual component removed the regular prime benefit. These findings suggest that the visual cues used in this study do not enhance rhythmic priming and could hinder the effect by creating a dual-task situation. In addition, individual differences on the sensory-motor and social scales of music reward influenced the effect of the visual cue. Implications for future audio-visual experiments aiming to enhance beat processing, and the importance of individual differences, are discussed.
Affiliation(s)
- Anna Fiveash (corresponding author): Lyon Neuroscience Research Center, CNRS UMR 5292, INSERM U1028, Lyon, France; University of Lyon 1, Lyon, France
- Birgitta Burger: Institute for Systematic Musicology, University of Hamburg, Hamburg, Germany
- Laure-Hélène Canette: Lyon Neuroscience Research Center, CNRS UMR 5292, INSERM U1028, Lyon, France; University of Lyon 1, Lyon, France; LEAD-CNRS UMR 5022, University of Burgundy, F-21000 Dijon, France
- Nathalie Bedoin: Lyon Neuroscience Research Center, CNRS UMR 5292, INSERM U1028, Lyon, France; University of Lyon 1, Lyon, France; University of Lyon 2, Lyon, France
- Barbara Tillmann: Lyon Neuroscience Research Center, CNRS UMR 5292, INSERM U1028, Lyon, France; University of Lyon 1, Lyon, France
4. Gilmore SA, Russo FA. Neural and Behavioral Evidence for Vibrotactile Beat Perception and Bimodal Enhancement. J Cogn Neurosci 2021; 33:635-650. [PMID: 33475449] [DOI: 10.1162/jocn_a_01673]
Abstract
The ability to synchronize movements to a rhythmic stimulus, referred to as sensorimotor synchronization (SMS), is a behavioral measure of beat perception. Although SMS is generally superior when rhythms are presented in the auditory modality, recent research has demonstrated near-equivalent SMS for vibrotactile presentations of isochronous rhythms [Ammirante, P., Patel, A. D., & Russo, F. A. Synchronizing to auditory and tactile metronomes: A test of the auditory-motor enhancement hypothesis. Psychonomic Bulletin & Review, 23, 1882-1890, 2016]. The current study aimed to replicate and extend that work by incorporating a neural measure of beat perception. Nonmusicians were asked to tap to rhythms or to listen passively while EEG data were collected. Rhythmic complexity (isochronous, nonisochronous) and presentation modality (auditory, vibrotactile, bimodal) were fully crossed. Tapping data were consistent with those observed by Ammirante et al. (2016), revealing near-equivalent SMS for isochronous rhythms across modality conditions and a drop-off in SMS for nonisochronous rhythms, especially in the vibrotactile condition. EEG data revealed a greater degree of neural entrainment for isochronous compared to nonisochronous trials, as well as for auditory and bimodal compared to vibrotactile trials. These findings led us to three main conclusions. First, isochronous rhythms lead to higher levels of beat perception than nonisochronous rhythms across modalities. Second, beat perception is generally enhanced for auditory presentations of rhythm but still possible under vibrotactile presentation conditions. Finally, exploratory analysis of neural entrainment at harmonic frequencies suggests that beat perception may be enhanced for bimodal presentations of rhythm.
5. Khan O, Ahmed I, Cottingham J, Rahhal M, Arvanitis TN, Elliott MT. Timing and correction of stepping movements with a virtual reality avatar. PLoS One 2020; 15:e0229641. [PMID: 32109252] [PMCID: PMC7048307] [DOI: 10.1371/journal.pone.0229641]
Abstract
Research into the ability to coordinate one's movements with external cues has focussed on the use of simple rhythmic auditory and visual stimuli, or on interpersonal coordination with another person. Coordinating movements with a virtual avatar has not been explored in the context of responses to temporal cues. To determine whether cueing of movements using a virtual avatar is effective, people's ability to coordinate accurately with the stimuli needs to be investigated. Here we focus on temporal cues, as timing studies show that visual cues can be difficult to follow in a timing context. Real stepping movements were mapped onto an avatar using motion capture data. Healthy participants were then motion captured while stepping in time with the avatar's movements, viewed through a virtual reality headset. The timing of one of the avatar's step cycles was accelerated or decelerated by 15% to create a temporal perturbation, which participants needed to correct for in order to remain in time. Step-onset times of participants relative to the corresponding step onsets of the avatar were used to measure the timing errors (asynchronies) between them. Participants completed either a visual-only condition or an auditory-visual condition with footstep sounds included, at two stepping tempi (fast: 400 ms interval; slow: 800 ms interval). Participants' asynchronies exhibited slow drift in the visual-only condition but became stable in the auditory-visual condition. Moreover, we observed a clear corrective response to the phase perturbation in both the fast and slow auditory-visual conditions. We conclude that an avatar's movements can be used to influence a person's own motion, but should include relevant auditory cues congruent with the movement to ensure a suitable level of entrainment is achieved. This approach has applications in physiotherapy, where virtual avatars present an opportunity to provide guidance that helps patients adhere to prescribed exercises.
Affiliation(s)
- Omar Khan: Warwick Manufacturing Group, Institute of Digital Healthcare, University of Warwick, Coventry, United Kingdom
- Imran Ahmed: Warwick Medical School, University of Warwick, Coventry, United Kingdom
- Joshua Cottingham: Department of Computer Science, University of Warwick, Coventry, United Kingdom
- Musa Rahhal: School of Engineering, University of Warwick, Coventry, United Kingdom
- Theodoros N. Arvanitis: Warwick Manufacturing Group, Institute of Digital Healthcare, University of Warwick, Coventry, United Kingdom
- Mark T. Elliott (corresponding author): Warwick Manufacturing Group, Institute of Digital Healthcare, University of Warwick, Coventry, United Kingdom
6.
Affiliation(s)
- Daniel J. Levitin: Department of Psychology, McGill University, Montreal, QC H3A 1G1, Canada
- Jessica A. Grahn: Department of Psychology and Brain and Mind Institute, Western University, London, Ontario N6A 5B7, Canada
- Justin London: Departments of Music and Cognitive Science, Carleton College, Northfield, Minnesota 55057
7. Bishop L, Goebl W. Beating time: How ensemble musicians' cueing gestures communicate beat position and tempo. Psychology of Music 2018; 46:84-106. [PMID: 29276332] [PMCID: PMC5718341] [DOI: 10.1177/0305735617702971]
Abstract
Ensemble musicians typically exchange visual cues to coordinate piece entrances. "Cueing-in" gestures indicate when to begin playing and at what tempo. This study investigated how timing information is encoded in musicians' cueing-in gestures. Gesture acceleration patterns were expected to indicate beat position, while gesture periodicity, duration, and peak gesture velocity were expected to indicate tempo. Same-instrument ensembles (e.g., piano-piano) were expected to synchronize more successfully than mixed-instrument ensembles (e.g., piano-violin). Duos performed short passages as their head and (for violinists) bowing hand movements were tracked with accelerometers and Kinect sensors. Performers alternated between leader/follower roles; leaders heard a tempo via headphones and cued their partner in nonverbally. Violin duos synchronized more successfully than either piano duos or piano-violin duos, possibly because violinists were more experienced in ensemble playing than pianists. Peak acceleration indicated beat position in leaders' head-nodding gestures. Gesture duration and periodicity in leaders' head and bowing hand gestures indicated tempo. The results show that the spatio-temporal characteristics of cueing-in gestures guide beat perception, enabling synchronization with visual gestures that follow a range of spatial trajectories.
Affiliation(s)
- Laura Bishop: Austrian Research Institute for Artificial Intelligence (OFAI), Vienna, Austria
- Werner Goebl: Austrian Research Institute for Artificial Intelligence (OFAI), Vienna, Austria; Department of Music Acoustics (IWK), University of Music and Performing Arts Vienna, Vienna, Austria
8. Su YH. Visual Enhancement of Illusory Phenomenal Accents in Non-Isochronous Auditory Rhythms. PLoS One 2016; 11:e0166880. [PMID: 27880850] [PMCID: PMC5120798] [DOI: 10.1371/journal.pone.0166880]
Abstract
Musical rhythms encompass temporal patterns that often yield regular metrical accents (e.g., a beat). There have been mixed results regarding perception as a function of metrical saliency, namely, whether sensitivity to a deviant was greater in metrically stronger or weaker positions. Moreover, effects of metrical position have not been examined in non-isochronous rhythms, or with respect to multisensory influences. This study was concerned with two main issues: (1) In non-isochronous auditory rhythms with clear metrical accents, how would sensitivity to a deviant be modulated by metrical positions? (2) Would the effects be enhanced by multisensory information? Participants listened to strongly metrical rhythms with or without watching a point-light figure dance to the rhythm in the same meter, and detected a slight loudness increment. Both conditions were presented with or without an auditory interference that served to impair auditory metrical perception. Sensitivity to a deviant was found to be greater in weak-beat than in strong-beat positions, consistent with the Predictive Coding hypothesis and the idea of metrically induced illusory phenomenal accents. The visual rhythm of dance hindered auditory detection, but more so when the latter was itself less impaired. This pattern suggested that the visual and auditory rhythms were perceptually integrated to reinforce metrical accentuation, yielding more illusory phenomenal accents and thus lower sensitivity to deviants, in a manner consistent with the principle of inverse effectiveness. Results are discussed in the predictive framework for multisensory rhythms involving observed movements and possible mediation of the motor system.
Affiliation(s)
- Yi-Huang Su: Department of Movement Science, Faculty of Sport and Health Sciences, Technical University of Munich, Munich, Germany
9. Silva S, Castro SL. Moving Stimuli Facilitate Synchronization But Not Temporal Perception. Front Psychol 2016; 7:1798. [PMID: 27909419] [PMCID: PMC5112270] [DOI: 10.3389/fpsyg.2016.01798]
Abstract
Recent studies have shown that a moving visual stimulus (e.g., a bouncing ball) facilitates synchronization compared to a static stimulus (e.g., a flashing light), and that it can even be as effective as an auditory beep. We asked a group of participants to perform different tasks with four stimulus types: beeps, siren-like sounds, visual flashes (static) and bouncing balls. First, participants performed synchronization with isochronous sequences (stimulus-guided synchronization), followed by a continuation phase in which the stimulus was internally generated (imagery-guided synchronization). Then they performed a perception task, in which they judged whether the final part of a temporal sequence was compatible with the previous beat structure (stimulus-guided perception). Similar to synchronization, an imagery-guided variant was added, in which sequences contained a gap in between (imagery-guided perception). Balls outperformed flashes and matched beeps (powerful ball effect) in stimulus-guided synchronization but not in perception (stimulus- or imagery-guided). In imagery-guided synchronization, performance accuracy decreased for beeps and balls, but not for flashes and sirens. Our findings suggest that the advantages of moving visual stimuli over static ones are grounded in action rather than perception, and they support the hypothesis that the sensorimotor coupling mechanisms for auditory (beeps) and moving visual stimuli (bouncing balls) overlap.
Affiliation(s)
- Susana Silva: Neurocognition and Language Research Group, Center for Psychology at University of Porto, Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
- São Luís Castro: Neurocognition and Language Research Group, Center for Psychology at University of Porto, Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
10.

11. Celma-Miralles A, de Menezes RF, Toro JM. Look at the Beat, Feel the Meter: Top-Down Effects of Meter Induction on Auditory and Visual Modalities. Front Hum Neurosci 2016; 10:108. [PMID: 27047358] [PMCID: PMC4803728] [DOI: 10.3389/fnhum.2016.00108]
Abstract
Recent research has demonstrated top-down effects on meter induction in the auditory modality. However, little is known about these effects in the visual domain, especially without the involvement of motor acts such as tapping. In the present study, we aim to assess whether the projection of meter onto auditory beats is also present in the visual domain. We asked 16 musicians to internally project binary (i.e., a strong-weak pattern) and ternary (i.e., a strong-weak-weak pattern) meter onto separate, but analog, visual and auditory isochronous stimuli. Participants were presented with sequences of tones or blinking circular shapes (i.e., flashes) at 2.4 Hz while their electrophysiological responses were recorded. A frequency analysis of the elicited steady-state evoked potentials allowed us to compare the frequencies of the beat (2.4 Hz), its first harmonic (4.8 Hz), the binary subharmonic (1.2 Hz), and the ternary subharmonic (0.8 Hz) within and across modalities. Taking the amplitude spectra into account, we observed an enhancement of the amplitude at 0.8 Hz in the ternary condition for both modalities, suggesting meter induction across modalities. There was an interaction between modality and voltage at 2.4 and 4.8 Hz. Looking at the power spectra, we also observed significant differences from zero at 1.2 Hz in the auditory, but not in the visual, binary condition. These findings suggest that meter processing is modulated by top-down mechanisms that interact with our perception of rhythmic events, and that such modulation can also be found in the visual domain. The reported cross-modal effects of meter may shed light on the origins of our timing mechanisms, partially developed in primates, that allow humans to synchronize accurately across modalities.
Affiliation(s)
- Alexandre Celma-Miralles: Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Robert F de Menezes: Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Juan M Toro: Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group, Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats, Barcelona, Spain
12. Booth AJ, Elliott MT. Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue. Front Psychol 2015; 6:866. [PMID: 26157412] [PMCID: PMC4478893] [DOI: 10.3389/fpsyg.2015.00866]
Abstract
The ease of synchronizing movements to a rhythmic cue depends on the modality of the cue presentation: timing accuracy is much higher when synchronizing with discrete auditory rhythms than with an equivalent visual stimulus presented through flashes. However, timing accuracy is improved if the visual cue presents spatial as well as temporal information (e.g., a dot following an oscillatory trajectory). Similarly, when synchronizing with an auditory target metronome in the presence of a second, distracting visual metronome, the distraction is stronger when the visual cue contains temporal-spatial information rather than temporal information only. The present study investigates individuals' ability to synchronize movements to a temporal-spatial visual cue in the presence of same-modality temporal-spatial distractors. Moreover, we investigated how increasing the number of distractor stimuli impacted on maintaining synchrony with the target cue. Participants made oscillatory vertical arm movements in time with a vertically oscillating white target dot centered on a large projection screen. The target dot was surrounded by 2, 8, or 14 distractor dots, which had an identical trajectory to the target but at a phase lead or lag of 0, 100, or 200 ms. We found participants' timing performance was only affected in the phase-lead conditions and when there were large numbers of distractors present (8 and 14). This asymmetry suggests participants still rely on salient events in the stimulus trajectory to synchronize movements. Consequently, distractions occurring in the window of attention surrounding those events have the maximum impact on timing performance.
Affiliation(s)
- Ashley J. Booth: School of Psychology, University of Birmingham, Edgbaston, UK
- Mark T. Elliott: School of Psychology, University of Birmingham, Edgbaston, UK; Institute of Digital Healthcare, Warwick Manufacturing Group, University of Warwick, Coventry, UK
13. Su YH. Content congruency and its interplay with temporal synchrony modulate integration between rhythmic audiovisual streams. Front Integr Neurosci 2014; 8:92. [PMID: 25538576] [PMCID: PMC4259108] [DOI: 10.3389/fnint.2014.00092]
Abstract
Both lower-level stimulus factors (e.g., temporal proximity) and higher-level cognitive factors (e.g., content congruency) are known to influence multisensory integration. The former can direct attention in a converging manner, and the latter can indicate whether information from the two modalities belongs together. The present research investigated whether and how these two factors interacted in the perception of rhythmic, audiovisual (AV) streams derived from a human movement scenario. Congruency here was based on sensorimotor correspondence pertaining to rhythm perception. Participants attended to bimodal stimuli consisting of a humanlike figure moving regularly to a sequence of auditory beats, and detected a possible auditory temporal deviant. The figure moved either downwards (congruently) or upwards (incongruently) to the downbeat, while in both situations the movement was either synchronous with the beat or lagging behind it. Greater cross-modal binding was expected to hinder deviant detection. Results revealed poorer detection for congruent than for incongruent streams, suggesting stronger integration in the former. False alarms increased in asynchronous stimuli only for congruent streams, indicating a greater tendency for deviant report due to visual capture of asynchronous auditory events. In addition, a greater increase in perceived synchrony was associated with a greater reduction in false alarms for congruent streams, while the pattern was reversed for incongruent ones. These results demonstrate that content congruency as a top-down factor not only promotes integration, but also modulates bottom-up effects of synchrony. Results are also discussed regarding how theories of integration and attentional entrainment may be combined in the context of rhythmic multisensory stimuli.
Affiliation(s)
- Yi-Huang Su: Department of Movement Science, Faculty of Sport and Health Sciences, Technical University of Munich, Munich, Germany
14. Su YH. Visual enhancement of auditory beat perception across auditory interference levels. Brain Cogn 2014; 90:19-31. [DOI: 10.1016/j.bandc.2014.05.003]