1
Large EW, Roman I, Kim JC, Cannon J, Pazdera JK, Trainor LJ, Rinzel J, Bose A. Dynamic models for musical rhythm perception and coordination. Front Comput Neurosci 2023; 17:1151895. PMID: 37265781; PMCID: PMC10229831; DOI: 10.3389/fncom.2023.1151895.
Abstract
Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycles of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, "the beat." Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities and about the different brain areas involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered on a different level of description, that address specific aspects of musical rhythm generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm-tracking predictions. Each of these approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
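The synchronization-based frameworks surveyed in this abstract can be illustrated with a minimal phase-oscillator sketch. This is not the authors' model; the function name, parameter values, and coupling form are illustrative assumptions (a single sinusoidally coupled phase oscillator driven by a periodic stimulus).

```python
import math

def entrain(f_osc=1.8, f_stim=2.0, coupling=4.0, dt=0.001, t_end=20.0):
    """Phase oscillator driven by a periodic stimulus:
        dphi/dt = 2*pi*f_osc + coupling * sin(phi_stim - phi)
    The oscillator phase-locks whenever the detuning satisfies
    |2*pi*(f_stim - f_osc)| < coupling (Adler's condition)."""
    phi = 0.0
    steps = int(t_end / dt)
    for n in range(steps):
        phi_stim = 2.0 * math.pi * f_stim * (n * dt)  # stimulus phase at time n*dt
        phi += dt * (2.0 * math.pi * f_osc + coupling * math.sin(phi_stim - phi))
    # wrapped phase lag of the oscillator behind the stimulus at the end of the run
    psi = 2.0 * math.pi * f_stim * t_end - phi
    return math.atan2(math.sin(psi), math.cos(psi))
```

With these illustrative values the detuning (about 1.26 rad/s) lies well inside the locking range, so the oscillator settles at a constant lag of asin(detuning/coupling), roughly 0.32 rad, rather than drifting.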
Affiliation(s)
- Edward W. Large
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Department of Physics, University of Connecticut, Mansfield, CT, United States
- Iran Roman
- Music and Audio Research Laboratory, New York University, New York, NY, United States
- Ji Chul Kim
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Jonathan Cannon
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Jesse K. Pazdera
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Laurel J. Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- John Rinzel
- Center for Neural Science, New York University, New York, NY, United States
- Courant Institute of Mathematical Sciences, New York University, New York, NY, United States
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, United States
2
Ono K. Enhancement of visuomotor synchronization by a regular pattern of stimulus presentation. Neurosci Lett 2022; 786:136798. PMID: 35843470; DOI: 10.1016/j.neulet.2022.136798.
Abstract
Stable synchronization with external auditory/visual events is important for cooperative behavior, such as playing music in an orchestra. One way to enhance synchronization in the auditory domain is to insert additional tones between the tones to be synchronized with. Synchronized tapping to every other tone or more (1:n tapping) is less variable than tapping to each tone (1:1 tapping). This phenomenon, called the "subdivision benefit," is interpreted to mean that the additional temporal references provided by the subdividing tones make synchronization more stable. However, it is unclear whether visuomotor synchronization also becomes more stable when a stimulus sequence is subdivided. To clarify this, the present study compared 1:3 tapping with a sequence of three-picture patterns and 1:1 tapping with a single repeated picture. When the inter-tap interval (ITI) was 1200 ms or more, tapping variability showed a subdivision benefit, irrespective of the position of the pictures (1st, 2nd, or 3rd) in the three-picture pattern. However, when the ITI was <1000 ms, subdivision had no significant effect. These results imply that the subdivision benefit is due to the additional temporal reference provided by the subdividing stimuli, and that the benefit depends on the ITI length.
Affiliation(s)
- Kentaro Ono
- Center for Brain, Mind, and KANSEI Sciences Research, Hiroshima University, Japan.
3
Ono K, Hashimoto J, Sasaoka T. Intertap interval dependence of the subdivision effect in auditory-synchronised tapping. Eur J Neurosci 2021; 55:3391-3401. PMID: 34766383; DOI: 10.1111/ejn.15529.
Abstract
Precise temporal synchronisation between action and perception is crucial in daily life. Interestingly, synchronised tapping to every other tone or more (1:n tapping) is more precise than tapping to each tone (1:1 tapping), a phenomenon called the 'subdivision benefit'. One hypothesis holds that, although there is a tendency to underestimate an empty interval, the subdivision serves as an additional temporal reference and produces an illusorily longer intertap interval (ITI). The other hypothesis is based on the strong/weak beats created in a tone sequence by subdivision: because strong beats improve the sensitivity of duration perception, synchronisation with strong beats should be better than with other beats. The first hypothesis, in contrast, predicts that the subdivision benefit occurs irrespective of beat strength. The present study aimed to resolve this discrepancy using a 1:3 tapping task with a sequence of three-tone patterns and a 1:1 tapping task with a repeated single tone. A further aim was to clarify the effect of musical experience. When the ITI was 900 ms or more, tapping variability showed the subdivision benefit irrespective of beat strength. This result supports the first hypothesis, and musicians obtained more benefit than non-musicians. Tap timing, however, was not shortened by subdivision, except at the 900-ms ITI. The findings suggest that the subdivision benefit is due to the additional temporal reference provided by the subdividing tones, and that the benefit depends on the ITI length.
Affiliation(s)
- Kentaro Ono
- Center for Brain, Mind, and KANSEI Sciences Research, Hiroshima University, Hiroshima, Japan
- Junya Hashimoto
- Graduate School of Education, Hiroshima University, Higashi-Hiroshima, Japan
- Takafumi Sasaoka
- Center for Brain, Mind, and KANSEI Sciences Research, Hiroshima University, Hiroshima, Japan
4
Lenc T, Merchant H, Keller PE, Honing H, Varlet M, Nozaradan S. Mapping between sound, brain and behaviour: four-level framework for understanding rhythm processing in humans and non-human primates. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200325. PMID: 34420381; PMCID: PMC8380981; DOI: 10.1098/rstb.2020.0325.
Abstract
Humans perceive and spontaneously move to one or several levels of periodic pulses (a meter, for short) when listening to musical rhythm, even when the sensory input does not provide prominent periodic cues to their temporal location. Here, we review a multi-levelled framework for understanding how external rhythmic inputs are mapped onto internally represented metric pulses. This mapping is studied using an approach to quantify and directly compare representations of metric pulses in signals corresponding to sensory inputs, neural activity and behaviour (typically body movement). Based on this approach, recent empirical evidence can be drawn together into a conceptual framework that unpacks the phenomenon of meter into four levels. Each level highlights specific functional processes that critically enable and shape the mapping from sensory input to internal meter. We discuss the nature, constraints and neural substrates of these processes, starting with fundamental mechanisms investigated in macaque monkeys that enable basic forms of mapping between simple rhythmic stimuli and internally represented metric pulse. We propose that human evolution has gradually built a robust and flexible system upon these fundamental processes, allowing more complex levels of mapping to emerge in musical behaviours. This approach opens promising avenues to understand the many facets of rhythmic behaviours across individuals and species. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Tomas Lenc
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- Hugo Merchant
- Instituto de Neurobiologia, UNAM, Campus Juriquilla, Querétaro 76230, Mexico
- Peter E. Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Henkjan Honing
- Amsterdam Brain and Cognition (ABC), Institute for Logic, Language and Computation (ILLC), University of Amsterdam, Amsterdam 1090 GE, The Netherlands
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- School of Psychology, Western Sydney University, Penrith, New South Wales 2751, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
5
Møller C, Stupacher J, Celma-Miralles A, Vuust P. Beat perception in polyrhythms: Time is structured in binary units. PLoS One 2021; 16:e0252174. PMID: 34415911; PMCID: PMC8378699; DOI: 10.1371/journal.pone.0252174.
Abstract
In everyday life, we group and subdivide time to understand the sensory environment surrounding us. Organizing time in units, such as diurnal rhythms, phrases, and beat patterns, is fundamental to behavior, speech, and music. When listening to music, our perceptual system extracts and nests rhythmic regularities to create a hierarchical metrical structure that enables us to predict the timing of the next events. Foot tapping and head bobbing to musical rhythms are observable evidence of this process. In the special case of polyrhythms, at least two metrical structures compete to become the reference for these temporal regularities, rendering several possible beats with which we can synchronize our movements. While there is general agreement that tempo, pitch, and loudness influence beat perception in polyrhythms, we focused on the yet neglected influence of beat subdivisions, i.e., the least common denominator of a polyrhythm ratio. In three online experiments, 300 participants listened to a range of polyrhythms and tapped their index fingers in time with the perceived beat. The polyrhythms consisted of two simultaneously presented isochronous pulse trains with different ratios (2:3, 2:5, 3:4, 3:5, 4:5, 5:6) and different tempi. For ratios 2:3 and 3:4, we additionally manipulated the pitch of the pulse trains. Results showed a highly robust influence of subdivision grouping on beat perception. This was manifested as a propensity towards beats that are subdivided into two or four equally spaced units, as opposed to beats with three or more complex groupings of subdivisions. Additionally, lower pitched pulse trains were more often perceived as the beat. Our findings suggest that subdivisions, not beats, are the basic unit of beat perception, and that the principle underlying the binary grouping of subdivisions reflects a propensity towards simplicity. This preference for simple grouping is widely applicable to human perception and cognition of time.
Affiliation(s)
- Cecilie Møller
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Jan Stupacher
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Alexandre Celma-Miralles
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
6
Colley ID, Varlet M, MacRitchie J, Keller PE. The influence of visual cues on temporal anticipation and movement synchronization with musical sequences. Acta Psychol (Amst) 2018; 191:190-200. PMID: 30308442; DOI: 10.1016/j.actpsy.2018.09.014.
Abstract
Music presents a complex case of movement timing, as one to several dozen musicians coordinate their actions at short time-scales. This process is often directed by a conductor who provides a visual beat and guides the ensemble through tempo changes. The current experiment tested the ways in which audio-motor coordination is influenced by visual cues from a conductor's gestures, and how this influence might manifest in two ways: movements used to produce sound related to the music, and movements of the upper-body that do not directly affect sound output. We designed a virtual conductor that was derived from morphed motion capture recordings of human conductors. Two groups of participants (29 musicians and 28 nonmusicians, to test the generalizability of visuo-motor synchronization to non-experts) were shown the virtual conductor, a simple visual metronome, or a stationary circle while completing a drumming task that required synchronization with tempo-changing musical sequences. We measured asynchronies and temporal anticipation in the drumming task, as well as participants' upper-body movement using motion capture. Drumming results suggest the conductor generally improves synchronization by facilitating anticipation of tempo changes in the music. Motion capture results showed that the conductor visual cue elicited more structured head movements than the other two visual cues for nonmusicians only. Multiple regression analysis showed that the nonmusicians with less rigid movement and high anticipation had lower asynchronies. Thus, the visual cues provided by a conductor might serve to facilitate temporal anticipation and more synchronous movement in the general population, but might also cause rigid ancillary movements in some non-experts.
7
De Pretto M, Deiber MP, James CE. Steady-state evoked potentials distinguish brain mechanisms of self-paced versus synchronization finger tapping. Hum Mov Sci 2018; 61:151-166. PMID: 30098488; DOI: 10.1016/j.humov.2018.07.007.
Abstract
Sensorimotor synchronization (SMS) requires aligning motor actions to external events and represents a core part of both musical and dance performances. In the current study, to isolate the brain mechanisms involved in synchronizing finger tapping with a musical beat, we compared SMS to pure self-paced finger tapping and listen-only conditions at different tempi. We analyzed EEG data using frequency domain steady-state evoked potentials (SSEPs) to identify sustained electrophysiological brain activity during repetitive tasks. Behavioral results revealed different timing modes between SMS and self-paced finger tapping, associated with distinct scalp topographies, thus suggesting different underlying brain sources. After subtraction of the listen-only brain activity, SMS was compared to self-paced finger tapping. Resulting source estimations showed stronger activation of the left inferior frontal gyrus during SMS, and stronger activation of the bilateral inferior parietal lobule during self-paced finger tapping. These results point to the left inferior frontal gyrus as a pivot for perception-action coupling. We discuss our findings in the context of the ongoing debate about SSEPs interpretation given the variety of brain events contributing to SSEPs and similar EEG frequency responses.
Affiliation(s)
- Michael De Pretto
- Faculty of Psychology and Educational Sciences, Department of Psychology, University of Geneva, 40 Boulevard du Pont-d'Arve, CH-1211 Geneva, Switzerland; Neurology Unit, Medicine Department, Faculty of Sciences, University of Fribourg, Chemin du Musée 5, CH-1700 Fribourg, Switzerland; School of Philosophy, Psychology and Language Sciences, Department of Psychology, University of Edinburgh, 7 George Square, Edinburgh EH8 9JZ, UK.
- Marie-Pierre Deiber
- Psychiatry Department, Division of Psychiatric Specialties, University Hospitals of Geneva, 20 bis rue de Lausanne, CH-1201 Geneva, Switzerland; NCCR Synapsy, 9 Chemin des Mines, CH-1202 Geneva, Switzerland
- Clara E. James
- Faculty of Psychology and Educational Sciences, Department of Psychology, University of Geneva, 40 Boulevard du Pont-d'Arve, CH-1211 Geneva, Switzerland; School of Health Sciences Geneva, HES-SO University of Applied Sciences and Arts Western Switzerland, 47 Avenue de Champel, CH-1206 Geneva, Switzerland
8
Falk S, Volpi-Moncorger C, Dalla Bella S. Auditory-Motor Rhythms and Speech Processing in French and German Listeners. Front Psychol 2017; 8:395. PMID: 28443036; PMCID: PMC5387104; DOI: 10.3389/fpsyg.2017.00395.
Abstract
Moving to a speech rhythm can enhance verbal processing in the listener by increasing temporal expectancies (Falk and Dalla Bella, 2016). Here we tested whether this hypothesis holds for prosodically diverse languages such as German (a lexical stress-language) and French (a non-stress language). Moreover, we examined the relation between motor performance and the benefits for verbal processing as a function of language. Sixty-four participants (32 German and 32 French native speakers) detected subtle word changes in accented positions in metrically structured sentences to which they previously tapped with their index finger. Before each sentence, they were cued by a metronome to tap either congruently (i.e., to accented syllables) or incongruently (i.e., to non-accented parts) to the following speech stimulus. Both French and German speakers detected words better when cued to tap congruently compared to incongruently. Detection performance was predicted by participants' motor performance in the non-verbal cueing phase. Moreover, tapping rate while participants tapped to speech predicted detection differently for the two language groups, in particular in the incongruent tapping condition. We discuss our findings in light of the rhythmic differences between the two languages and with respect to recent theories of expectancy-driven and multisensory speech processing.
Affiliation(s)
- Simone Falk
- Institut für Deutsche Philologie, Ludwig-Maximilians-University, Munich, Germany; Laboratoire Parole et Langage, UMR 7309, Centre National de la Recherche Scientifique, Aix-Marseille University, Aix-en-Provence, France; Laboratoire Phonétique et Phonologie, UMR 7018, CNRS, Université Sorbonne Nouvelle Paris-3, Paris, France
- Chloé Volpi-Moncorger
- Laboratoire Parole et Langage, UMR 7309, Centre National de la Recherche Scientifique, Aix-Marseille University, Aix-en-Provence, France
- Simone Dalla Bella
- EuroMov, University of Montpellier, Montpellier, France; Institut Universitaire de France, Paris, France; International Laboratory for Brain, Music, and Sound Research, Montreal, QC, Canada; Department of Cognitive Psychology, Wyższa Szkoła Finansów i Zarządzania w Warszawie (WSFiZ), Warsaw, Poland
9
Su YH. Sensorimotor Synchronization with Different Metrical Levels of Point-Light Dance Movements. Front Hum Neurosci 2016; 10:186. PMID: 27199709; PMCID: PMC4846664; DOI: 10.3389/fnhum.2016.00186.
Abstract
Rhythm perception and synchronization have been extensively investigated in the auditory domain, as they underlie means of human communication such as music and speech. Although recent studies suggest comparable mechanisms for synchronizing with periodically moving visual objects, the extent to which it applies to ecologically relevant information, such as the rhythm of complex biological motion, remains unknown. The present study addressed this issue by linking rhythm of music and dance in the framework of action-perception coupling. As a previous study showed that observers perceived multiple metrical periodicities in dance movements that embodied this structure, the present study examined whether sensorimotor synchronization (SMS) to dance movements resembles what is known of auditory SMS. Participants watched a point-light figure performing two basic steps of Swing dance cyclically, in which the trunk bounced at every beat and the limbs moved at every second beat, forming two metrical periodicities. Participants tapped synchronously to the bounce of the trunk with or without the limbs moving in the stimuli (Experiment 1), or tapped synchronously to the leg movements with or without the trunk bouncing simultaneously (Experiment 2). Results showed that, while synchronization with the bounce (lower-level pulse) was not influenced by the presence or absence of limb movements (metrical accent), synchronization with the legs (beat) was improved by the presence of the bounce (metrical subdivision) across different movement types. The latter finding parallels the “subdivision benefit” often demonstrated in auditory tasks, suggesting common sensorimotor mechanisms for visual rhythms in dance and auditory rhythms in music.
Affiliation(s)
- Yi-Huang Su
- Department of Movement Science, Faculty of Sport and Health Sciences, Technical University of Munich, Munich, Germany
10
Large EW, Herrera JA, Velasco MJ. Neural Networks for Beat Perception in Musical Rhythm. Front Syst Neurosci 2015; 9:159. PMID: 26635549; PMCID: PMC4658578; DOI: 10.3389/fnsys.2015.00159.
Abstract
Entrainment of cortical rhythms to acoustic rhythms has been hypothesized to be the neural correlate of pulse and meter perception in music. Dynamic attending theory first proposed synchronization of endogenous perceptual rhythms nearly 40 years ago, but only recently has the pivotal role of neural synchrony been demonstrated. Significant progress has since been made in understanding the role of neural oscillations and the neural structures that support synchronized responses to musical rhythm. Synchronized neural activity has been observed in auditory and motor networks, and has been linked with attentional allocation and movement coordination. Here we describe a neurodynamic model that shows how self-organization of oscillations in interacting sensory and motor networks could be responsible for the formation of the pulse percept in complex rhythms. In a pulse synchronization study, we test the model's key prediction that pulse can be perceived at a frequency for which no spectral energy is present in the amplitude envelope of the acoustic rhythm. The result shows that participants perceive the pulse at the theoretically predicted frequency. This model is one of the few consistent with neurophysiological evidence on the role of neural oscillation, and it explains a phenomenon that other computational models fail to explain. Because it is based on a canonical model, the predictions hold for an entire family of dynamical systems, not only a specific one. Thus, this model provides a theoretical link between oscillatory neurodynamics and the induction of pulse and meter in musical rhythm.
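The canonical oscillator model referred to in this abstract reduces, in its simplest form, to the Hopf normal form. The sketch below (illustrative parameters and function name, not the paper's fitted network) shows the self-sustaining limit cycle that lets such a unit keep oscillating at a pulse frequency even without matching spectral energy in the input.

```python
def hopf_limit_cycle(alpha=1.0, beta=-1.0, r0=0.1, dt=0.001, t_end=30.0):
    """Radial (polar) part of the Hopf normal form
        dz/dt = z * (alpha + i*omega + beta*|z|^2),
    which decouples to dr/dt = r * (alpha + beta * r**2).
    For alpha > 0 and beta < 0, any small perturbation grows into a stable
    limit cycle of radius sqrt(-alpha/beta); for alpha < 0, activity decays
    back to rest. Forward-Euler integration of the radial equation."""
    r = r0
    for _ in range(int(t_end / dt)):
        r += dt * r * (alpha + beta * r * r)
    return r
```

With alpha = 1 and beta = -1 the amplitude settles at radius 1 regardless of the (small, nonzero) initial condition; flipping the sign of alpha kills the oscillation, which is the bifurcation-parameter reading of "self-organization of oscillations" in this family of models.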
Affiliation(s)
- Edward W Large
- Department of Psychological Sciences, University of Connecticut, Storrs, CT, USA; Department of Physics, University of Connecticut, Storrs, CT, USA
- Jorge A Herrera
- Department of Music, Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA, USA
- Marc J Velasco
- Center for Complex Systems and Brain Sciences, Florida Atlantic University, Boca Raton, FL, USA
11
Understanding bimanual coordination across small time scales from an electrophysiological perspective. Neurosci Biobehav Rev 2014; 47:614-35. DOI: 10.1016/j.neubiorev.2014.10.003.
12
Laroche J, Berardi AM, Brangier E. Embodiment of intersubjective time: relational dynamics as attractors in the temporal coordination of interpersonal behaviors and experiences. Front Psychol 2014; 5:1180. PMID: 25400598; PMCID: PMC4215825; DOI: 10.3389/fpsyg.2014.01180.
Abstract
This paper addresses the issue of “being together,” and more specifically the issue of “being together in time.” We provide an integrative framework that is inspired by phenomenology, the enactive approach and dynamical systems theories. To do so, we first define embodiment as a living and lived phenomenon that emerges from agent-world coupling. We then show that embodiment is essentially dynamical, and we therefore describe experiential, behavioral and brain dynamics. Both lived temporality and the temporality of the living appear to be complex, multiscale phenomena. Next we discuss embodied dynamics in the context of interpersonal interactions, and briefly review the empirical literature on between-persons temporal coordination. Overall, we propose that being together in time emerges from the relational dynamics of embodied interactions and their flexible co-regulation.
Affiliation(s)
- Julien Laroche
- Akoustic Arts R&D Laboratory, Paris, France; PErSEUs, Université de Lorraine, Metz, France
13
Su YH. Visual enhancement of auditory beat perception across auditory interference levels. Brain Cogn 2014; 90:19-31. DOI: 10.1016/j.bandc.2014.05.003.
14
Su YH. Audiovisual beat induction in complex auditory rhythms: point-light figure movement as an effective visual beat. Acta Psychol (Amst) 2014; 151:40-50. PMID: 24932996; DOI: 10.1016/j.actpsy.2014.05.016.
Abstract
This study investigated whether explicit beat induction in the auditory, visual, and audiovisual (bimodal) modalities aided the perception of weakly metrical auditory rhythms, and whether it reinforced attentional entrainment to the beat of these rhythms. The visual beat-inducer was a periodically bouncing point-light figure, which aimed to examine whether an observed rhythmic human movement could induce a beat that would influence auditory rhythm perception. In two tasks, participants listened to three repetitions of an auditory rhythm that were preceded and accompanied by (1) an auditory beat, (2) a bouncing point-light figure, (3) a combination of (1) and (2) synchronously, or (4) a combination of (1) and (2), with the figure moving in anti-phase to the auditory beat. Participants reproduced the auditory rhythm subsequently (Experiment 1), or detected a possible temporal change in the third repetition (Experiment 2). While an explicit beat did not improve rhythm reproduction, possibly due to the syncopated rhythms when a beat was imposed, bimodal beat induction yielded greater sensitivity to a temporal deviant in on-beat than in off-beat positions. Moreover, the beat phase of the figure movement determined where on-beat accents were perceived during bimodal induction. Results are discussed with regard to constrained beat induction in complex auditory rhythms, visual modulation of auditory beat perception, and possible mechanisms underlying the preferred visual beat consisting of rhythmic human motions.
15
Abstract
Sensorimotor synchronization (SMS) is the coordination of rhythmic movement with an external rhythm, ranging from finger tapping in time with a metronome to musical ensemble performance. An earlier review (Repp, 2005) covered tapping studies; two additional reviews (Repp, 2006a, b) focused on music performance and on rate limits of SMS, respectively. The present article supplements and extends these earlier reviews by surveying more recent research in what appears to be a burgeoning field. The article comprises four parts, dealing with (1) conventional tapping studies, (2) other forms of moving in synchrony with external rhythms (including dance and nonhuman animals' synchronization abilities), (3) interpersonal synchronization (including musical ensemble performance), and (4) the neuroscience of SMS. It is evident that much new knowledge about SMS has been acquired in the last 7 years.
16
Bavassi ML, Tagliazucchi E, Laje R. Small perturbations in a finger-tapping task reveal inherent nonlinearities of the underlying error correction mechanism. Hum Mov Sci 2013; 32:21-47. [PMID: 23375111] [DOI: 10.1016/j.humov.2012.06.002]
Abstract
Time processing in the few hundred milliseconds range is involved in the human skill of sensorimotor synchronization, like playing music in an ensemble or finger tapping to an external beat. In finger tapping, a mechanistic explanation in biologically plausible terms of how the brain achieves synchronization is still missing despite considerable research. In this work we show that nonlinear effects are important for the recovery of synchronization following a perturbation (a step change in stimulus period), even for perturbation magnitudes smaller than 10% of the period, which is well below the amount of perturbation needed to evoke other nonlinear effects like saturation. We build a nonlinear mathematical model for the error correction mechanism and test its predictions, and further propose a framework that allows us to unify the description of the three common types of perturbations. While previous authors have used two different model mechanisms for fitting different perturbation types, or have fitted different parameter value sets for different perturbation magnitudes, we propose the first unified description of the behavior following all perturbation types and magnitudes as the dynamical response of a compound model with fixed terms and a single set of parameter values.
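The error-correction dynamics described above can be illustrated with a minimal simulation. The sketch below uses the standard linear phase-correction update e_{n+1} = (1 - alpha) e_n plus an illustrative cubic term; the specific nonlinearity and the parameter values are assumptions for demonstration, not the terms or fits reported by Bavassi et al.

```python
def simulate_step_change(n_taps=50, period=500.0, step=-40.0,
                         alpha=0.5, beta=1e-5):
    """Asynchronies (ms) after a step change in stimulus period.

    Linear phase correction plus a hypothetical cubic term:
        e_{n+1} = (1 - alpha) * e_n - beta * e_n**3 + delta_n,
    where delta_n is the perturbation injected at tap 10.
    """
    e = 0.0
    asynchronies = []
    for n in range(n_taps):
        delta = step if n == 10 else 0.0  # step perturbation at tap 10
        e = (1.0 - alpha) * e - beta * e ** 3 + delta
        asynchronies.append(e)
    return asynchronies
```

With these toy values, the asynchrony jumps to -40 ms at the perturbed tap and decays back toward zero over subsequent taps; the cubic term makes the recovery slightly asymmetric for large early and late errors, which is the kind of magnitude-dependent behavior a purely linear model cannot capture.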
Affiliation(s)
- M Luz Bavassi
- Departamento de Ciencia y Tecnología, Universidad Nacional de Quilmes, R.S. Peña 352, Bernal B1876BXD, Argentina
17
Abstract
In the present review, we focus on how commonalities in the ontogenetic development of the auditory and tactile sensory systems may inform the interplay between these signals in the temporal domain. In particular, we describe the results of behavioral studies that have investigated temporal resolution (in temporal order, synchrony/asynchrony, and simultaneity judgment tasks), as well as temporal numerosity perception, and similarities in the perception of frequency across touch and hearing. The evidence reviewed here highlights features of audiotactile temporal perception that are distinctive from those seen for other pairings of sensory modalities. For instance, audiotactile interactions are characterized in certain tasks (e.g., temporal numerosity judgments) by a more balanced reciprocal influence than are other modality pairings. Moreover, relative spatial position plays a different role in the temporal order and temporal recalibration processes for audiotactile stimulus pairings than for other modality pairings. The effect exerted by both the spatial arrangement of stimuli and attention on temporal order judgments is described. Moreover, a number of audiotactile interactions occurring during sensory-motor synchronization are highlighted. We also look at the audiotactile perception of rhythm and how it may be affected by musical training. The differences emerging from this body of research highlight the need for more extensive investigation into audiotactile temporal interactions. We conclude with a brief overview of some of the key issues deserving of further research in this area.
18
Abstract
Differences in timing control processes between tapping and circle drawing have been extensively documented during continuation timing. Differences between event and emergent control processes have also been documented for synchronization timing using emergent tasks that have minimal event-related information. However, it is not known whether the original circle-drawing task also behaves differently from tapping during synchronization. In this experiment, 10 participants performed a table-tapping and a continuous circle-drawing task to an auditory metronome. Synchronization performance was assessed via the value and variability of asynchronies. Synchronization was substantially more difficult in circle drawing than in tapping. Participants drawing timed circles exhibited drift in synchronization error and did not maintain a consistent phase relationship with the metronome. An analysis of temporal anchoring revealed that timing at the target location was no more accurate than timing at other locations on the circle trajectory. The authors conclude that, although other cyclical tasks do exhibit auditory-motor synchronization, participants could not synchronize their movements with the metronome tones in circle drawing because that task lacks both event and cycle-position information.
Affiliation(s)
- Breanna E Studenka
- Department of Kinesiology, the Pennsylvania State University, University Park, PA 16802, USA.
19
Studenka BE, Zelaznik HN. Synchronization in repetitive smooth movement requires perceptible events. Acta Psychol (Amst) 2011; 136:432-41. [PMID: 21300324] [DOI: 10.1016/j.actpsy.2011.01.011]
Abstract
Accurate timing performance during auditory-motor synchronization has been well documented for finger tapping tasks. It is believed that information pertaining to an event in movement production aids in detecting and correcting for errors between movement cycle completion and the metronome tone. Tasks with minimal event-related information exhibit more variable synchronization and less rapid error correction. Recent work from our laboratory has indicated that a task purportedly lacking an event structure (circle drawing) did not exhibit accurate synchronization or error correction (Studenka & Zelaznik, in press). In the present paper we report on two experiments examining synchronization in tapping and circle drawing tasks. In Experiment 1, error correction processes of an event-timed tapping task and an emergently timed circle drawing task were examined. Rapid and complete error correction was seen for the tapping, but not for the circle drawing task. In Experiment 2, a perceptual event was added to delineate a cycle in circle drawing, and the perceptual event of table contact was removed from the tapping task. The inclusion of an event produced a marked improvement in synchronization error correction for circle drawing, and the removal of tactile feedback (taking away an event) slightly reduced the error correction response of tapping. Furthermore, the task kinematics of circle drawing remained smooth, providing evidence that event structure can be kinematic or perceptual in nature. Thus, synchronization and error correction, characteristic of event timing (Ivry, Spencer, Zelaznik, & Diedrichsen, 2002; Repp, 2005), depend upon the presence of a distinguishable source of sensory information at the timing goal.
Affiliation(s)
- Breanna E Studenka
- Department of Health and Kinesiology, Purdue University, West Lafayette, IN 47907, USA.
20
Abstract
Synchronising movements with events in the surrounding environment is a ubiquitous aspect of everyday behaviour. Often, information about a stream of events is available across sensory modalities. While it is clear that we synchronise more accurately to auditory cues than to other modalities, little is known about how the brain combines multisensory signals to produce accurately timed actions. Here, we investigate multisensory integration for sensorimotor synchronisation. We extend the prevailing linear phase correction model for movement synchronisation, describing asynchrony variance in terms of sensory, motor and timekeeper components. Then we assess multisensory cue integration, deriving predictions based on the optimal combination of event time, defined across different sensory modalities. Participants tapped in time with metronomes presented via auditory, visual and tactile modalities, under either unimodal or bimodal presentation conditions. Temporal regularity was manipulated between modalities by applying jitter to one of the metronomes. Results matched the model predictions closely for all except high jitter level conditions in audio-visual and audio-tactile combinations, where a bias for auditory signals was observed. We suggest that, in the production of repetitive timed actions, cues are optimally integrated in terms of both sensory and temporal reliability of events. However, when temporal discrepancy between cues is high they are treated independently, with movements timed to the cue with the highest sensory reliability.
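The optimal-combination prediction tested above follows the standard maximum-likelihood cue-integration rule: each cue is weighted by its inverse variance, so the bimodal estimate has lower variance than either unimodal estimate. A minimal sketch of the generic rule (not the authors' full extended phase-correction model):

```python
def combined_variance(var_a, var_b):
    """Variance of the optimally combined event-time estimate
    from two independent cues with variances var_a and var_b."""
    return (var_a * var_b) / (var_a + var_b)

def optimal_weights(var_a, var_b):
    """Inverse-variance (maximum-likelihood) weights for the two cues."""
    w_a = var_b / (var_a + var_b)
    return w_a, 1.0 - w_a
```

For example, a reliable auditory cue (variance 1) paired with a noisier visual cue (variance 4) receives weights 0.8 and 0.2, and the bimodal variance (0.8) falls below that of the better unimodal cue; a reduction of this form is the signature prediction compared against tapping variability.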
Affiliation(s)
- M T Elliott
- School of Psychology, University of Birmingham, Edgbaston, B15 2TT, UK.
21
Tallet J, Barral J, James C, Hauert CA. Stability-dependent behavioural and electro-cortical reorganizations during intentional switching between bimanual tapping modes. Neurosci Lett 2010; 483:118-22. [PMID: 20678541] [DOI: 10.1016/j.neulet.2010.07.074]
Abstract
This study investigated behavioural and electro-cortical reorganizations accompanying intentional switching between two distinct bimanual coordination tapping modes (In-phase and Anti-phase) that differ in stability when produced at the same movement rate. We expected that switching to a less stable tapping mode (In-to-Anti switching) would lead to larger behavioural perturbations and require more neural resources than switching to a more stable tapping mode (Anti-to-In switching). Behavioural results confirmed that the In-to-Anti switching lasted longer than the Anti-to-In switching. A general increase in attention-related neural activity was found at the moment of switching for both conditions. Additionally, two condition-dependent EEG reorganizations were observed. First, a specific increase in cortico-cortical coherence appeared exclusively during the In-to-Anti switching. This result may reflect a strengthening of inter-regional communication in order to engage in the subsequent, less stable, tapping mode. Second, a decrease in motor-related neural activity (increased beta spectral power) was found for the Anti-to-In switching only. The latter effect may reflect the interruption of the previous, less stable, tapping mode. Given that previous results on spontaneous Anti-to-In switching revealed an inverse pattern of EEG reorganization (decreased beta spectral power), the present findings give new insight into the stability-dependent neural correlates of intentional motor switching.
22
Repp BH, London J, Keller PE. Perception-production relationships and phase correction in synchronization with two-interval rhythms. Psychol Res 2010; 75:227-42. [PMID: 20644955] [DOI: 10.1007/s00426-010-0301-8]
Abstract
Two experiments investigated the effects of interval duration ratio on perception of local timing perturbations, accuracy of rhythm production, and phase correction in musicians listening to or tapping in synchrony with cyclically repeated auditory two-interval rhythms. Ratios ranged from simple (1:2) to complex (7:11, 5:13), and from small (5:13 = 0.38) to large (6:7 = 0.86). Rhythm production and perception exhibited similar ratio-dependent biases: rhythms with small ratios were produced with increased ratios, and timing perturbations in these rhythms tended to be harder to detect when they locally increased the ratio than when they reduced it. The opposite held for rhythms with large ratios. This demonstrates a close relation between rhythm perception and production. Unexpectedly, however, the neutral "attractor" was not the simplest ratio (1:2 = 0.50) but a complex ratio near 4:7 (= 0.57). Phase correction in response to perturbations was generally rapid and did not show the ratio-dependent biases observed in rhythm perception and production. Thus, phase correction operates efficiently and autonomously even in synchronization with rhythms exhibiting complex interval ratios.
Affiliation(s)
- Bruno H Repp
- Haskins Laboratories, 300 George Street, New Haven, CT 06511-6624, USA.
23
Repp BH. Sensorimotor synchronization and perception of timing: Effects of music training and task experience. Hum Mov Sci 2010; 29:200-13. [PMID: 20074825] [DOI: 10.1016/j.humov.2009.08.002]
24
Abstract
The experience of musical rhythm is a remarkable psychophysical phenomenon, in part because the perception of periodicities, namely pulse and meter, arise from stimuli that are not periodic. One possible function of such a transformation is to enable synchronization between individuals through perception of a common abstract temporal structure (e.g., during music performance). Thus, understanding the brain processes that underlie rhythm perception is fundamental to explaining musical behavior. Here, we propose that neural resonance provides an excellent account of many aspects of human rhythm perception. Our framework is consistent with recent brain-imaging studies showing neural correlates of rhythm perception in high-frequency oscillatory activity, and leads to the hypothesis that perception of pulse and meter result from rhythmic bursts of high-frequency neural activity in response to musical rhythms. High-frequency bursts of activity may enable communication between neural areas, such as auditory and motor cortices, during rhythm perception and production.
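Entrainment of an intrinsic oscillation to a periodic stimulus, the core idea behind neural resonance, can be sketched with a generic pulse-coupled phase oscillator (a circle map). This is an illustrative toy model with assumed parameter values, not Large's canonical gradient-frequency oscillator network:

```python
import math

def entrained_phases(stim_period=0.5, nat_freq=1.9, coupling=0.8, n_beats=200):
    """Phase of a pulse-coupled oscillator sampled at each stimulus onset.

    Generic circle map: at each metronome onset the oscillator phase is
    nudged by -coupling * sin(phase), then advances freely by
    2*pi*nat_freq*stim_period until the next onset. All parameter
    values are illustrative assumptions.
    """
    two_pi = 2.0 * math.pi
    phase = 1.0  # arbitrary initial phase (radians)
    onsets = []
    for _ in range(n_beats):
        onsets.append(phase % two_pi)
        phase -= coupling * math.sin(phase)       # corrective kick at onset
        phase += two_pi * nat_freq * stim_period  # free run between onsets
    return onsets
```

With a 2 Hz metronome (stim_period = 0.5 s) driving a 1.9 Hz oscillator, the onset-time phase converges to a stable fixed point: the oscillator phase-locks despite the frequency mismatch, whereas with zero coupling it would drift by the frequency difference on every beat.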
Affiliation(s)
- Edward W Large
- Center for Complex Systems and Brain Sciences, Florida Atlantic University, Boca Raton, Florida 33431, USA.
25
Repp BH. Rhythmic sensorimotor coordination is resistant but not immune to auditory stream segregation. Q J Exp Psychol (Hove) 2009; 62:2306-12. [PMID: 19626575] [DOI: 10.1080/17470210903118107]
Abstract
In a recent study of musicians' sensorimotor synchronization with auditory sequences composed either of beat and subdivision tones differing in pitch or of beat tones only, Repp (2009) found that the phase correction response (PCR) to perturbed beats was inhibited by the presence of subdivisions regardless of whether beats and subdivisions formed integrated or segregated perceptual streams. The present study used a different paradigm in which perturbed subdivisions triggered the PCR. At the slower of two sequence tempi, the PCR was equally large in integrated and segregated conditions, but at the faster tempo stream segregation reduced the PCR substantially. This new finding indicates that although the PCR is strongly resistant to auditory stream segregation, it is not totally immune to it.
Affiliation(s)
- Bruno H Repp
- Haskins Laboratories, New Haven, CT 06511-6624, USA.
26
Flexibility of temporal expectations for triple subdivision of a beat. Adv Cogn Psychol 2009; 5:27-41. [PMID: 20523848] [PMCID: PMC2865005] [DOI: 10.2478/v10053-008-0063-7]
Abstract
When tapping in synchrony with an isochronous sequence of beats, participants respond automatically to an unexpectedly early or late beat by shifting their next tap; this is termed the phase correction response (PCR). A PCR has also been observed in response to unexpected perturbations of metrical subdivisions of a beat, which suggests that participants have temporal expectancies for subdivisions to occur at particular time points. It has been demonstrated that a latent temporal expectancy at 1/2 of the inter-beat interval (IBI) exists even in the absence of explicit duple subdivision in previous IBIs of a sequence. The present study asked whether latent expectancies at 1/3 and 2/3 of the IBI can be induced by a global experimental context of triple subdivision, and whether a local context of consistently phase-shifted triple subdivisions can induce different expectancies. Using the PCR as the dependent variable, we find weak evidence for latent expectancies but strong evidence for context-induced shifts in expectancies. These results suggest that temporal referents between beats, which typically are linked to simple ratios of time spans, are flexible and context-dependent. In addition, we show that the PCR, a response to expectancy violation, is independent of and sometimes contrary to the simultaneous phase adaptation required by a change in subdivision timing.
27
Repp BH. Segregated in perception, integrated for action: immunity of rhythmic sensorimotor coordination to auditory stream segregation. Q J Exp Psychol (Hove) 2008; 62:426-34. [PMID: 19037831] [DOI: 10.1080/17470210802479105]
Abstract
Auditory stream segregation can occur when tones of different pitch (A, B) are repeated cyclically: The larger the pitch separation and the faster the tempo, the more likely perception of two separate streams is to occur. The present study assessed stream segregation in perceptual and sensorimotor tasks, using identical ABBABB ... sequences. The perceptual task required detection of single phase-shifted A tones; this was expected to be facilitated by the presence of B tones unless segregation occurred. The sensorimotor task required tapping in synchrony with the A tones; here the phase correction response (PCR) to shifted A tones was expected to be inhibited by B tones unless segregation occurred. Two sequence tempi and three pitch separations (2, 10, and 48 semitones) were used with musically trained participants. Facilitation of perception occurred only at the smallest pitch separation, whereas the PCR was reduced equally at all separations. These results indicate that auditory action control is immune to perceptual stream segregation, at least in musicians. This may help musicians coordinate with diverse instruments in ensemble playing.
Affiliation(s)
- Bruno H Repp
- Haskins Laboratories, New Haven, CT 06511-6624, USA.
28
Grube M, Griffiths TD. Metricality-enhanced temporal encoding and the subjective perception of rhythmic sequences. Cortex 2008; 45:72-9. [PMID: 19058797] [DOI: 10.1016/j.cortex.2008.01.006]
Abstract
Feeling the beat of a musical piece is easier for some pieces than others, depending on the underlying metrical structure. The present study sought to determine whether increasing metricality, meaning the amount of information supporting an intended meter, would elicit a corresponding increase in the precision of the temporal encoding of rhythmic sequences. Metricality was varied i) by using the Povel and Essens (1985) model of temporal accent induction to create a strong or weak sense of meter and ii) by including metrically plausible (compact) or implausible (open) endings. Precision of temporal encoding as a function of degree of metricality was assessed in an adaptively controlled change detection task. The change to be detected was a perturbation of relative interval timing that affected sequences as a whole rather than at specific points only. Change detection thresholds were significantly lower for sequences featuring a strong compared to a weak meter, and a compact compared to an open ending. Subjective ratings of rhythmicality of sequences also yielded main effects of strength of meter and ending. The data support an increase in the precision of temporal pattern encoding for sequences with a higher-order metrical time framework.
Affiliation(s)
- Manon Grube
- Newcastle Auditory Group, Medical School, Newcastle University, Framlington Place, Newcastle-upon-Tyne, UK.