1
Mares C, Echavarría Solana R, Assaneo MF. Auditory-motor synchronization varies among individuals and is critically shaped by acoustic features. Commun Biol 2023; 6:658. PMID: 37344562. DOI: 10.1038/s42003-023-04976-y.
Abstract
The ability to synchronize body movements with quasi-regular auditory stimuli is a fundamental human trait at the core of speech and music. Despite the long history of research on this ability, little attention has been paid to how acoustic features of the stimuli and individual differences modulate auditory-motor synchrony. Here, by exploring auditory-motor synchronization across different effectors and types of stimuli, we reveal that this capability is more restricted than previously assumed. While the general population can synchronize to sequences composed of repetitions of the same acoustic unit, synchrony in a subgroup of participants is impaired when the unit's identity varies across the sequence. Synchronization in this group can, however, be temporarily restored by priming with a facilitator stimulus. Auditory-motor integration is stable across effectors, supporting the hypothesis of a central clock mechanism subserving the different articulators, one that is nonetheless critically shaped by the acoustic features of the stimulus and by individual abilities.
Affiliation(s)
- Cecilia Mares
- Institute of Neurobiology, National Autonomous University of Mexico, Juriquilla, Querétaro, Mexico
- M Florencia Assaneo
- Institute of Neurobiology, National Autonomous University of Mexico, Juriquilla, Querétaro, Mexico
2
Roman IR, Roman AS, Kim JC, Large EW. Hebbian learning with elasticity explains how the spontaneous motor tempo affects music performance synchronization. PLoS Comput Biol 2023; 19:e1011154. PMID: 37285380. DOI: 10.1371/journal.pcbi.1011154.
Abstract
A musician's spontaneous rate of movement, called spontaneous motor tempo (SMT), can be measured while the musician spontaneously plays a simple melody. Data show that the SMT influences the musician's tempo and synchronization. In this study we present a model that captures these phenomena. We review the results from three previously published studies: solo musical performance with a pacing metronome tempo that is different from the SMT, solo musical performance without a metronome at a tempo that is faster or slower than the SMT, and duet musical performance between musicians with matching or mismatching SMTs. These studies showed, respectively, that the asynchrony between the pacing metronome and the musician's tempo grew as a function of the difference between the metronome tempo and the musician's SMT, that musicians drifted away from the initial tempo toward the SMT, and that the absolute asynchronies were smaller if musicians had matching SMTs. We hypothesize that the SMT constantly acts as a pulling force on musical actions performed at a tempo different from the musician's SMT. To test our hypothesis, we developed a model consisting of a non-linear oscillator with Hebbian tempo learning and a pulling force toward the model's spontaneous frequency. While the model's spontaneous frequency emulates the SMT, elastic Hebbian learning allows the frequency to adapt toward a stimulus frequency. We first fit model parameters to match the data in the first of the three studies and then asked whether this same model would explain the data in the remaining two studies without further tuning. Results showed that the model's dynamics allowed it to explain all three experiments with the same set of parameters. Our theory offers a dynamical-systems explanation of how an individual's SMT affects synchronization in realistic music performance settings, and the model also enables predictions about performance settings not yet tested.
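The elastic-learning idea described above can be sketched as a phase oscillator whose frequency adapts toward the stimulus (Hebbian learning) while an elastic term pulls it back toward its spontaneous frequency. This is an editor's minimal illustration, not the authors' published model; all parameter values are hypothetical.

```python
import numpy as np

def entrain(f_stim=2.5, f0=2.0, lam=2.0, eta=0.9, k=0.5, T=20.0, dt=0.001):
    """Phase oscillator with elastic Hebbian frequency learning (Euler sketch).
    f0 plays the role of the SMT-like spontaneous frequency (Hz); the -k*(f-f0)
    term is the elastic pull that keeps learning from fully erasing f0."""
    theta, f = 0.0, f0
    for n in range(int(T / dt)):
        psi = theta - 2 * np.pi * f_stim * (n * dt)    # phase error vs stimulus
        theta += dt * (2 * np.pi * f - lam * np.sin(psi))
        f += dt * (-eta * np.sin(psi) - k * (f - f0))  # learn, but stay elastic
    return f

f_learned = entrain()          # settles between f0 and f_stim
f_matched = entrain(f_stim=2.0)  # matched stimulus: no drift from f0
```

With the elastic term active, the locked frequency sits between the stimulus rate and the spontaneous rate, which is the kind of signature used above to explain tempo drift toward the SMT.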
Affiliation(s)
- Iran R Roman
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, California, United States of America
- Adrian S Roman
- Department of Mathematics, University of California Davis, Davis, California, United States of America
- Ji Chul Kim
- Department of Psychological Sciences, University of Connecticut, Storrs, Connecticut, United States of America
- Edward W Large
- Department of Psychological Sciences, University of Connecticut, Storrs, Connecticut, United States of America
- Department of Physics, University of Connecticut, Storrs, Connecticut, United States of America
3
Large EW, Roman I, Kim JC, Cannon J, Pazdera JK, Trainor LJ, Rinzel J, Bose A. Dynamic models for musical rhythm perception and coordination. Front Comput Neurosci 2023; 17:1151895. PMID: 37265781. PMCID: PMC10229831. DOI: 10.3389/fncom.2023.1151895.
Abstract
Rhythmicity permeates large parts of human experience. Humans generate various motor and brain rhythms spanning a range of frequencies. We also experience and synchronize to externally imposed rhythmicity, for example from music and song or from the 24-h light-dark cycles of the sun. In the context of music, humans have the ability to perceive, generate, and anticipate rhythmic structures, for example, "the beat." Experimental and behavioral studies offer clues about the biophysical and neural mechanisms that underlie our rhythmic abilities and about the different brain areas involved, but many open questions remain. In this paper, we review several theoretical and computational approaches, each centered at a different level of description, that address specific aspects of musical rhythm generation, perception, attention, perception-action coordination, and learning. We survey methods and results from applications of dynamical systems theory, neuro-mechanistic modeling, and Bayesian inference. Some frameworks rely on synchronization of intrinsic brain rhythms that span the relevant frequency range; some formulations involve real-time adaptation schemes for error correction to align the phase and frequency of a dedicated circuit; others involve learning and dynamically adjusting expectations to make rhythm-tracking predictions. Each of the approaches, while initially designed to answer specific questions, offers the possibility of being integrated into a larger framework that provides insights into our ability to perceive and generate rhythmic patterns.
Affiliation(s)
- Edward W. Large
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Department of Physics, University of Connecticut, Mansfield, CT, United States
- Iran Roman
- Music and Audio Research Laboratory, New York University, New York, NY, United States
- Ji Chul Kim
- Department of Psychological Sciences, University of Connecticut, Mansfield, CT, United States
- Jonathan Cannon
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Jesse K. Pazdera
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Laurel J. Trainor
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- John Rinzel
- Center for Neural Science, New York University, New York, NY, United States
- Courant Institute of Mathematical Sciences, New York University, New York, NY, United States
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, United States
4
Yin B, Shi Z, Wang Y, Meck WH. Oscillation/coincidence-detection models of reward-related timing in corticostriatal circuits. Timing Time Percept 2022. DOI: 10.1163/22134468-bja10057.
Abstract
The major tenets of beat-frequency/coincidence-detection models of reward-related timing are reviewed in light of recent behavioral and neurobiological findings. This includes the emphasis on a core timing network embedded in the motor system that comprises a corticothalamic-basal ganglia circuit. Therein, a central hub provides timing pulses (i.e., predictive signals) to the entire brain, including a set of distributed satellite regions in the cerebellum, cortex, amygdala, and hippocampus that are selectively engaged in timing in a manner that depends more on the specific sensory, behavioral, and contextual requirements of the task. Oscillation/coincidence-detection models also emphasize the importance of a tuned ‘perception’ learning and memory system whereby target durations are detected by striatal networks of medium spiny neurons (MSNs) through the coincidental activation of different neural populations, typically utilizing patterns of oscillatory input from the cortex and thalamus or derivations thereof (e.g., population coding) as a time base. The measure of success of beat-frequency/coincidence-detection accounts, such as the Striatal Beat-Frequency model of reward-related timing (SBF), is their ability to accommodate new experimental findings while maintaining their original framework, thereby making testable experimental predictions concerning diagnosis and treatment of issues related to a variety of dopamine-dependent basal ganglia disorders, including Huntington’s and Parkinson’s disease.
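The coincidence-detection principle can be illustrated with a toy read-out (an editor's sketch, not the full SBF model; the oscillator frequencies and target duration below are arbitrary choices): cortical oscillators are phase-reset at trial onset, and an MSN-like unit scores how well the momentary oscillator pattern matches the pattern stored at the target duration.

```python
import numpy as np

def coincidence_score(target=0.65, t_max=2.0, dt=0.001):
    """Toy Striatal Beat-Frequency read-out: oscillators are reset at t=0;
    the score measures the match between the current oscillator pattern
    and the pattern memorized at the target time (arbitrary units)."""
    freqs = np.array([5.3, 6.1, 7.9, 8.7, 9.4])  # hypothetical cortical band (Hz)
    t = np.arange(0.0, t_max, dt)
    bank = np.cos(2 * np.pi * freqs[:, None] * t[None, :])
    memory = np.cos(2 * np.pi * freqs * target)   # pattern stored at the target
    return t, memory @ bank                       # high score = coincidence

t, score = coincidence_score()
```

The score is high near the memorized duration, because only there do the oscillators collectively re-align with the stored pattern; in the full model, different MSN weight patterns detect different target durations.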
Affiliation(s)
- Bin Yin
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
- School of Psychology, Fujian Normal University, Fuzhou, 350117, Fujian, China
- Zhuanghua Shi
- Department of Psychology, Ludwig Maximilian University of Munich, 80802 Munich, Germany
- Yaxin Wang
- School of Psychology, Fujian Normal University, Fuzhou, 350117, Fujian, China
- Warren H. Meck
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, USA
5
Zemlianova K, Bose A, Rinzel J. A biophysical counting mechanism for keeping time. Biol Cybern 2022; 116:205-218. PMID: 35031845. DOI: 10.1007/s00422-021-00915-4.
Abstract
The ability to estimate and produce appropriately timed responses is central to many behaviors including speaking, dancing, and playing a musical instrument. A classical framework for estimating or producing a time interval is the pacemaker-accumulator model in which pulses of a pacemaker are counted and compared to a stored representation. However, the neural mechanisms for how these pulses are counted remain an open question. The presence of noise and stochasticity further complicates the picture. We present a biophysical model of how to keep count of a pacemaker in the presence of various forms of stochasticity using a system of bistable Wilson-Cowan units asymmetrically connected in a one-dimensional array; all units receive the same input pulses from a central clock but only one unit is active at any point in time. With each pulse from the clock, the position of the activated unit changes thereby encoding the total number of pulses emitted by the clock. This neural architecture maps the counting problem into the spatial domain, which in turn translates count to a time estimate. We further extend the model to a hierarchical structure to be able to robustly achieve higher counts.
Affiliation(s)
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, USA
- John Rinzel
- Center for Neural Science, New York University, New York, NY, USA
- Courant Institute of Mathematical Sciences, New York University, New York, NY, USA
6
Savinov M, Swigon D, Ermentrout B. Synchronization and locking in oscillators with flexible periods. Chaos 2021; 31:033143. PMID: 33810738. DOI: 10.1063/5.0021836.
Abstract
Entrainment of a nonlinear oscillator by a periodic external force is a much studied problem in nonlinear dynamics and characterized by the well-known Arnold tongues. The circle map is the simplest such system allowing for stable N:M entrainment where the oscillator produces N cycles for every M stimulus cycles. There are a number of experiments that suggest that entrainment to external stimuli can involve both a shift in the phase and an adjustment of the intrinsic period of the oscillator. Motivated by a recent model of Loehr et al. [J. Exp. Psychol.: Hum. Percept. Perform. 37, 1292 (2011)], we explore a two-dimensional map in which the phase and the period are allowed to update as a function of the phase of the stimulus. We characterize the number and stability of fixed points for different N:M-locking regions, specifically, 1:1, 1:2, 2:3, and their reciprocals, as a function of the sensitivities of the phase and period to the stimulus as well as the degree that the oscillator has a preferred period. We find that even in the limited number of locking regimes explored, there is a great deal of multi-stability of locking modes, and the basins of attraction can be complex and riddled. We also show that when the forcing period changes between a starting and final period, the rate of this change determines, in a complex way, the final locking pattern.
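The joint updating of phase and period, with a pull toward a preferred period, can be illustrated with a linear error-correction map (an editor's sketch in the spirit of the maps discussed above, not the specific map analyzed in the paper; the gains are arbitrary):

```python
def step(e, p, s=1.0, alpha=0.4, beta=0.2, gamma=0.05, p_pref=1.1):
    """One iterate of a 2D map: e is the asynchrony to the stimulus, p the
    oscillator's period, s the stimulus period. alpha: phase correction,
    beta: period correction, gamma: pull toward the preferred period p_pref."""
    e_next = e + p - alpha * e - s               # asynchrony after the next cycle
    p_next = p - beta * e + gamma * (p_pref - p)
    return e_next, p_next

e, p = 0.2, 1.2
for _ in range(300):                             # settles into 1:1 locking
    e, p = step(e, p)
```

With gamma > 0 the locked period never reaches s exactly: the preferred period biases both the residual asynchrony and the adapted period, one way a "flexible" period still leaves a fingerprint of the intrinsic one.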
Affiliation(s)
- Mariya Savinov
- Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA
- David Swigon
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA
- Bard Ermentrout
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA
7
Marcano M, Bose A, Bayman P. A one-dimensional map to study multi-seasonal coffee infestation by the coffee berry borer. Math Biosci 2021; 333:108530. PMID: 33484730. DOI: 10.1016/j.mbs.2020.108530.
Abstract
The coffee berry borer (CBB, Hypothenemus hampei) is the most serious insect pest of coffee worldwide; understanding the dynamics of its reproduction is essential for pest management. The female CBB penetrates the coffee berry, eats the seed, and reproduces inside it. A mathematical model of the infestation of the coffee berry by the CBB over several coffee seasons is formulated. The model represents the interaction among five populations: uninfested, slightly infested, and severely infested coffee berries, and free and encapsulated CBBs. Coffee harvesting is also included in the model. A one-dimensional map is derived for tracking the population dynamics subject to given coffee harvesting percentages over several seasons. Stability analysis of the map's fixed points shows that CBB infestation could be eliminated or controlled to a specific level over multiple seasons of coffee harvesting. However, the harvesting percentage required depends on the level of CBB infestation at the beginning of the first season, and in some cases that percentage is impossible to achieve.
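The flavor of this kind of fixed-point analysis can be shown with a generic growth-then-harvest map (purely illustrative; the paper derives its map from the five-population model, and the form below is not theirs):

```python
def season(x, r=2.5, h=0.2):
    """Illustrative one-season map: x is the infested fraction at the start
    of a season, r an effective within-season CBB growth factor, h the
    fraction of berries removed at harvest. NOT the paper's derived map."""
    return (1 - h) * r * x * (1 - x)

x_light, x_heavy = 0.3, 0.3
for _ in range(100):
    x_light = season(x_light, h=0.2)  # light harvest: infestation persists
    x_heavy = season(x_heavy, h=0.7)  # heavy harvest: pest-free state is stable
```

In this toy map the pest-free fixed point x* = 0 is stable exactly when (1 - h) r < 1, i.e. when the harvest fraction exceeds 1 - 1/r; below that threshold the dynamics settle at an endemic level x* = 1 - 1/((1 - h) r), mirroring the elimination-versus-control dichotomy described above.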
Affiliation(s)
- Mariano Marcano
- Department of Computer Science, University of Puerto Rico, Río Piedras Campus, San Juan, PR, 00931, USA.
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, NJ, 07102, USA
- Paul Bayman
- Department of Biology, University of Puerto Rico, Río Piedras Campus, PO Box 23360, San Juan, PR, 00931, USA
8
Abstract
Humans and animals can effortlessly coordinate their movements with external stimuli. This capacity indicates that sensory inputs can rapidly and flexibly reconfigure the ongoing dynamics in the neural circuits that control movements. Here, we develop a circuit-level model that coordinates movement times with expected and unexpected temporal events. The model consists of two interacting modules, a motor planning module that controls movement times and a sensory anticipation module that anticipates external events. Both modules harbor a reservoir of latent dynamics, and their interaction forms a control system whose output is adjusted adaptively to minimize timing errors. We show that the model’s output matches human behavior in a range of tasks including time interval production, periodic production, synchronization/continuation, and Bayesian time interval reproduction. These results demonstrate how recurrent interactions in a simple and modular neural circuit could create the dynamics needed to control timing behavior.

We can flexibly coordinate our movements with external stimuli, but no circuit-level model exists to explain this ability. Inspired by fundamental concepts in control theory, the authors construct a modular neural circuit that captures human behavior in a wide range of temporal coordination tasks.
9
Byrne Á, Rinzel J, Bose A. Order-indeterminant event-based maps for learning a beat. Chaos 2020; 30:083138. PMID: 32872826. DOI: 10.1063/5.0013771.
Abstract
The process by which humans synchronize to a musical beat is believed to occur through error-correction, where an individual's estimates of the period and phase of the beat time are iteratively adjusted to align with an external stimulus. Mathematically, error-correction can be described using a two-dimensional map where convergence to a fixed point corresponds to synchronizing to the beat. In this paper, we show how a neural system, called a beat generator, learns to adapt its oscillatory behavior through error-correction to synchronize to an external periodic signal. We construct a two-dimensional event-based map, which iteratively adjusts an internal parameter of the beat generator to speed up or slow down its oscillatory behavior and bring it into synchrony with the periodic stimulus. The map is novel in that the order of the events defining it is not known a priori. Instead, the type of error-correction adjustment made at each iterate of the map is determined by a sequence of expected events. The map possesses a rich repertoire of dynamics, including periodic solutions and chaotic orbits.
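The event-based character of such a map, where the update depends on which events have actually occurred rather than on a fixed iteration order, can be sketched as follows (an editor's simplified beat generator, not the paper's map; the gains are arbitrary):

```python
def learn_beat(stim_period=0.8, p0=1.1, alpha=0.35, beta=0.15, n_stim=80):
    """Event-based sketch: beats are emitted at the generator's current period;
    at each stimulus onset, the error to the NEAREST beat (before or after, so
    the event order is not fixed a priori) corrects both phase and period."""
    p, beat, stim = p0, 0.0, 0.0
    last_err = None
    for _ in range(n_stim):
        while beat + p <= stim:       # emit beats until they bracket the stimulus
            beat += p
        before, after = beat - stim, beat + p - stim
        err = before if abs(before) < abs(after) else after
        beat -= alpha * err           # phase correction of the upcoming beat
        p -= beta * err               # period correction (the internal parameter)
        stim += stim_period
        last_err = err
    return p, last_err

p_final, err_final = learn_beat()
```

Starting well away from the stimulus period, the generator's period converges to it and the beat-to-stimulus error shrinks toward zero; with other gain choices, maps of this type can instead settle into the periodic or chaotic regimes described above.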
Affiliation(s)
- Áine Byrne
- School of Mathematics and Statistics, University College Dublin, Dublin 4, Ireland
- John Rinzel
- Center for Neural Science, New York University, New York, New York 10003, USA
- Amitabha Bose
- Department of Mathematical Sciences, New Jersey Institute of Technology, Newark, New Jersey 07102, USA
10
Roman IR, Washburn A, Large EW, Chafe C, Fujioka T. Delayed feedback embedded in perception-action coordination cycles results in anticipation behavior during synchronized rhythmic action: a dynamical systems approach. PLoS Comput Biol 2019; 15:e1007371. PMID: 31671096. PMCID: PMC6822724. DOI: 10.1371/journal.pcbi.1007371.
Abstract
Dancing and playing music require people to coordinate actions with auditory rhythms. In laboratory perception-action coordination tasks, people are asked to synchronize taps with a metronome. When synchronizing with a metronome, people tend to anticipate stimulus onsets, tapping slightly before the stimulus. The anticipation tendency increases with stimulus periods of up to 3500 ms and is less pronounced in trained individuals, such as musicians, than in non-musicians. Furthermore, external factors influence the timing of tapping, including the presence of auditory feedback from one’s own taps, the presence of a partner performing coordinated joint tapping, and transmission latencies (TLs) between coordinating partners. Phenomena like the anticipation tendency can be explained by delay-coupled systems, which may be inherent to the sensorimotor system during perception-action coordination. Here we tested whether a dynamical systems model based on this hypothesis reproduces observed patterns of human synchronization. We simulated behavior with a model consisting of an oscillator receiving its own delayed activity as input. Three simulation experiments were conducted using previously published behavioral data from 1) simple tapping, 2) two-person alternating beat-tapping, and 3) two-person alternating rhythm-clapping in the presence of a range of constant auditory TLs. In Experiment 1, our model replicated the larger anticipation observed for longer stimulus intervals, and adjusting the amplitude of the delayed feedback reproduced the difference between musicians and non-musicians. In Experiment 2, by connecting two models we replicated the smaller anticipation observed in human joint tapping with bi-directional auditory feedback compared to joint tapping without feedback. In Experiment 3, we varied TLs between two models alternately receiving signals from one another. Results showed reciprocal lags at points of alternation, consistent with behavioral patterns. Overall, our model explains various anticipatory behaviors and has the potential to inform theories of adaptive human synchronization.

When navigating a busy sidewalk, people coordinate their behavior in an orderly manner. Other activities require people to carefully synchronize periodic actions, as in group rowing or marching. When individuals tap in synchrony with a metronome, their taps tend to anticipate the metronome. Experiments have revealed that factors like musical expertise, the presence of a synchronizing partner, auditory feedback, and sound travel time all systematically affect the tendency to anticipate. While researchers have hypothesized a number of potential mechanisms for such anticipatory behavior, none have successfully accounted for all of these effects. Previous research on coupled physical systems has shown that when one system receives input from a second system plus its own delayed signal, the first system anticipates the second. We hypothesize that the tendency to anticipate results from delayed communication between neurons. Our work demonstrates the ability of delay-coupled systems to capture human anticipation and the effects of external factors on the anticipation tendency, and it supports the theory that delayed communication within the nervous system is crucial to understanding anticipatory coordinative behavior.
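The core delay-coupling mechanism (anticipating synchronization) can be illustrated with a phase-only sketch. This is not the authors' oscillator model, and all parameters are hypothetical; it only shows how feeding an oscillator its own delayed activity produces a stable lead over the stimulus.

```python
import numpy as np

def anticipation(omega=2 * np.pi, k=8.0, tau=0.08, T=30.0, dt=0.001):
    """Phase oscillator coupled to the metronome through its OWN delayed
    phase: d(theta)/dt = omega + k*sin(stim_phase(t) - theta(t - tau)).
    The stable locked state LEADS the stimulus, mimicking the human
    anticipation tendency."""
    n, d = int(T / dt), int(tau / dt)
    theta = np.zeros(n)
    for i in range(n - 1):
        delayed = theta[i - d] if i >= d else theta[0]
        theta[i + 1] = theta[i] + dt * (omega + k * np.sin(omega * i * dt - delayed))
    t = np.arange(n) * dt
    lead = np.angle(np.exp(1j * (theta - omega * t)))[-1000:].mean()
    return lead / omega       # phase lead converted to a time lead (seconds)

time_lead = anticipation()    # positive: the oscillator fires ahead of the beat
```

In this sketch the locked phase lead equals roughly omega*tau, i.e. a time lead of about the feedback delay tau; modulating the feedback strength is the kind of knob the experiments above use to capture musician versus non-musician differences.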
Affiliation(s)
- Iran R. Roman
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, United States of America
- Stanford Neurosciences Graduate Training Program, Stanford University, Stanford, United States of America
- Auriel Washburn
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, United States of America
- Department of Computer Science and Engineering, University of California San Diego, La Jolla, United States of America
- Edward W. Large
- Department of Psychological Sciences, University of Connecticut, Storrs, United States of America
- Department of Physics, University of Connecticut, Storrs, United States of America
- Chris Chafe
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, United States of America
- Takako Fujioka
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, United States of America
- Wu Tsai Neurosciences Institute, Stanford University, Stanford, United States of America
11
Rankin J, Rinzel J. Computational models of auditory perception from feature extraction to stream segregation and behavior. Curr Opin Neurobiol 2019; 58:46-53. PMID: 31326723. DOI: 10.1016/j.conb.2019.06.009.
Abstract
Audition is by nature dynamic, from brainstem processing on sub-millisecond time scales, to segregating and tracking sound sources with changing features, to the pleasure of listening to music and the satisfaction of getting the beat. We review recent advances from computational models of sound localization, of auditory stream segregation and of beat perception/generation. A wealth of behavioral, electrophysiological and imaging studies shed light on these processes, typically with synthesized sounds having regular temporal structure. Computational models integrate knowledge from different experimental fields and at different levels of description. We advocate a neuromechanistic modeling approach that incorporates knowledge of the auditory system from various fields, that utilizes plausible neural mechanisms, and that bridges our understanding across disciplines.
Affiliation(s)
- James Rankin
- College of Engineering, Mathematics and Physical Sciences, University of Exeter, Harrison Building, North Park Rd, Exeter EX4 4QF, UK.
- John Rinzel
- Center for Neural Science, New York University, 4 Washington Place, 10003 New York, NY, United States
- Courant Institute of Mathematical Sciences, New York University, 251 Mercer St, 10012 New York, NY, United States