1. Zoefel B, Kösem A. Neural tracking of continuous acoustics: properties, speech-specificity and open questions. Eur J Neurosci 2024;59:394-414. PMID: 38151889. DOI: 10.1111/ejn.16221.
Abstract
Human speech is a particularly relevant acoustic stimulus for our species, due to its role in information transmission during communication. Speech is inherently a dynamic signal, and a recent line of research has focused on neural activity that follows the temporal structure of speech. We review findings that characterise neural dynamics in the processing of continuous acoustics and that allow us to compare these dynamics with temporal aspects of human speech. We highlight properties and constraints shared by neural and speech dynamics, suggesting that auditory neural systems are optimised to process human speech. We then discuss the speech-specificity of neural dynamics and their potential mechanistic origins, and summarise open questions in the field.
Affiliation(s)
- Benedikt Zoefel
- Centre de Recherche Cerveau et Cognition (CerCo), CNRS UMR 5549, Toulouse, France
- Université de Toulouse III Paul Sabatier, Toulouse, France
- Anne Kösem
- Lyon Neuroscience Research Center (CRNL), INSERM U1028, Bron, France
2. Lenc T, Peter V, Hooper C, Keller PE, Burnham D, Nozaradan S. Infants show enhanced neural responses to musical meter frequencies beyond low-level features. Dev Sci 2023;26:e13353. PMID: 36415027. DOI: 10.1111/desc.13353.
Abstract
Music listening often entails spontaneous perception and body movement to a periodic pulse-like meter. There is increasing evidence that this cross-cultural ability relates to neural processes that selectively enhance metric periodicities, even when these periodicities are not prominent in the acoustic stimulus. However, whether these neural processes emerge early in development remains largely unknown. Here, we recorded the electroencephalogram (EEG) of 20 healthy 5- to 6-month-old infants, while they were exposed to two rhythms known to induce the perception of meter consistently across Western adults. One rhythm contained prominent acoustic periodicities corresponding to the meter, whereas the other rhythm did not. Infants showed significantly enhanced representations of meter periodicities in their EEG responses to both rhythms. This effect is unlikely to reflect the tracking of salient acoustic features in the stimulus, as it was observed irrespective of the prominence of meter periodicities in the audio signals. Moreover, as previously observed in adults, the neural enhancement of meter was greater when the rhythm was delivered by low-pitched sounds. Together, these findings indicate that the endogenous enhancement of metric periodicities beyond low-level acoustic features is a neural property that is already present soon after birth. These high-level neural processes could set the stage for internal representations of musical meter that are critical for human movement coordination during rhythmic musical behavior.
Research highlights:
- 5- to 6-month-old infants were presented with auditory rhythms that induce the perception of a periodic pulse-like meter in adults.
- Infants showed selective enhancement of EEG activity at meter-related frequencies irrespective of the prominence of these frequencies in the stimulus.
- Responses at meter-related frequencies were boosted when the rhythm was conveyed by bass sounds.
- High-level neural processes that transform rhythmic auditory stimuli into internal meter templates emerge early after birth.
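The frequency-domain analysis behind findings like these (often called frequency tagging) amounts to measuring EEG amplitude at stimulation- and meter-related frequencies. The following is a minimal pure-Python sketch, not the authors' pipeline: the single-bin DFT and the simple enhancement ratio are illustrative simplifications (published analyses typically subtract noise estimated from neighbouring frequency bins and z-score the result).

```python
import math

def dft_amplitude(signal, freq, fs):
    """Amplitude of `signal` (a list of samples) at `freq` Hz via a single-bin DFT.

    For an exact bin, `freq` should be an integer multiple of fs / len(signal).
    """
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return 2.0 * math.hypot(re, im) / n

def meter_enhancement(signal, fs, meter_freqs, all_freqs):
    """Ratio of mean amplitude at meter-related frequencies to the mean
    amplitude over all frequencies of interest; > 1 indicates relative
    enhancement of the meter frequencies."""
    amps = {f: dft_amplitude(signal, f, fs) for f in all_freqs}
    meter = sum(amps[f] for f in meter_freqs) / len(meter_freqs)
    overall = sum(amps.values()) / len(amps)
    return meter / overall
```

For a synthetic signal oscillating only at a 1.25 Hz meter-related frequency, the ratio exceeds 1, whereas a signal with power spread evenly over the frequencies of interest yields a ratio near 1.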
Affiliation(s)
- Tomas Lenc
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Varghese Peter
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Queensland, Australia
- Caitlin Hooper
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Center for Music in the Brain & Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Denis Burnham
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
3. Modeling enculturated bias in entrainment to rhythmic patterns. PLoS Comput Biol 2022;18:e1010579. PMID: 36174063. PMCID: PMC9553061. DOI: 10.1371/journal.pcbi.1010579.
Abstract
Long-term and culture-specific experience of music shapes rhythm perception, leading to enculturated expectations that make certain rhythms easier to track and more conducive to synchronized movement. However, the influence of enculturated bias on the moment-to-moment dynamics of rhythm tracking is not well understood. Recent modeling work has formulated entrainment to rhythms as a formal inference problem, where phase is continuously estimated based on precise event times and their correspondence to timing expectations: PIPPET (Phase Inference from Point Process Event Timing). Here we propose that the problem of optimally tracking a rhythm also requires an ongoing process of inferring which pattern of event timing expectations is most suitable to predict a stimulus rhythm. We formalize this insight as an extension of PIPPET called pPIPPET (PIPPET with pattern inference). The variational solution to this problem introduces terms representing the likelihood that a stimulus is based on a particular member of a set of event timing patterns, which we initialize according to culturally-learned prior expectations of a listener. We evaluate pPIPPET in three experiments. First, we demonstrate that pPIPPET can qualitatively reproduce enculturated bias observed in human tapping data for simple two-interval rhythms. Second, we simulate categorization of a continuous three-interval rhythm space by Western-trained musicians through derivation of a comprehensive set of priors for pPIPPET from metrical patterns in a sample of Western rhythms. Third, we simulate iterated reproduction of three-interval rhythms, and show that models configured with notated rhythms from different cultures exhibit both universal and enculturated biases as observed experimentally in listeners from those cultures. 
These results suggest that the influence of enculturated timing expectations on human perceptual and motor entrainment can be understood as approximating optimal inference about the rhythmic stimulus, with respect to prototypical patterns in an empirical sample of rhythms that represent the music-cultural environment of the listener.

Author summary: Cross-cultural studies have highlighted that listeners from non-Western cultures can precisely tap along with complex rhythms present in music from their culture that are challenging for participants from Western cultures. Therefore, while most adults can synchronize movements with simple periodic patterns (e.g. a ticking clock, a metronome), the ability to precisely track more complex rhythmic patterns depends on musical experience. Many computer models have been developed to describe the remarkable precision of human "entrainment", but they have done little to explain how this ability depends on cultural musical experience. Here, we describe this as the problem of estimating, in real time, the phase of a cycle underlying an auditory rhythm, by drawing upon learned patterns (reference structures) that could plausibly describe the structure of observed events. By creating a model that solves this inference problem, and configuring these patterns to reflect specific musical features, we are able to simulate cultural variation in synchronization to rhythm. These results highlight that while humans universally move to musical rhythm, the ability to do so depends on musical experience within a cultural tradition, as reflected by the distinct "categories" of rhythm learned during such experience.
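The phase-inference scheme described in this abstract can be caricatured in a few lines. This is a deliberately simplified sketch, not the published PIPPET/pPIPPET variational filter: it maintains a Gaussian belief over phase and nudges it toward the nearest expected event in a learned timing pattern; the parameter names and the Kalman-style update are illustrative assumptions.

```python
import math

def track_phase(event_times, pattern, period=1.0, sigma_obs=0.05, sigma_drift=0.02):
    """Toy phase tracker inspired by PIPPET (illustrative, not the published model).

    event_times: observed event onsets in seconds.
    pattern: expected event phases within one cycle, values in [0, 1).
    Maintains a Gaussian belief (mu, var) over elapsed phase in cycles:
    between events the belief advances and widens; at each event it is
    pulled toward the nearest expected event phase, weighted by precision.
    """
    mu, var = 0.0, 0.01
    t_prev = 0.0
    estimates = []
    for t in event_times:
        dt = t - t_prev
        mu += dt / period                    # predict: phase advances...
        var += sigma_drift ** 2 * dt         # ...and uncertainty grows
        # Candidate expected event phases in the current and next cycle.
        cycle = math.floor(mu)
        candidates = [cycle + p for p in pattern] + [cycle + 1 + p for p in pattern]
        expected = min(candidates, key=lambda e: abs(e - mu))
        gain = var / (var + sigma_obs ** 2)  # Kalman-style correction
        mu += gain * (expected - mu)
        var *= (1.0 - gain)
        estimates.append(mu)
        t_prev = t
    return estimates
```

A pattern-inference layer (the "p" in pPIPPET) would additionally keep a posterior over several candidate `pattern`s, reweighting each by how well it predicts the observed event times.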
4. Sauvé SA, Bolt ELW, Nozaradan S, Zendel BR. Aging effects on neural processing of rhythm and meter. Front Aging Neurosci 2022;14:848608. PMID: 36118692. PMCID: PMC9475293. DOI: 10.3389/fnagi.2022.848608.
Abstract
When listening to musical rhythm, humans can perceive and move to beat-like metrical pulses. Recently, it has been hypothesized that meter perception is related to brain activity responding to the acoustic fluctuation of the rhythmic input, with selective enhancement of the brain response elicited at meter-related frequencies. In the current study, electroencephalography (EEG) was recorded while younger (<35 years) and older (>60 years) adults listened to rhythmic patterns presented at two different tempi while intermittently performing a tapping task. Despite significant hearing loss relative to younger adults, older adults showed preserved brain responses to the rhythms. However, age effects were observed in the distribution of amplitude across frequencies. Specifically, in contrast with younger adults, older adults showed relatively larger amplitude at the frequency corresponding to the rate of the individual events making up the rhythms, compared with lower meter-related frequencies. This difference is compatible with the larger N1-P2 potentials generally observed in older adults in response to acoustic onsets, irrespective of meter perception. These larger low-level responses to sounds have been linked to cortical sensory mechanisms that compensate for age-related hearing loss. Importantly, this low-level effect was associated here with relatively reduced neural activity at the lower frequencies corresponding to higher-level metrical grouping of the acoustic events, compared with younger adults.
5. Vuust P, Heggli OA, Friston KJ, Kringelbach ML. Music in the brain. Nat Rev Neurosci 2022;23:287-305. PMID: 35352057. DOI: 10.1038/s41583-022-00578-5.
Abstract
Music is ubiquitous across human cultures - as a source of affective and pleasurable experience, moving us both physically and emotionally - and learning to play music shapes both brain structure and brain function. Music processing in the brain - namely, the perception of melody, harmony and rhythm - has traditionally been studied as an auditory phenomenon using passive listening paradigms. However, when listening to music, we actively generate predictions about what is likely to happen next. This enactive aspect has led to a more comprehensive understanding of music processing involving brain structures implicated in action, emotion and learning. Here we review the cognitive neuroscience literature on music perception. We show that music perception, action, emotion and learning all rest on the human brain's fundamental capacity for prediction - as formulated by the predictive coding of music model. This Review elucidates how this formulation of music perception and expertise in individuals can be extended to account for the dynamics and underlying brain mechanisms of collective music making. This in turn has important implications for human creativity as evinced by music improvisation. These recent advances shed new light on what makes music meaningful from a neuroscientific perspective.
Affiliation(s)
- Peter Vuust
- Center for Music in the Brain, Aarhus University and The Royal Academy of Music (Det Jyske Musikkonservatorium), Aarhus, Denmark.
- Ole A Heggli
- Center for Music in the Brain, Aarhus University and The Royal Academy of Music (Det Jyske Musikkonservatorium), Aarhus, Denmark
- Karl J Friston
- Wellcome Centre for Human Neuroimaging, University College London, London, UK
- Morten L Kringelbach
- Center for Music in the Brain, Aarhus University and The Royal Academy of Music (Det Jyske Musikkonservatorium), Aarhus, Denmark
- Department of Psychiatry, University of Oxford, Oxford, UK
- Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, Oxford, UK
6. Alemi R, Nozaradan S, Lehmann A. Free-Field Cortical Steady-State Evoked Potentials in Cochlear Implant Users. Brain Topogr 2021;34:664-680. PMID: 34185222. DOI: 10.1007/s10548-021-00860-2.
Abstract
Auditory steady-state evoked potentials (SS-EPs) are phase-locked neural responses to periodic stimuli, believed to reflect specific neural generators. As an objective measure, steady-state responses have been used in various clinical settings, including measuring hearing thresholds in normal-hearing and hearing-impaired subjects. Recent studies favor recording these responses as part of the cochlear implant (CI) device-fitting procedure. Given these potential benefits, the goals of the present study were to assess the feasibility of recording free-field SS-EPs in CI users and to compare their characteristics between CI users and controls. Taking advantage of a recently developed dual-frequency tagging method, we attempted to record subcortical and cortical SS-EPs from adult CI users and controls, and measured reliable subcortical and cortical SS-EPs in the control group. Independent component analysis (ICA) was used to remove CI stimulation artifacts, yet the subcortical responses of several CI users were heavily contaminated by these artifacts. Consequently, only cortical SS-EPs were compared between groups; these were larger in the controls. The lower amplitude of cortical SS-EPs in CI users might indicate reduced neural synchrony evoked by the modulation rate of the auditory input across different neural assemblies in the auditory pathway. The brain topographies of cortical auditory SS-EPs, the time course of cortical responses, and the reconstructed cortical maps were highly similar between groups, confirming their neural origin and the possibility of obtaining such responses in CI recipients. As for subcortical SS-EPs, our results highlight the need for sophisticated denoising algorithms to pinpoint and remove artifactual components from the biological response.
Affiliation(s)
- Razieh Alemi
- Faculty of Medicine, Department of Otolaryngology, McGill University, Montreal, QC, Canada.
- Centre for Research On Brain, Language & Music (CRBLM), Montreal, Canada.
- International Laboratory for Brain, Music & Sound Research (BRAMS), Montreal, QC, Canada.
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Ottignies-Louvain-la-Neuve, Belgium
- Alexandre Lehmann
- Faculty of Medicine, Department of Otolaryngology, McGill University, Montreal, QC, Canada
- Centre for Research On Brain, Language & Music (CRBLM), Montreal, Canada
- International Laboratory for Brain, Music & Sound Research (BRAMS), Montreal, QC, Canada
7. Nijhuis P, Keller PE, Nozaradan S, Varlet M. Dynamic modulation of cortico-muscular coupling during real and imagined sensorimotor synchronisation. Neuroimage 2021;238:118209. PMID: 34051354. DOI: 10.1016/j.neuroimage.2021.118209.
Abstract
People have a natural and intrinsic ability to coordinate body movements with rhythms surrounding them, known as sensorimotor synchronisation. This can be observed in daily environments, when dancing or singing along with music, or spontaneously walking, talking or applauding in synchrony with one another. However, the neurophysiological mechanisms underlying accurate movement synchronisation with selected rhythms in the environment remain unclear. Here we studied real and imagined sensorimotor synchronisation with interleaved auditory and visual rhythms using cortico-muscular coherence (CMC) to better understand the processes underlying the preparation and execution of synchronised movement. Electroencephalography (EEG), electromyography (EMG) from the finger flexors, and continuous force signals were recorded in 20 participants during tapping and imagined tapping with discrete stimulus sequences consisting of alternating auditory beeps and visual flashes. The results show that the synchronisation between cortical and muscular activity in the beta (14-38 Hz) frequency band becomes time-locked to the taps executed in synchrony with the visual and auditory stimuli. Dynamic modulation in CMC also occurred when participants imagined tapping with the visual stimuli, but with lower amplitude and a different temporal profile compared to real tapping. These results suggest that CMC reflects changes related not only to the production of the synchronised movement, but also to its preparation, which appears heightened under the higher attentional demands imposed when synchronising with the visual stimuli. These findings highlight a critical role of beta band neural oscillations in the cortico-muscular coupling underlying sensorimotor synchronisation.
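Cortico-muscular coherence quantifies how consistent the phase and amplitude relationship between EEG and EMG is across trials at a given frequency. The following is a minimal sketch under simplifying assumptions (trial-segmented signals, a single DFT bin); real CMC analyses, such as Welch-averaged spectra across the 14-38 Hz beta band, are more involved, and the function names here are illustrative.

```python
import cmath
import math

def dft_bin(x, k):
    """Complex DFT coefficient of sequence x at integer bin k."""
    n = len(x)
    return sum(x[i] * cmath.exp(-2j * math.pi * k * i / n) for i in range(n))

def cortico_muscular_coherence(eeg_trials, emg_trials, k):
    """Magnitude-squared coherence at bin k, with spectra averaged over trials.

    Returns a value in [0, 1]: 1 when the EEG-EMG phase lag is identical in
    every trial, near 0 when the lag is random across trials.
    """
    cross = sum(dft_bin(e, k) * dft_bin(m, k).conjugate()
                for e, m in zip(eeg_trials, emg_trials))
    auto_e = sum(abs(dft_bin(e, k)) ** 2 for e in eeg_trials)
    auto_m = sum(abs(dft_bin(m, k)) ** 2 for m in emg_trials)
    return abs(cross) ** 2 / (auto_e * auto_m)
```

Note that averaging across trials is what makes the measure informative: for a single trial the ratio is identically 1, so coherence only drops below 1 when the EEG-EMG phase relationship varies from trial to trial.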
Affiliation(s)
- Patti Nijhuis
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia.
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- School of Psychology, Western Sydney University, Sydney, Australia
8. Gilmore SA, Russo FA. Neural and Behavioral Evidence for Vibrotactile Beat Perception and Bimodal Enhancement. J Cogn Neurosci 2021;33:635-650. PMID: 33475449. DOI: 10.1162/jocn_a_01673.
Abstract
The ability to synchronize movements to a rhythmic stimulus, referred to as sensorimotor synchronization (SMS), is a behavioral measure of beat perception. Although SMS is generally superior when rhythms are presented in the auditory modality, recent research has demonstrated near-equivalent SMS for vibrotactile presentations of isochronous rhythms [Ammirante, P., Patel, A. D., & Russo, F. A. Synchronizing to auditory and tactile metronomes: A test of the auditory-motor enhancement hypothesis. Psychonomic Bulletin & Review, 23, 1882-1890, 2016]. The current study aimed to replicate and extend this study by incorporating a neural measure of beat perception. Nonmusicians were asked to tap to rhythms or to listen passively while EEG data were collected. Rhythmic complexity (isochronous, nonisochronous) and presentation modality (auditory, vibrotactile, bimodal) were fully crossed. Tapping data were consistent with those observed by Ammirante et al. (2016), revealing near-equivalent SMS for isochronous rhythms across modality conditions and a drop-off in SMS for nonisochronous rhythms, especially in the vibrotactile condition. EEG data revealed a greater degree of neural entrainment for isochronous compared to nonisochronous trials as well as for auditory and bimodal compared to vibrotactile trials. These findings led us to three main conclusions. First, isochronous rhythms lead to higher levels of beat perception than nonisochronous rhythms across modalities. Second, beat perception is generally enhanced for auditory presentations of rhythm but still possible under vibrotactile presentation conditions. Finally, exploratory analysis of neural entrainment at harmonic frequencies suggests that beat perception may be enhanced for bimodal presentations of rhythm.