1
Wu Q, Sun L, Ding N, Yang Y. Musical tension is affected by metrical structure dynamically and hierarchically. Cogn Neurodyn 2024; 18:1955-1976. PMID: 39104669; PMCID: PMC11297889; DOI: 10.1007/s11571-023-10058-w
Abstract
As the basis of musical emotions, dynamic tension is experienced by listeners as music unfolds over time. The effects of harmonic and melodic structures on tension have been widely investigated; however, the potential roles of metrical structures in tension perception remain largely unexplored. This experiment examined how different metrical structures affect the experience of tension and explored the underlying neural activities. The electroencephalogram (EEG) was recorded while participants listened to musical meter sequences and simultaneously rated their subjective tension. On the large time scale of whole meter sequences, metrical structures with different periods of strong beats elicited different overall tension and different low-frequency (1-4 Hz) steady-state evoked potentials, and higher overall tension was associated with metrical structures with shorter intervals between strong beats. On the small time scale of individual measures, dynamic tension fluctuations within measures were associated with periodic modulations of high-frequency (10-25 Hz) neural activities. Comparisons between the same beats within measures and across different meters, on both small and large time scales, verified the contextual effects of meter on beat-induced tension. Our findings suggest that overall tension is determined by the temporal intervals between strong beats, and that the dynamic experience of tension may arise from cognitive processing of hierarchical temporal expectation and attention, which we discuss under the theoretical frameworks of metrical hierarchy, musical expectation, and dynamic attention.
Affiliation(s)
- Qiong Wu
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, No. 16 Lincui Road, Chaoyang District, Beijing, 100101 China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Lijun Sun
- College of Arts, Nanjing University of Aeronautics and Astronautics, Nanjing, China
- Nai Ding
- Key Laboratory for Biomedical Engineering of Ministry of Education, College of Biomedical Engineering and Instrument Sciences, Zhejiang University, Hangzhou, China
- Yufang Yang
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, No. 16 Lincui Road, Chaoyang District, Beijing, 100101 China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
2
Teng X, Larrouy-Maestri P, Poeppel D. Segmenting and Predicting Musical Phrase Structure Exploits Neural Gain Modulation and Phase Precession. J Neurosci 2024; 44:e1331232024. PMID: 38926087; PMCID: PMC11270514; DOI: 10.1523/jneurosci.1331-23.2024
Abstract
Music, like spoken language, is often characterized by hierarchically organized structure. Previous experiments have shown neural tracking of notes and beats, but little work touches on the more abstract question: how does the brain establish high-level musical structures in real time? We presented Bach chorales to participants (20 females and 9 males) undergoing electroencephalogram (EEG) recording to investigate how the brain tracks musical phrases. We removed the main temporal cues to phrasal structures, so that listeners could only rely on harmonic information to parse a continuous musical stream. Phrasal structures were disrupted by locally or globally reversing the harmonic progression, so that our observations on the original music could be controlled and compared. We first replicated the findings on neural tracking of musical notes and beats, substantiating the positive correlation between musical training and neural tracking. Critically, we discovered a neural signature in the frequency range ∼0.1 Hz (modulations of EEG power) that reliably tracks musical phrasal structure. Next, we developed an approach to quantify the phrasal phase precession of the EEG power, revealing that phrase tracking is indeed an operation of active segmentation involving predictive processes. We demonstrate that the brain establishes complex musical structures online over long timescales (>5 s) and actively segments continuous music streams in a manner comparable to language processing. These two neural signatures, phrase tracking and phrasal phase precession, provide new conceptual and technical tools to study the processes underpinning high-level structure building using noninvasive recording techniques.
Affiliation(s)
- Xiangbin Teng
- Department of Psychology, The Chinese University of Hong Kong, Shatin, Hong Kong SAR, China
- Pauline Larrouy-Maestri
- Music Department, Max-Planck-Institute for Empirical Aesthetics, Frankfurt 60322, Germany
- Center for Language, Music, and Emotion (CLaME), New York, New York 10003
- David Poeppel
- Center for Language, Music, and Emotion (CLaME), New York, New York 10003
- Department of Psychology, New York University, New York, New York 10003
- Ernst Struengmann Institute for Neuroscience, Frankfurt 60528, Germany
- Music and Audio Research Laboratory (MARL), New York, New York 11201
3
Cirelli LK, Talukder LS, Kragness HE. Infant attention to rhythmic audiovisual synchrony is modulated by stimulus properties. Front Psychol 2024; 15:1393295. PMID: 39027053; PMCID: PMC11256966; DOI: 10.3389/fpsyg.2024.1393295
Abstract
Musical interactions are a common and multimodal part of an infant's daily experience. Infants hear their parents sing while watching their lips move, and see their older siblings dance along to music playing over the radio. Here, we explore whether 8- to 12-month-old infants associate musical rhythms they hear with synchronous visual displays by tracking their dynamic visual attention to matched and mismatched displays. Visual attention was measured using eye-tracking while infants viewed a screen displaying two videos of a finger tapping at different speeds. These videos were presented side by side while infants listened to an auditory rhythm (high or low pitch) synchronized with one of the two videos. Infants attended more during low-pitch trials than high-pitch trials but did not display a preference for the synchronous hand over the asynchronous hand within trials. Exploratory evidence, however, suggests that tempo, pitch, and rhythmic complexity interactively engage infants' visual attention to a tapping hand, especially when that hand is aligned with the auditory stimulus. For example, when the rhythm was complex and the auditory stimulus was low in pitch, infants attended more to the fast hand when it was aligned with the auditory stream than when it was misaligned. These results suggest that audiovisual integration in rhythmic non-speech contexts is influenced by stimulus properties.
Affiliation(s)
- Laura K. Cirelli
- Department of Psychology, University of Toronto Scarborough, Toronto, ON, Canada
- Labeeb S. Talukder
- Department of Psychology, University of Toronto Scarborough, Toronto, ON, Canada
- Haley E. Kragness
- Department of Psychology, University of Toronto Scarborough, Toronto, ON, Canada
- Psychology Department, Bucknell University, Lewisburg, PA, United States
4
Black T, Jenkins BW, Laprairie RB, Howland JG. Therapeutic potential of gamma entrainment using sensory stimulation for cognitive symptoms associated with schizophrenia. Neurosci Biobehav Rev 2024; 161:105681. PMID: 38641090; DOI: 10.1016/j.neubiorev.2024.105681
Abstract
Schizophrenia is a complex neuropsychiatric disorder with significant morbidity. Treatment options that address the spectrum of symptoms are limited, highlighting the need for innovative therapeutic approaches. Gamma Entrainment Using Sensory Stimulation (GENUS) is an emerging treatment for neuropsychiatric disorders that uses sensory stimulation to entrain impaired oscillatory network activity and restore brain function. Aberrant oscillatory activity often underlies the symptoms experienced by patients with schizophrenia. We propose that GENUS has therapeutic potential for schizophrenia. This paper reviews the current status of schizophrenia treatment and explores the use of sensory stimulation as an adjunctive treatment, specifically through gamma entrainment. Impaired gamma frequency entrainment is observed in patients, particularly in response to auditory and visual stimuli. Thus, sensory stimulation, such as music listening, may have therapeutic potential for individuals with schizophrenia. GENUS holds novel therapeutic potential to improve the lives of individuals with schizophrenia, but further research is required to determine the efficacy of GENUS, optimize its delivery and therapeutic window, and develop strategies for its implementation in specific patient populations.
Affiliation(s)
- Tallan Black
- College of Pharmacy and Nutrition, University of Saskatchewan, Saskatoon, SK, Canada
- Bryan W Jenkins
- Division of Behavioral Biology, Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore, MD, United States
- Robert B Laprairie
- College of Pharmacy and Nutrition, University of Saskatchewan, Saskatoon, SK, Canada; Department of Pharmacology, College of Medicine, Dalhousie University, Halifax, NS, Canada
- John G Howland
- Department of Anatomy, Physiology, and Pharmacology, College of Medicine, University of Saskatchewan, Saskatoon, SK, Canada
5
Spiech C, Danielsen A, Laeng B, Endestad T. Oscillatory attention in groove. Cortex 2024; 174:137-148. PMID: 38547812; DOI: 10.1016/j.cortex.2024.02.013
Abstract
Attention is not constant but fluctuates over time, and these attentional fluctuations may prioritize the processing of certain events over others. In music listening, the pleasurable urge to move to music (termed 'groove' by music psychologists) offers a particularly convenient case study of oscillatory attention because it engenders synchronous, oscillatory movements that vary predictably with stimulus complexity. In this study, we simultaneously recorded pupillometry and scalp electroencephalography (EEG) while participants listened to drumbeats of varying complexity, which they afterwards rated in terms of groove. Using the intertrial phase coherence at the beat frequency, we found that participants' pupil activity became entrained to the beat of the drumbeats while they listened, and that this entrained attention persisted in the EEG even as they imagined the drumbeats continuing through subsequent silent periods. Entrainment in both the pupillometry and the EEG worsened with increasing rhythmic complexity, indicating poorer sensory precision as the beat became more obscured. Additionally, sustained pupil dilations revealed the expected inverted U-shaped relationship between rhythmic complexity and groove ratings. Taken together, this work links oscillatory attention to rhythmic complexity in relation to musical groove.
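The intertrial phase coherence (ITPC) used here to index entrainment at the beat frequency has a compact generic form: the magnitude of the average unit-length phase vector across trials. A minimal numpy sketch with assumed variable names (`trials`, `fs`, `beat`), illustrating the measure rather than the authors' actual pipeline:

```python
import numpy as np

def itpc(trials, fs, freq):
    """Intertrial phase coherence at a target frequency.

    trials: array of shape (n_trials, n_samples); fs: sampling rate in Hz.
    Returns a value in [0, 1]: 1 means perfectly phase-locked across trials.
    """
    n = trials.shape[1]
    # Complex Fourier coefficient at the bin closest to the target frequency
    freqs = np.fft.rfftfreq(n, 1 / fs)
    bin_idx = np.argmin(np.abs(freqs - freq))
    coeffs = np.fft.rfft(trials, axis=1)[:, bin_idx]
    # Magnitude of the mean unit-length phase vector across trials
    return np.abs(np.mean(coeffs / np.abs(coeffs)))

# Phase-locked trials give ITPC near 1; random-phase trials give a small value.
fs, beat = 250, 2.0
t = np.arange(fs * 4) / fs
rng = np.random.default_rng(0)
locked = np.array([np.sin(2 * np.pi * beat * t)] * 30)
random_phase = np.array(
    [np.sin(2 * np.pi * beat * t + rng.uniform(0, 2 * np.pi)) for _ in range(30)]
)
print(itpc(locked, fs, beat))
print(itpc(random_phase, fs, beat))
```

The same quantity can be computed per channel and per frequency bin; here a single bin suffices to show the logic.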
Affiliation(s)
- Connor Spiech
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Norway; Department of Psychology, University of Oslo, Norway
- Anne Danielsen
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Norway; Department of Musicology, University of Oslo, Norway
- Bruno Laeng
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Norway; Department of Psychology, University of Oslo, Norway
- Tor Endestad
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Norway; Department of Psychology, University of Oslo, Norway
6
Etani T, Miura A, Kawase S, Fujii S, Keller PE, Vuust P, Kudo K. A review of psychological and neuroscientific research on musical groove. Neurosci Biobehav Rev 2024; 158:105522. PMID: 38141692; DOI: 10.1016/j.neubiorev.2023.105522
Abstract
When listening to music, we naturally move our bodies rhythmically to the beat, which can be pleasurable and difficult to resist. This pleasurable sensation of wanting to move the body to music has been called "groove." Following pioneering humanities research, psychological and neuroscientific studies have provided insights into the associated musical features, behavioral responses, phenomenological aspects, and brain structural and functional correlates of the groove experience. Groove research has advanced the field of music science and, more generally, informed our understanding of bidirectional links between perception and action and of the role of the motor system in prediction. Activity in motor and reward-related brain networks during music listening is associated with the groove experience, and this neural activity is linked to temporal prediction and learning. This article reviews research on groove as a psychological phenomenon with neurophysiological correlates that link musical rhythm perception, sensorimotor prediction, and reward processing. Promising future research directions range from elucidating specific neural mechanisms to exploring clinical applications and socio-cultural implications of groove.
Affiliation(s)
- Takahide Etani
- School of Medicine, College of Medical, Pharmaceutical, and Health, Kanazawa University, Kanazawa, Japan; Graduate School of Media and Governance, Keio University, Fujisawa, Japan; Advanced Research Center for Human Sciences, Waseda University, Tokorozawa, Japan
- Akito Miura
- Faculty of Human Sciences, Waseda University, Tokorozawa, Japan
- Satoshi Kawase
- The Faculty of Psychology, Kobe Gakuin University, Kobe, Japan
- Shinya Fujii
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Japan
- Peter E Keller
- Center for Music in the Brain, Aarhus University, Aarhus, Denmark/The Royal Academy of Music Aarhus/Aalborg, Denmark; The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Peter Vuust
- Center for Music in the Brain, Aarhus University, Aarhus, Denmark/The Royal Academy of Music Aarhus/Aalborg, Denmark
- Kazutoshi Kudo
- Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
7
Zhang H, Xie J, Tao Q, Xiao Y, Cui G, Fang W, Zhu X, Xu G, Li M, Han C. The effect of motion frequency and sound source frequency on steady-state auditory motion evoked potential. Hear Res 2023; 439:108897. PMID: 37871451; DOI: 10.1016/j.heares.2023.108897
Abstract
The human ability to perceive moving sound sources is important for responding accurately to the environment. Periodically moving sound sources can elicit a steady-state motion auditory evoked potential (SSMAEP). This study investigated the effects of different motion frequencies and sound-source frequencies on SSMAEP. Stimulation paradigms simulating the periodic motion of sound sources were designed using head-related transfer function (HRTF) techniques. Motion frequencies were set to 1-10 Hz, 15 Hz, 20 Hz, 30 Hz, 40 Hz, 60 Hz, and 80 Hz. In addition, sound-source frequencies were set to 500 Hz, 1000 Hz, 2000 Hz, 3000 Hz, and 4000 Hz at motion frequencies of 6 Hz and 40 Hz. Fourteen subjects with normal hearing were recruited. SSMAEP was elicited by a 500 Hz pure tone at all tested motion frequencies and was strongest at a motion frequency of 6 Hz. Moreover, at the 6 Hz motion frequency, SSMAEP amplitude was largest for the 500 Hz tone and smallest for the 4000 Hz tone, whereas at a motion frequency of 40 Hz the SSMAEP elicited by the 4000 Hz pure tone was significantly the strongest. SSMAEP can thus be elicited by periodically moving sound sources at motion frequencies up to 80 Hz, with a particularly strong response at low motion frequencies. Low-frequency pure tones enhance SSMAEP for low-frequency source motion, whereas high-frequency pure tones enhance SSMAEP for high-frequency source motion. The study provides new insight into the brain's perception of rhythmic auditory motion.
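Reproducing the HRTF-rendered stimuli requires measured transfer functions, but the underlying idea of a sound source whose location oscillates periodically between the ears can be sketched with constant-power amplitude panning. This is a simplified stand-in for HRTF rendering, with assumed parameter names (`carrier_hz`, `motion_hz`):

```python
import numpy as np

def periodic_motion_stimulus(carrier_hz, motion_hz, fs=44100, dur=2.0):
    """Stereo pure tone whose perceived location oscillates left-right.

    Simplified amplitude panning (not true HRTF rendering): the pan
    position follows a sinusoid at `motion_hz`.
    Returns an array of shape (n_samples, 2).
    """
    t = np.arange(int(fs * dur)) / fs
    tone = np.sin(2 * np.pi * carrier_hz * t)
    pan = 0.5 * (1 + np.sin(2 * np.pi * motion_hz * t))  # 0 = left, 1 = right
    # Constant-power panning keeps overall loudness steady during motion
    left = np.cos(pan * np.pi / 2) * tone
    right = np.sin(pan * np.pi / 2) * tone
    return np.stack([left, right], axis=1)

# e.g. a 500 Hz tone sweeping left-right 6 times per second
stim = periodic_motion_stimulus(carrier_hz=500, motion_hz=6)
```

Constant-power panning is chosen so that the summed channel power stays equal to the mono tone's power, so that the periodicity lies in spatial position rather than loudness.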
Affiliation(s)
- Huanqing Zhang
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Jun Xie
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China; School of Mechanical Engineering, Xinjiang University, Urumqi, China; National Key Laboratory of Human Factors Engineering, China Astronauts Research and Training Center, Beijing, China; State Key Laboratory for Manufacturing Systems Engineering, Xi'an Jiaotong University, Xi'an, China
- Qing Tao
- School of Mechanical Engineering, Xinjiang University, Urumqi, China
- Yi Xiao
- National Key Laboratory of Human Factors Engineering, China Astronauts Research and Training Center, Beijing, China
- Guiling Cui
- National Key Laboratory of Human Factors Engineering, China Astronauts Research and Training Center, Beijing, China
- Wenhu Fang
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Xinyu Zhu
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
- Guanghua Xu
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China; State Key Laboratory for Manufacturing Systems Engineering, Xi'an Jiaotong University, Xi'an, China
- Min Li
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China; State Key Laboratory for Manufacturing Systems Engineering, Xi'an Jiaotong University, Xi'an, China
- Chengcheng Han
- School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China; State Key Laboratory for Manufacturing Systems Engineering, Xi'an Jiaotong University, Xi'an, China
8
Xu N, Qin X, Zhou Z, Shan W, Ren J, Yang C, Lu L, Wang Q. Age differentially modulates the cortical tracking of the lower and higher level linguistic structures during speech comprehension. Cereb Cortex 2023; 33:10463-10474. PMID: 37566910; DOI: 10.1093/cercor/bhad296
Abstract
Speech comprehension requires listeners to rapidly parse continuous speech into hierarchically organized linguistic structures (syllable, word, phrase, and sentence) and to entrain neural activity to the rhythms of these different linguistic levels. Aging is accompanied by changes in speech processing, but it remains unclear how aging affects different levels of linguistic representation. Here, we recorded magnetoencephalography signals from older and younger adults who actively and passively listened to continuous speech in which the hierarchical linguistic structures of word, phrase, and sentence were tagged at 4, 2, and 1 Hz, respectively. A newly developed parameterization algorithm was applied to separate periodic linguistic tracking from the aperiodic component. We found enhanced lower-level (word-level) tracking, reduced higher-level (phrasal- and sentential-level) tracking, and a reduced aperiodic offset in older compared with younger adults. Furthermore, attentional modulation of sentential-level tracking was larger for younger than for older adults. Notably, neuro-behavioral analyses showed that behavioral accuracy was positively correlated with higher-level linguistic tracking and negatively correlated with lower-level linguistic tracking. Overall, these results suggest that enhanced lower-level linguistic tracking, reduced higher-level linguistic tracking, and less flexible attentional modulation may underpin the aging-related decline in speech comprehension.
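The frequency-tagging logic (word, phrase, and sentence rates tagged at 4, 2, and 1 Hz) amounts to reading spectral amplitude at known frequencies. A minimal sketch on a synthetic single-channel signal, not the study's MEG pipeline or its periodic/aperiodic parameterization algorithm:

```python
import numpy as np

def tagged_amplitudes(signal, fs, tag_freqs=(1.0, 2.0, 4.0)):
    """Spectral amplitude at the tagged linguistic rates: sentence (1 Hz),
    phrase (2 Hz), and word (4 Hz), from a single-channel recording."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, 1 / fs)
    return {f: spectrum[np.argmin(np.abs(freqs - f))] for f in tag_freqs}

# Synthetic "response": strong word-rate tracking, weaker sentence-rate
# tracking, and no phrase-rate tracking (amplitudes 1.0, 0.3, 0.0)
fs = 200
t = np.arange(fs * 10) / fs
sig = 1.0 * np.sin(2 * np.pi * 4 * t) + 0.3 * np.sin(2 * np.pi * 1 * t)
amps = tagged_amplitudes(sig, fs)
```

With a 10 s window the frequency resolution is 0.1 Hz, so each tagged rate falls exactly on an FFT bin; in practice the tagged amplitude is usually compared against neighboring-bin noise rather than read in isolation.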
Affiliation(s)
- Na Xu
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Xiaoxiao Qin
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Ziqi Zhou
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Wei Shan
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Jiechuan Ren
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Chunqing Yang
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Lingxi Lu
- Center for the Cognitive Science of Language, Beijing Language and Culture University, Beijing 100083, China
- Qun Wang
- Department of Neurology, Beijing Tiantan Hospital, Capital Medical University, Beijing 100070, China
- National Clinical Research Center for Neurological Diseases, Beijing 100070, China
- Beijing Institute of Brain Disorders, Collaborative Innovation Center for Brain Disorders, Capital Medical University, Beijing 100069, China
9
Lenc T, Peter V, Hooper C, Keller PE, Burnham D, Nozaradan S. Infants show enhanced neural responses to musical meter frequencies beyond low-level features. Dev Sci 2023; 26:e13353. PMID: 36415027; DOI: 10.1111/desc.13353
Abstract
Music listening often entails spontaneous perception and body movement to a periodic pulse-like meter. There is increasing evidence that this cross-cultural ability relates to neural processes that selectively enhance metric periodicities, even when these periodicities are not prominent in the acoustic stimulus. However, whether these neural processes emerge early in development remains largely unknown. Here, we recorded the electroencephalogram (EEG) of 20 healthy 5- to 6-month-old infants, while they were exposed to two rhythms known to induce the perception of meter consistently across Western adults. One rhythm contained prominent acoustic periodicities corresponding to the meter, whereas the other rhythm did not. Infants showed significantly enhanced representations of meter periodicities in their EEG responses to both rhythms. This effect is unlikely to reflect the tracking of salient acoustic features in the stimulus, as it was observed irrespective of the prominence of meter periodicities in the audio signals. Moreover, as previously observed in adults, the neural enhancement of meter was greater when the rhythm was delivered by low-pitched sounds. Together, these findings indicate that the endogenous enhancement of metric periodicities beyond low-level acoustic features is a neural property that is already present soon after birth. These high-level neural processes could set the stage for internal representations of musical meter that are critical for human movement coordination during rhythmic musical behavior.
Research highlights:
- 5- to 6-month-old infants were presented with auditory rhythms that induce the perception of a periodic pulse-like meter in adults.
- Infants showed selective enhancement of EEG activity at meter-related frequencies irrespective of the prominence of these frequencies in the stimulus.
- Responses at meter-related frequencies were boosted when the rhythm was conveyed by bass sounds.
- High-level neural processes that transform rhythmic auditory stimuli into internal meter templates emerge early after birth.
Affiliation(s)
- Tomas Lenc
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Varghese Peter
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Queensland, Australia
- Caitlin Hooper
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Center for Music in the Brain & Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Denis Burnham
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
10
Rosso M, Moens B, Leman M, Moumdjian L. Neural entrainment underpins sensorimotor synchronization to dynamic rhythmic stimuli. Neuroimage 2023; 277:120226. PMID: 37321359; DOI: 10.1016/j.neuroimage.2023.120226
Abstract
Neural entrainment, defined as the unidirectional synchronization of neural oscillations to an external rhythmic stimulus, is a topic of major interest in neuroscience. Despite broad scientific consensus on its existence, on its pivotal role in sensory and motor processes, and on its fundamental definition, empirical research struggles to quantify it with non-invasive electrophysiology. To date, broadly adopted state-of-the-art methods still fail to capture the dynamics underlying the phenomenon. Here, we present event-related frequency adjustment (ERFA) as a methodological framework to induce and measure neural entrainment in human participants, optimized for multivariate EEG datasets. By applying dynamic phase and tempo perturbations to isochronous auditory metronomes during a finger-tapping task, we analyzed adaptive changes in the instantaneous frequency of entrained oscillatory components during error correction. Spatial filter design allowed us to untangle, from the multivariate EEG signal, perceptual and sensorimotor oscillatory components attuned to the stimulation frequency. Both components dynamically adjusted their frequency in response to perturbations, tracking the stimulus dynamics by slowing down and speeding up the oscillation over time. Source separation revealed that sensorimotor processing enhanced the entrained response, supporting the notion that active engagement of the motor system plays a critical role in processing rhythmic stimuli. For phase shifts, motor engagement was a necessary condition for observing any response, whereas sustained tempo changes induced frequency adjustment even in the perceptual oscillatory component. Although the magnitude of the perturbations was matched across positive and negative directions, we observed a general bias in the frequency adjustments towards positive changes, which points to intrinsic dynamics constraining neural entrainment. We conclude that our findings provide compelling evidence for neural entrainment as the mechanism underlying overt sensorimotor synchronization, and that our methodology offers a paradigm and a measure for quantifying its oscillatory dynamics by means of non-invasive electrophysiology, rigorously informed by the fundamental definition of entrainment.
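The core quantity in ERFA, the instantaneous frequency of an entrained oscillatory component, can be estimated from the analytic signal. A minimal FFT-based Hilbert-transform sketch on a synthetic oscillation that slows after a tempo drop; this is an illustration of the measure only, not the authors' spatially filtered EEG analysis:

```python
import numpy as np

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) of a narrowband signal, via the
    analytic signal computed with an FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)  # one-sided spectrum weights for the analytic signal
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    analytic = np.fft.ifft(X * h)
    phase = np.unwrap(np.angle(analytic))
    # Phase increment per sample, converted to Hz
    return np.diff(phase) * fs / (2 * np.pi)

# An oscillation that slows from 2.4 Hz to 2.0 Hz, as after a tempo drop
fs = 100
t = np.arange(fs * 20) / fs
freq = np.where(t < 10, 2.4, 2.0)
phase = 2 * np.pi * np.cumsum(freq) / fs
x = np.cos(phase)
inst_f = instantaneous_frequency(x, fs)
```

Averaging `inst_f` within windows before and after the change recovers the two tempi, which is the kind of event-related frequency trajectory the framework analyzes.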
Affiliation(s)
- Mattia Rosso
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium; Université de Lille, ULR 4072 - PSITEC - Psychologie: Interactions, Temps, Emotions, Cognition, Lille, France
- Bart Moens
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium
- Marc Leman
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium
- Lousin Moumdjian
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium; REVAL Rehabilitation Research Center, Faculty of Rehabilitation Sciences, Hasselt University, Hasselt, Belgium; UMSC Hasselt, Pelt, Belgium
11
Foster Vander Elst O, Foster NHD, Vuust P, Keller PE, Kringelbach ML. The Neuroscience of Dance: A Conceptual Framework and Systematic Review. Neurosci Biobehav Rev 2023; 150:105197. PMID: 37100162; DOI: 10.1016/j.neubiorev.2023.105197
Abstract
Ancient and culturally universal, dance pervades many areas of life and has multiple benefits. In this article, we provide a conceptual framework and systematic review, as a guide for researching the neuroscience of dance. We identified relevant articles following PRISMA guidelines, and summarised and evaluated all original results. We identified avenues for future research in: the interactive and collective aspects of dance; groove; dance performance; dance observation; and dance therapy. Furthermore, the interactive and collective aspects of dance constitute a vital part of the field but have received almost no attention from a neuroscientific perspective so far. Dance and music engage overlapping brain networks, including common regions involved in perception, action, and emotion. In music and dance, rhythm, melody, and harmony are processed in an active, sustained pleasure cycle giving rise to action, emotion, and learning, led by activity in specific hedonic brain networks. The neuroscience of dance is an exciting field, which may yield information concerning links between psychological processes and behaviour, human flourishing, and the concept of eudaimonia.
Affiliation(s)
- Olivia Foster Vander Elst
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, UK.
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark
- Peter E Keller
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark; The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Australia
- Morten L Kringelbach
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Denmark; Centre for Eudaimonia and Human Flourishing, Linacre College, University of Oxford, UK; Department of Psychiatry, University of Oxford, UK
12
Lapenta OM, Keller PE, Nozaradan S, Varlet M. Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation. Exp Brain Res 2023; 241:875-887. [PMID: 36788141 PMCID: PMC9985575 DOI: 10.1007/s00221-023-06569-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 01/28/2022] [Accepted: 02/06/2023] [Indexed: 02/16/2023]
Abstract
Human movement synchronisation with moving objects strongly relies on visual input. However, auditory information also plays an important role, since real environments are intrinsically multimodal. We used electroencephalography (EEG) frequency tagging to investigate the selective neural processing and integration of visual and auditory information during motor tracking and tested the effects of spatial and temporal congruency between audiovisual modalities. EEG was recorded while participants tracked with their index finger a red flickering (rate fV = 15 Hz) dot oscillating horizontally on a screen. The simultaneous auditory stimulus was modulated in pitch (rate fA = 32 Hz) and lateralised between left and right audio channels to induce perception of a periodic displacement of the sound source. Audiovisual congruency was manipulated in terms of space in Experiment 1 (no motion, same direction or opposite direction), and timing in Experiment 2 (no delay, medium delay or large delay). For both experiments, significant EEG responses were elicited at fV and fA tagging frequencies. It was also hypothesised that intermodulation products corresponding to the nonlinear integration of visual and auditory stimuli at frequencies fV ± fA would be elicited, due to audiovisual integration, especially in Congruent conditions. However, these components were not observed. Moreover, synchronisation and EEG results were not influenced by congruency manipulations, which invites further exploration of the conditions which may modulate audiovisual processing and the motor tracking of moving objects.
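The frequency-tagging logic in this abstract can be sketched numerically. The tagging rates fV = 15 Hz and fA = 32 Hz come from the study; the synthetic "EEG" signal, sampling rate, and amplitudes below are invented purely for illustration of how tagged responses and the hypothesised intermodulation frequencies fV ± fA would be read off an amplitude spectrum:

```python
import numpy as np

def tag_spectrum(signal, fs):
    """Return (frequencies, single-sided amplitude spectrum) of a 1-D signal."""
    n = len(signal)
    amp = np.abs(np.fft.rfft(signal)) / n * 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amp

fs, dur = 256, 8.0                  # assumed sampling rate and duration
t = np.arange(0, dur, 1.0 / fs)
fV, fA = 15.0, 32.0                 # visual and auditory tagging rates (from the study)

# Synthetic "EEG" carrying responses at both tagging frequencies plus noise
rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * fV * t)
       + 0.5 * np.sin(2 * np.pi * fA * t)
       + 0.1 * rng.standard_normal(t.size))

freqs, amp = tag_spectrum(eeg, fs)
intermod = (fA - fV, fA + fV)       # intermodulation products: 17 Hz and 47 Hz
peak_fV = amp[np.argmin(np.abs(freqs - fV))]
peak_fA = amp[np.argmin(np.abs(freqs - fA))]
```

In the study itself, responses were found at fV and fA but not at the intermodulation frequencies; the sketch only shows where each would appear in the spectrum.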
Affiliation(s)
- Olivia Morgan Lapenta
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia.
- Psychological Neuroscience Lab, Center for Investigation in Psychology, University of Minho, Rua da Universidade, 4710-057, Braga, Portugal.
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Institute of Neuroscience, Université Catholique de Louvain, Woluwe-Saint-Lambert, Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- School of Psychology, Western Sydney University, Penrith, Australia
13
Cortical encoding of rhythmic kinematic structures in biological motion. Neuroimage 2023; 268:119893. [PMID: 36693597 DOI: 10.1016/j.neuroimage.2023.119893] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 07/28/2022] [Revised: 01/04/2023] [Accepted: 01/20/2023] [Indexed: 01/22/2023]
Abstract
Biological motion (BM) perception is of great survival value to human beings. The critical characteristics of BM information lie in kinematic cues containing rhythmic structures. However, how rhythmic kinematic structures of BM are dynamically represented in the brain and contribute to visual BM processing remains largely unknown. Here, we probed this issue in three experiments using electroencephalogram (EEG). We found that neural oscillations of observers entrained to the hierarchical kinematic structures of the BM sequences (i.e., step-cycle and gait-cycle for point-light walkers). Notably, only the cortical tracking of the higher-level rhythmic structure (i.e., gait-cycle) exhibited a BM processing specificity, manifested by enhanced neural responses to upright over inverted BM stimuli. This effect could be extended to different motion types and tasks, with its strength positively correlated with the perceptual sensitivity to BM stimuli at the right temporal brain region dedicated to visual BM processing. Modeling results further suggest that the neural encoding of spatiotemporally integrative kinematic cues, in particular the opponent motions of bilateral limbs, drives the selective cortical tracking of BM information. These findings underscore the existence of a cortical mechanism that encodes periodic kinematic features of body movements, which underlies the dynamic construction of visual BM perception.
14
Varlet M, Nozaradan S, Schmidt RC, Keller PE. Neural tracking of visual periodic motion. Eur J Neurosci 2023; 57:1081-1097. [PMID: 36788113 DOI: 10.1111/ejn.15934] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 05/14/2021] [Revised: 02/10/2023] [Accepted: 02/10/2023] [Indexed: 02/16/2023]
Abstract
Periodicity is a fundamental property of biological systems, including human movement systems. Periodic movements support displacements of the body in the environment as well as interactions and communication between individuals. Here, we use electroencephalography (EEG) to investigate the neural tracking of visual periodic motion, and more specifically, the relevance of spatiotemporal information contained at and between their turning points. We compared EEG responses to visual sinusoidal oscillations versus nonlinear Rayleigh oscillations, which are both typical of human movements. These oscillations contain the same spatiotemporal information at their turning points but differ between turning points, with Rayleigh oscillations having an earlier peak velocity, shown to increase an individual's capacity to produce accurately synchronized movements. EEG analyses highlighted the relevance of spatiotemporal information between the turning points by showing that the brain precisely tracks subtle differences in velocity profiles, as indicated by earlier EEG responses for Rayleigh oscillations. The results suggest that the brain is particularly responsive to velocity peaks in visual periodic motion, supporting their role in conveying behaviorally relevant timing information at a neurophysiological level. The results also suggest key functions of neural oscillations in the Alpha and Beta frequency bands, particularly in the right hemisphere. Together, these findings provide insights into the neural mechanisms underpinning the processing of visual periodic motion and the critical role of velocity peaks in enabling proficient visuomotor synchronization.
Affiliation(s)
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia; School of Psychology, Western Sydney University, Penrith, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- Richard C Schmidt
- Department of Psychology, College of the Holy Cross, Worcester, Massachusetts, USA
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia; Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
15
Zhang M, Li F, Wang D, Ba X, Liu Z. Mapping Research Trends from 20 Years of Publications in Rhythmic Auditory Stimulation. Int J Environ Res Public Health 2022; 20:215. [PMID: 36612537 PMCID: PMC9819413 DOI: 10.3390/ijerph20010215] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 10/19/2022] [Revised: 12/05/2022] [Accepted: 12/07/2022] [Indexed: 06/17/2023]
Abstract
This study aims to provide a comprehensive insight into the evolution, status, and global trends of rhythmic auditory stimulation (RAS) research via enhanced bibliometric methods for the 2001-2020 period. Articles concerning RAS were extracted from the Web of Science database; a total of 586 publications between 2001 and 2020 were retrieved. CiteSpace, Bibliometrix, VOSviewer, and GraphPad Prism were employed to analyze publication patterns and research trends. The researcher Goswami U. made the greatest contribution to this field. The University of Toronto was the institution that published the most articles. Motor dysfunction, sensory perception, and cognition are the three major domains of RAS research. Neural tracking, working memory, and neural basis may be the latest research frontiers. This study reveals the publication patterns and topic trends of RAS based on records published between 2001 and 2020, and the insights obtained provide useful references for future research and applications of RAS.
Affiliation(s)
- Meiqi Zhang
- Department of Physical Education and Health Education, Springfield College, Springfield, MA 01109, USA
- Yale/VA Learning-Based Recovery Center, Yale University, New Haven, CT 06510, USA
- Fang Li
- Department of Neurology, The First Affiliated Hospital of Jinzhou Medical University, Jinzhou 121001, China
- Dongyu Wang
- Department of Neurology, The Center Hospital of Jinzhou, Jinzhou 121001, China
- Xiaohong Ba
- Department of Neurology, The First Affiliated Hospital of Jinzhou Medical University, Jinzhou 121001, China
- Zhan Liu
- Department of Physical Education and Health Education, Springfield College, Springfield, MA 01109, USA
16
Cameron DJ, Dotov D, Flaten E, Bosnyak D, Hove MJ, Trainor LJ. Undetectable very-low frequency sound increases dancing at a live concert. Curr Biol 2022; 32:R1222-R1223. [DOI: 10.1016/j.cub.2022.09.035] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Indexed: 11/09/2022]
17
Cantiani C, Dondena C, Molteni M, Riva V, Piazza C. Synchronizing with the rhythm: Infant neural entrainment to complex musical and speech stimuli. Front Psychol 2022; 13:944670. [PMID: 36337544 PMCID: PMC9635850 DOI: 10.3389/fpsyg.2022.944670] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 05/15/2022] [Accepted: 09/22/2022] [Indexed: 11/14/2022]
Abstract
Neural entrainment is defined as the process whereby brain activity, and more specifically neuronal oscillations measured by EEG, synchronizes with exogenous stimulus rhythms. Despite the importance that neural oscillations have assumed in recent years in the fields of auditory neuroscience and speech perception, oscillatory brain rhythms in human infants, and their synchronization with complex auditory rhythms, remain relatively unexplored. In the present study, we investigate infant neural entrainment to complex non-speech (musical) and speech rhythmic stimuli; we provide a developmental analysis to explore potential similarities and differences between infants' and adults' ability to entrain to the stimuli; and we analyze the associations between infants' neural entrainment measures and their concurrent level of development. Twenty-five 8-month-old infants were included in the study. Their EEG signals were recorded while they passively listened to non-speech and speech rhythmic stimuli modulated at different rates. In addition, Bayley Scales were administered to all infants to assess their cognitive, language, and social-emotional development. Neural entrainment to the incoming rhythms was measured in the form of peaks emerging from the EEG spectrum at frequencies corresponding to the rhythm envelope. Analyses of the EEG spectrum revealed clear responses above the noise floor at frequencies corresponding to the rhythm envelope, suggesting that, similarly to adults, infants at 8 months of age are capable of entraining to complex incoming auditory rhythms. Infants' measures of neural entrainment were associated with concurrent measures of cognitive and social-emotional development.
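The "peak above the noise floor" criterion used in this abstract is conventionally computed by comparing the amplitude at the stimulation frequency with the mean of the surrounding frequency bins. A minimal sketch of that logic, with all signal parameters (sampling rate, stimulation frequency, amplitudes) invented for illustration:

```python
import numpy as np

def peak_minus_noise(amp, target_idx, n_neighbors=5, skip=1):
    """Amplitude at the target bin minus the mean amplitude of the
    n_neighbors bins on each side, skipping the immediately adjacent bins."""
    left = amp[target_idx - skip - n_neighbors : target_idx - skip]
    right = amp[target_idx + skip + 1 : target_idx + skip + 1 + n_neighbors]
    return amp[target_idx] - np.mean(np.concatenate([left, right]))

fs, dur, f_stim = 100, 20.0, 2.0        # assumed rates: 20 s of "EEG", 2 Hz rhythm
t = np.arange(0, dur, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic entrained response at the stimulation frequency plus noise
eeg = 0.8 * np.sin(2 * np.pi * f_stim * t) + 0.5 * rng.standard_normal(t.size)

amp = np.abs(np.fft.rfft(eeg)) / t.size * 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
snr = peak_minus_noise(amp, np.argmin(np.abs(freqs - f_stim)))
```

A positive `snr` well above zero indicates a response peak emerging from the noise floor at the rhythm frequency; in practice the measure is computed per condition and tested across participants.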
Affiliation(s)
- Chiara Cantiani
- Child Psychopathology Unit, Scientific Institute, IRCCS Eugenio Medea, Lecco, Italy
- Chiara Dondena
- Child Psychopathology Unit, Scientific Institute, IRCCS Eugenio Medea, Lecco, Italy
- Massimo Molteni
- Child Psychopathology Unit, Scientific Institute, IRCCS Eugenio Medea, Lecco, Italy
- Valentina Riva
- Child Psychopathology Unit, Scientific Institute, IRCCS Eugenio Medea, Lecco, Italy
- Caterina Piazza
- Bioengineering Lab, Scientific Institute, IRCCS Eugenio Medea, Lecco, Italy
18
Sauvé SA, Bolt ELW, Nozaradan S, Zendel BR. Aging effects on neural processing of rhythm and meter. Front Aging Neurosci 2022; 14:848608. [PMID: 36118692 PMCID: PMC9475293 DOI: 10.3389/fnagi.2022.848608] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 01/04/2022] [Accepted: 08/01/2022] [Indexed: 11/13/2022]
Abstract
When listening to musical rhythm, humans can perceive and move to beat-like metrical pulses. Recently, it has been hypothesized that meter perception is related to brain activity responding to the acoustic fluctuation of the rhythmic input, with selective enhancement of the brain response elicited at meter-related frequencies. In the current study, electroencephalography (EEG) was recorded while younger (<35) and older (>60) adults listened to rhythmic patterns presented at two different tempi while intermittently performing a tapping task. Despite significant hearing loss compared to younger adults, older adults showed preserved brain activity to the rhythms. However, age effects were observed in the distribution of amplitude across frequencies. Specifically, in contrast with younger adults, older adults showed relatively larger amplitude at the frequency corresponding to the rate of individual events making up the rhythms as compared to lower meter-related frequencies. This difference is compatible with larger N1-P2 potentials as generally observed in older adults in response to acoustic onsets, irrespective of meter perception. These larger low-level responses to sounds have been linked to processes by which age-related hearing loss would be compensated by cortical sensory mechanisms. Importantly, this low-level effect would be associated here with relatively reduced neural activity at lower frequencies corresponding to higher-level metrical grouping of the acoustic events, as compared to younger adults.
19
Peter V, van Ommen S, Kalashnikova M, Mazuka R, Nazzi T, Burnham D. Language specificity in cortical tracking of speech rhythm at the mora, syllable, and foot levels. Sci Rep 2022; 12:13477. [PMID: 35931787 PMCID: PMC9356059 DOI: 10.1038/s41598-022-17401-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 11/23/2021] [Accepted: 07/25/2022] [Indexed: 11/29/2022]
Abstract
Recent research shows that adults' neural oscillations track the rhythm of the speech signal. However, the extent to which this tracking is driven by the acoustics of the signal, rather than by language-specific processing, remains unknown. Here, adult native listeners of three rhythmically different languages (English, French, Japanese) were compared on their cortical tracking of speech envelopes synthesized in the three languages, which allowed coding at each language's dominant rhythmic unit: the foot (2.5 Hz), syllable (5 Hz), or mora (10 Hz) level, respectively. The three language groups were also tested with a sequence in a non-native language, Polish, and a non-speech vocoded equivalent, to investigate possible differential speech/non-speech processing. The results first showed that cortical tracking was most prominent at 5 Hz (the syllable rate) for all three groups, although the French listeners showed enhanced tracking at 5 Hz compared with the English and Japanese groups. Second, across groups, there were no differences in responses to speech versus non-speech at 5 Hz (the syllable rate), but tracking was better for speech than for non-speech at 10 Hz (not the syllable rate). Together, these results provide evidence for both language-general and language-specific influences on cortical tracking.
Affiliation(s)
- Varghese Peter
- MARCS Institute for Brain Behaviour and Development, Western Sydney University, Penrith, NSW, Australia.
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Sippy Downs, Australia.
- Sandrien van Ommen
- Integrative Neuroscience and Cognition Center, CNRS-Université Paris Cité, Paris, France
- Neurosciences Fondamentales, University of Geneva, Geneva, Switzerland
- Marina Kalashnikova
- MARCS Institute for Brain Behaviour and Development, Western Sydney University, Penrith, NSW, Australia
- BCBL, Basque Center on Cognition, Brain and Language, San Sebastian, Guipuzcoa, Spain
- IKERBASQUE, Basque Foundation for Science, Bilbao, Bizcaya, Spain
- Reiko Mazuka
- Laboratory for Language Development, RIKEN Center for Brain Science, Saitama, Japan
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
- Thierry Nazzi
- Integrative Neuroscience and Cognition Center, CNRS-Université Paris Cité, Paris, France
- Denis Burnham
- MARCS Institute for Brain Behaviour and Development, Western Sydney University, Penrith, NSW, Australia
20
Wang Y, Lu L, Zou G, Zheng L, Qin L, Zou Q, Gao JH. Disrupted neural tracking of sound localization during non-rapid eye movement sleep. Neuroimage 2022; 260:119490. [PMID: 35853543 DOI: 10.1016/j.neuroimage.2022.119490] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 01/21/2022] [Revised: 06/16/2022] [Accepted: 07/15/2022] [Indexed: 11/27/2022]
Abstract
Spatial hearing in humans is a high-level auditory process that is crucial to rapid sound localization in the environment. Both neurophysiological models in animals and neuroimaging evidence from awake human subjects suggest that the localization of auditory objects is mainly supported by the posterior auditory cortex. However, whether this cognitive process is preserved during sleep remains unclear. To fill this research gap, we investigated the sleeping brain's capacity to identify sound locations by recording simultaneous electroencephalographic (EEG) and magnetoencephalographic (MEG) signals during wakefulness and non-rapid eye movement (NREM) sleep in human subjects. Using the frequency-tagging paradigm, the subjects were presented with a basic syllable sequence at 5 Hz and a location change that occurred every three syllables, resulting in a sound localization shift at 1.67 Hz. The EEG and MEG signals were used for sleep scoring and neural tracking analyses, respectively. Neural tracking responses at 5 Hz, reflecting basic auditory processing, were observed during both wakefulness and NREM sleep, although the responses during sleep were weaker than those during wakefulness. Cortical responses at 1.67 Hz, which correspond to the sound location change, were observed during wakefulness regardless of attention to the stimuli but vanished during NREM sleep. These results indicate, for the first time, that sleep preserves basic auditory processing but disrupts the higher-order brain function of sound localization.
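The nested tagging rates in this design follow directly from the stimulus arithmetic: a change every third 5 Hz syllable yields a 5/3 ≈ 1.67 Hz change rate, and both rates appear as distinct spectral peaks. A minimal sketch of that arithmetic on an idealized onset train (the sampling rate, duration, and impulse amplitudes are invented for illustration):

```python
import numpy as np

fs, dur = 60, 12.0                        # assumed sampling rate and duration
t = np.arange(0, dur, 1.0 / fs)
syllable_rate = 5.0                       # base syllable presentation rate (Hz)
location_rate = syllable_rate / 3         # change every 3rd syllable -> ~1.67 Hz

# Idealized stimulus: an impulse at each syllable onset, with every third
# onset carrying the location change (marked by extra amplitude)
onsets = np.zeros(t.size)
idx = (np.arange(0, dur * syllable_rate) / syllable_rate * fs).astype(int)
onsets[idx] = 1.0
onsets[idx[::3]] += 1.0                   # location-change syllables

amp = np.abs(np.fft.rfft(onsets)) / onsets.size
freqs = np.fft.rfftfreq(onsets.size, 1.0 / fs)
a_syll = amp[np.argmin(np.abs(freqs - syllable_rate))]   # 5 Hz peak
a_loc = amp[np.argmin(np.abs(freqs - location_rate))]    # ~1.67 Hz peak
```

The 1.67 Hz peak exists only because every third event differs; if all syllables were identical the spectrum would contain energy at 5 Hz and its harmonics alone, which is why the study could use the two frequencies to separate basic auditory tracking from location-change tracking.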
Affiliation(s)
- Yan Wang
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China; Chinese Institute for Brain Research, Beijing 102206, China; PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing 100871, China
- Lingxi Lu
- Center for the Cognitive Science of Language, Beijing Language and Culture University, Beijing 100083, China.
- Guangyuan Zou
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China
- Li Zheng
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China
- Lang Qin
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China
- Qihong Zou
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China.
- Jia-Hong Gao
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing 100871, China; PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing 100871, China; Beijing City Key Lab for Medical Physics and Engineering, Institution of Heavy Ion Physics, School of Physics, Peking University, Beijing 100871, China; National Biomedical Imaging Center, Peking University, Beijing 100871, China.
21
Spiech C, Sioros G, Endestad T, Danielsen A, Laeng B. Pupil drift rate indexes groove ratings. Sci Rep 2022; 12:11620. [PMID: 35804069 PMCID: PMC9270355 DOI: 10.1038/s41598-022-15763-w] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Received: 12/17/2021] [Accepted: 06/29/2022] [Indexed: 11/09/2022]
Abstract
Groove, understood as an enjoyable compulsion to move to musical rhythms, typically varies along an inverted U-curve with increasing rhythmic complexity (e.g., syncopation, pickups). Predictive coding accounts posit that moderate complexity drives us to move in order to reduce sensory prediction errors and model the temporal structure. While musicologists generally distinguish the effects of pickups (anacruses) and syncopations, their difference remains unexplored in groove research. We used pupillometry as an index of noradrenergic arousal while subjects listened to and rated drumbeats varying in rhythmic complexity. We replicated the inverted U-shaped relationship between rhythmic complexity and groove and showed that it is modulated by musical ability, based on a psychoacoustic beat perception test. The pupil drift rates suggest that groovier rhythms hold attention longer than rhythms rated less groovy. Moreover, we found complementary effects of syncopations and pickups on groove ratings and pupil size, respectively, revealing a distinct predictive process related to pickups. We suggest that the brain deploys attention to pickups to sharpen subsequent strong beats, augmenting the predictive scaffolding's focus on beats that reduce syncopations' prediction errors. This interpretation is in accordance with groove envisioned as an embodied resolution of precision-weighted prediction error.
Affiliation(s)
- Connor Spiech
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Postboks 1133 Blindern, 0318, Oslo, Norway; Department of Psychology, University of Oslo, Oslo, Norway
- George Sioros
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Postboks 1133 Blindern, 0318, Oslo, Norway; Department of Musicology, University of Oslo, Oslo, Norway
- Tor Endestad
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Postboks 1133 Blindern, 0318, Oslo, Norway; Department of Psychology, University of Oslo, Oslo, Norway
- Anne Danielsen
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Postboks 1133 Blindern, 0318, Oslo, Norway; Department of Musicology, University of Oslo, Oslo, Norway
- Bruno Laeng
- RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Postboks 1133 Blindern, 0318, Oslo, Norway; Department of Psychology, University of Oslo, Oslo, Norway
22
Kabdebon C, Fló A, de Heering A, Aslin R. The power of rhythms: how steady-state evoked responses reveal early neurocognitive development. Neuroimage 2022; 254:119150. [PMID: 35351649 PMCID: PMC9294992 DOI: 10.1016/j.neuroimage.2022.119150] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Received: 08/31/2021] [Revised: 03/23/2022] [Accepted: 03/24/2022] [Indexed: 12/17/2022]
Abstract
Electroencephalography (EEG) is a non-invasive and painless recording of cerebral activity, particularly well-suited for studying young infants and allowing cerebral responses to be inspected in a variety of ways. Of particular interest to developmental cognitive neuroscientists is the use of rhythmic stimulation and the analysis of steady-state evoked potentials (SS-EPs), an approach also known as frequency tagging. In this paper we rely on the existing SS-EP early developmental literature to illustrate the important advantages of SS-EPs for studying the developing brain. We argue that (1) the technique is both objective and predictive: the response is expected at the stimulation frequency (and/or higher harmonics); (2) its high spectral specificity makes the computed responses particularly robust to artifacts; and (3) the technique allows for short and efficient recordings, compatible with infants' limited attentional spans. We additionally provide an overview of some recent, inspiring uses of the SS-EP technique in adult research, in order to argue that (4) the SS-EP approach can be implemented creatively to target a wide range of cognitive and neural processes. For all these reasons, we expect SS-EPs to play an increasing role in the understanding of early cognitive processes. Finally, we provide practical guidelines for implementing and analyzing SS-EP studies.
Affiliation(s)
- Claire Kabdebon
- Laboratoire de Sciences Cognitives et Psycholinguistique, Département d'études cognitives, ENS, EHESS, CNRS, PSL University, Paris, France; Haskins Laboratories, New Haven, CT, USA.
- Ana Fló
- Cognitive Neuroimaging Unit, CNRS ERL 9003, INSERM U992, CEA, Université Paris-Saclay, NeuroSpin Center, Gif/Yvette, France
- Adélaïde de Heering
- Center for Research in Cognition & Neuroscience (CRCN), Université libre de Bruxelles (ULB), Brussels, Belgium
- Richard Aslin
- Haskins Laboratories, New Haven, CT, USA; Department of Psychology, Yale University, New Haven, CT, USA
23
Lapenta OM, Keller PE, Nozaradan S, Varlet M. Lateralised dynamic modulations of corticomuscular coherence associated with bimanual learning of rhythmic patterns. Sci Rep 2022; 12:6271. [PMID: 35428836 PMCID: PMC9012795 DOI: 10.1038/s41598-022-10342-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Received: 08/18/2021] [Accepted: 03/28/2022] [Indexed: 11/09/2022]
Abstract
Human movements are spontaneously attracted to auditory rhythms, triggering an automatic activation of the motor system, a phenomenon central to music perception and production. Cortico-muscular coherence (CMC) in the theta, alpha, beta and gamma frequencies has been used as an index of the synchronisation between cortical motor regions and the muscles. Here we investigated how learning to produce a bimanual rhythmic pattern composed of low- and high-pitch sounds affects CMC in the beta frequency band. Electroencephalography (EEG) and electromyography (EMG) from the left and right First Dorsal Interosseus and Flexor Digitorum Superficialis muscles were concurrently recorded during constant pressure on a force sensor held between the thumb and index finger while listening to the rhythmic pattern before and after a bimanual training session. During the training, participants learnt to produce the rhythmic pattern guided by visual cues, pressing the force sensors with their left or right hand to produce the low- and high-pitch sounds, respectively. Results revealed no changes after training in overall beta CMC or beta oscillation amplitude, nor in the correlation between the left and right sides for EEG and EMG separately. However, correlation analyses indicated that left- and right-hand beta EEG-EMG coherence were positively correlated over time before training but became uncorrelated after training. This suggests that learning to bimanually produce a rhythmic musical pattern reinforces lateralised and segregated cortico-muscular communication.
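The CMC measure named in this abstract is conventionally estimated as Welch-based magnitude-squared coherence between EEG and EMG channels. A minimal sketch of that computation, where the shared 20 Hz component standing in for a beta-band drive, and all other parameters, are invented for illustration:

```python
import numpy as np
from scipy.signal import coherence

fs, dur = 256, 30.0                             # assumed sampling rate, duration
t = np.arange(0, dur, 1.0 / fs)
rng = np.random.default_rng(1)
beta = np.sin(2 * np.pi * 20.0 * t)             # common "beta-band" drive

# Simulated channels: both contain the common drive plus independent noise
eeg = beta + rng.standard_normal(t.size)        # cortical channel
emg = 0.5 * beta + rng.standard_normal(t.size)  # muscle channel

# Magnitude-squared coherence in [0, 1] per frequency bin
f, cxy = coherence(eeg, emg, fs=fs, nperseg=512)
beta_coh = cxy[np.argmin(np.abs(f - 20.0))]     # CMC at the shared frequency
off_coh = cxy[np.argmin(np.abs(f - 50.0))]      # control frequency, no shared drive
```

Coherence approaches 1 at the frequency of the shared drive and stays near the chance floor elsewhere, which is what makes it usable as an index of cortico-muscular coupling in a given band.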
Affiliation(s)
- Olivia Morgan Lapenta
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia; Center for Investigation in Psychology, University of Minho, Braga, Portugal
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia; Institute of Neuroscience, Catholic University of Louvain, Woluwe-Saint-Lambert, Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia; School of Psychology, Western Sydney University, Penrith, Australia
24
Grossberg S. Toward Understanding the Brain Dynamics of Music: Learning and Conscious Performance of Lyrics and Melodies With Variable Rhythms and Beats. Front Syst Neurosci 2022; 16:766239. [PMID: 35465193 PMCID: PMC9028030 DOI: 10.3389/fnsys.2022.766239] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 08/28/2021] [Accepted: 02/23/2022] [Indexed: 11/13/2022]
Abstract
A neural network architecture models how humans learn and consciously perform musical lyrics and melodies with variable rhythms and beats, using brain design principles and mechanisms that evolved earlier than human musical capabilities, and that have explained and predicted many kinds of psychological and neurobiological data. One principle is called factorization of order and rhythm: Working memories store sequential information in a rate-invariant and speaker-invariant way to avoid using excessive memory and to support learning of language, spatial, and motor skills. Stored invariant representations can be flexibly performed in a rate-dependent and speaker-dependent way under volitional control. A canonical working memory design stores linguistic, spatial, motoric, and musical sequences, including sequences with repeated words in lyrics, or repeated pitches in songs. Stored sequences of individual word chunks and pitch chunks are categorized through learning into lyrics chunks and pitches chunks. Pitches chunks respond selectively to stored sequences of individual pitch chunks that categorize harmonics of each pitch, thereby supporting tonal music. Bottom-up and top-down learning between working memory and chunking networks dynamically stabilizes the memory of learned music. Songs are learned by associatively linking sequences of lyrics and pitches chunks. Performance begins when list chunks read word chunk and pitch chunk sequences into working memory. Learning and performance of regular rhythms exploits cortical modulation of beats that are generated in the basal ganglia. Arbitrary performance rhythms are learned by adaptive timing circuits in the cerebellum interacting with prefrontal cortex and basal ganglia. The same network design that controls walking, running, and finger tapping also generates beats and the urge to move with a beat.
Affiliation(s)
- Stephen Grossberg
- Center for Adaptive Systems, Graduate Program in Cognitive and Neural Systems, Department of Mathematics & Statistics, Psychological & Brain Sciences, and Biomedical Engineering, Boston University, Boston, MA, United States
25
Emmanouil A, Rousanoglou E, Georgaki A, Boudolos KD. When Musical Accompaniment Allows the Preferred Spatio-Temporal Pattern of Movement. Sports Med Int Open 2021; 5:E81-E90. [PMID: 34646934 PMCID: PMC8500738 DOI: 10.1055/a-1553-7063] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2021] [Accepted: 05/11/2021] [Indexed: 11/24/2022] Open
Abstract
A musical accompaniment is often used in movement coordination and stability exercise modalities, although it is considered potentially obstructive to their foundation of a preferred movement pace. This study examined whether the rhythmic strength of musical excerpts used in movement coordination and exercise modalities allows the preferred spatio-temporal pattern of movement. Voluntary and spontaneous body sway (70 s) were tested (N=20 young women) in a non-musical (preferred) condition and two rhythmic strength (RS) musical conditions (higher: HrRS; lower: LrRS). The center-of-pressure trajectory was used to obtain the spatio-temporal characteristics of body sway (Kistler force plate, 100 Hz). Statistics included paired t-tests between each musical condition and the non-musical one, as well as between the musical conditions (p≤0.05). Results indicated no significant difference between the musical and the non-musical conditions (p>0.05). The HrRS differed significantly from the LrRS only in voluntary body sway, with increased sway duration (p=0.03), center-of-pressure path (p=0.04) and velocity (p=0.01). The findings provide evidence-based support for the rhythmic strength recommendations in movement coordination and stability exercise modalities. The differences between HrRS and LrRS in voluntary body sway most likely indicate that low-frequency musical features, and not just tempo and pulse clarity, are also important.
Affiliation(s)
- Analina Emmanouil
- National and Kapodistrian University of Athens, Faculty of Physical Education and Sport Science, Department of Sport Medicine and Biology of Exercise, Sport Biomechanics Lab, Daphne, Greece
- Elissavet Rousanoglou
- National and Kapodistrian University of Athens, Faculty of Physical Education and Sport Science, Department of Sport Medicine and Biology of Exercise, Sport Biomechanics Lab, Daphne, Greece
- Anastasia Georgaki
- National and Kapodistrian University of Athens, Department of Music Studies, Athens, Greece
- Konstantinos D Boudolos
- National and Kapodistrian University of Athens, Faculty of Physical Education and Sport Science, Department of Sport Medicine and Biology of Exercise, Sport Biomechanics Lab, Daphne, Greece
26
Lenc T, Merchant H, Keller PE, Honing H, Varlet M, Nozaradan S. Mapping between sound, brain and behaviour: four-level framework for understanding rhythm processing in humans and non-human primates. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200325. [PMID: 34420381 PMCID: PMC8380981 DOI: 10.1098/rstb.2020.0325] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 06/14/2021] [Indexed: 12/16/2022] Open
Abstract
Humans perceive and spontaneously move to one or several levels of periodic pulses (a meter, for short) when listening to musical rhythm, even when the sensory input does not provide prominent periodic cues to their temporal location. Here, we review a multi-levelled framework for understanding how external rhythmic inputs are mapped onto internally represented metric pulses. This mapping is studied using an approach to quantify and directly compare representations of metric pulses in signals corresponding to sensory inputs, neural activity and behaviour (typically body movement). Based on this approach, recent empirical evidence can be drawn together into a conceptual framework that unpacks the phenomenon of meter into four levels. Each level highlights specific functional processes that critically enable and shape the mapping from sensory input to internal meter. We discuss the nature, constraints and neural substrates of these processes, starting with fundamental mechanisms investigated in macaque monkeys that enable basic forms of mapping between simple rhythmic stimuli and internally represented metric pulse. We propose that human evolution has gradually built a robust and flexible system upon these fundamental processes, allowing more complex levels of mapping to emerge in musical behaviours. This approach opens promising avenues to understand the many facets of rhythmic behaviours across individuals and species. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Tomas Lenc
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- Hugo Merchant
- Instituto de Neurobiologia, UNAM, Campus Juriquilla, Querétaro 76230, Mexico
- Peter E. Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Henkjan Honing
- Amsterdam Brain and Cognition (ABC), Institute for Logic, Language and Computation (ILLC), University of Amsterdam, Amsterdam 1090 GE, The Netherlands
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- School of Psychology, Western Sydney University, Penrith, New South Wales 2751, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
27
Pesnot Lerousseau J, Trébuchon A, Morillon B, Schön D. Frequency Selectivity of Persistent Cortical Oscillatory Responses to Auditory Rhythmic Stimulation. J Neurosci 2021; 41:7991-8006. [PMID: 34301825 PMCID: PMC8460151 DOI: 10.1523/jneurosci.0213-21.2021] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/28/2021] [Revised: 06/28/2021] [Accepted: 07/01/2021] [Indexed: 11/21/2022] Open
Abstract
Cortical oscillations have been proposed to play a functional role in speech and music perception, attentional selection, and working memory, via the mechanism of neural entrainment. One of the properties of neural entrainment that is often taken for granted is that its modulatory effect on ongoing oscillations outlasts rhythmic stimulation. We tested the existence of this phenomenon by studying cortical neural oscillations during and after presentation of melodic stimuli in a passive perception paradigm. Melodies were composed of ∼60 and ∼80 Hz tones embedded in a 2.5 Hz stream. Using intracranial and surface recordings in male and female humans, we reveal persistent oscillatory activity in the high-γ band in response to the tones throughout the cortex, well beyond auditory regions. By contrast, in response to the 2.5 Hz stream, no persistent activity in any frequency band was observed. We further show that our data are well captured by a damped harmonic oscillator model and can be classified into three classes of neural dynamics, with distinct damping properties and eigenfrequencies. This model provides a mechanistic and quantitative explanation of the frequency selectivity of auditory neural entrainment in the human cortex.SIGNIFICANCE STATEMENT It has been proposed that the functional role of cortical oscillations is subtended by a mechanism of entrainment, the synchronization in phase or amplitude of neural oscillations to a periodic stimulation. One of the properties of neural entrainment that is often taken for granted is that its modulatory effect on ongoing oscillations outlasts rhythmic stimulation. Using intracranial and surface recordings of humans passively listening to rhythmic auditory stimuli, we reveal consistent oscillatory responses throughout the cortex, with persistent activity of high-γ oscillations. On the contrary, neural oscillations do not outlast low-frequency acoustic dynamics. We interpret our results as reflecting harmonic oscillator properties, a model ubiquitous in physics but rarely used in neuroscience.
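The damped harmonic oscillator model invoked in this abstract has a simple closed form for its impulse response, with persistence governed by the damping ratio and oscillation rate by the eigenfrequency. A minimal sketch; the 60 Hz eigenfrequency and 0.05 damping ratio are assumptions chosen for illustration, not parameters fitted in the study:

```python
import numpy as np

f0 = 60.0                               # eigenfrequency in Hz (illustrative)
zeta = 0.05                             # damping ratio (illustrative)
omega = 2 * np.pi * f0
omega_d = omega * np.sqrt(1 - zeta**2)  # damped oscillation frequency

t = np.arange(0, 0.5, 1e-4)
# Closed-form impulse response of the underdamped oscillator
# x'' + 2*zeta*omega*x' + omega**2 * x = delta(t)
x = np.exp(-zeta * omega * t) * np.sin(omega_d * t) / omega_d

tau = 1 / (zeta * omega)                # envelope decay time constant (seconds)
```

With these values the response rings near 60 Hz and its envelope decays with a time constant of roughly 53 ms, illustrating how damping determines whether activity "outlasts" the stimulation.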
Affiliation(s)
- Agnès Trébuchon
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France
- APHM, Hôpital de la Timone, Service de Neurophysiologie Clinique, Marseille 13005, France
- Benjamin Morillon
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France
- Daniele Schön
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France
28
Sifuentes-Ortega R, Lenc T, Nozaradan S, Peigneux P. Partially Preserved Processing of Musical Rhythms in REM but Not in NREM Sleep. Cereb Cortex 2021; 32:1508-1519. [PMID: 34491309 DOI: 10.1093/cercor/bhab303] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/01/2023] Open
Abstract
The extent of high-level perceptual processing during sleep remains controversial. In wakefulness, perception of periodicities supports the emergence of high-order representations such as the pulse-like meter perceived while listening to music. Electroencephalography (EEG) frequency-tagged responses elicited at envelope frequencies of musical rhythms have been shown to provide a neural representation of rhythm processing. Specifically, responses at frequencies corresponding to the perceived meter are enhanced over responses at meter-unrelated frequencies. This selective enhancement must rely on higher-level perceptual processes, as it occurs even in irregular (i.e., syncopated) rhythms where meter frequencies are not prominent input features, thus ruling out acoustic confounds. We recorded EEG while presenting a regular (unsyncopated) and an irregular (syncopated) rhythm across sleep stages and wakefulness. Our results show that frequency-tagged responses at meter-related frequencies of the rhythms were selectively enhanced during wakefulness but attenuated across sleep states. Most importantly, this selective attenuation occurred even in response to the irregular rhythm, where meter-related frequencies were not prominent in the stimulus, thus suggesting that neural processes selectively enhancing meter-related frequencies during wakefulness are weakened during rapid eye movement (REM) and further suppressed in non-rapid eye movement (NREM) sleep. These results indicate preserved processing of low-level acoustic properties but limited higher-order processing of auditory rhythms during sleep.
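Frequency-tagged responses of the kind analysed here are typically read out from the amplitude spectrum of the EEG at stimulus-related frequencies, contrasting meter-related against meter-unrelated bins. A minimal sketch on a simulated signal; the sampling rate, the 1.25 Hz meter-related frequency, and the comparison frequencies are hypothetical values, not the study's stimulus set:

```python
import numpy as np

fs = 500                        # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)    # 60 s window -> 1/60 Hz frequency resolution
rng = np.random.default_rng(1)

# Toy "EEG": a response at a meter-related frequency (1.25 Hz) plus noise
sig = 0.5 * np.sin(2 * np.pi * 1.25 * t) + rng.standard_normal(t.size)

amp = np.abs(np.fft.rfft(sig)) / t.size     # un-normalised amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amp_at(f0):
    """Amplitude at the FFT bin closest to f0 (all test frequencies fall on exact bins here)."""
    return amp[np.argmin(np.abs(freqs - f0))]

meter_related = amp_at(1.25)
meter_unrelated = np.mean([amp_at(f0) for f0 in (0.95, 1.55, 1.85)])
```

The selective-enhancement question in the abstract then amounts to comparing `meter_related` against `meter_unrelated` (and against the same contrast computed on the stimulus envelope) across wakefulness and sleep stages.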
Affiliation(s)
- Rebeca Sifuentes-Ortega
- UR2NF - Neuropsychology and Functional Neuroimaging Research Unit at CRCN - Center for Research in Cognition & Neurosciences, and UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), 1050 Brussels, Belgium
- Tomas Lenc
- Institute of Neuroscience (IONS), Université Catholique de Louvain, 1200 Brussels, Belgium
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain, 1200 Brussels, Belgium
- Philippe Peigneux
- UR2NF - Neuropsychology and Functional Neuroimaging Research Unit at CRCN - Center for Research in Cognition & Neurosciences, and UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), 1050 Brussels, Belgium
29
The influence of auditory rhythms on the speed of inferred motion. Atten Percept Psychophys 2021; 84:2360-2383. [PMID: 34435321 DOI: 10.3758/s13414-021-02364-4] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/26/2021] [Indexed: 12/24/2022]
Abstract
The present research explored the influence of isochronous auditory rhythms on the timing of movement-related prediction in two experiments. In both experiments, participants observed a moving disc that was visible for a predetermined period before disappearing behind a small, medium, or large occluded area for the remainder of its movement. In Experiment 1, the disc was visible for 1 s. During this period, participants were exposed to either a fast or slow auditory rhythm, or they heard nothing. They were instructed to press a key to indicate when they believed the moving disc had reached a specified location on the other side of the occluded area. The procedure measured the (signed) error in participants' estimate of the time it would take for a moving object to contact a stationary one. The principal results of Experiment 1 were main effects of the rate of the auditory rhythm and of the size of the occlusion on participants' judgments. In Experiment 2, the period of visibility was varied with size of the occlusion area to keep the total movement time constant for all three levels of occlusion. The results replicated the main effect of rhythm found in Experiment 1 and showed a small, significant interaction, but indicated no main effect of occlusion size. Overall, the results indicate that exposure to fast isochronous auditory rhythms during an interval of inferred motion can influence the imagined rate of such motion and suggest a possible role of an internal rhythmicity in the maintenance of temporally accurate dynamic mental representations.
30
Møller C, Stupacher J, Celma-Miralles A, Vuust P. Beat perception in polyrhythms: Time is structured in binary units. PLoS One 2021; 16:e0252174. [PMID: 34415911 PMCID: PMC8378699 DOI: 10.1371/journal.pone.0252174] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/08/2021] [Accepted: 08/01/2021] [Indexed: 11/19/2022] Open
Abstract
In everyday life, we group and subdivide time to understand the sensory environment surrounding us. Organizing time in units, such as diurnal rhythms, phrases, and beat patterns, is fundamental to behavior, speech, and music. When listening to music, our perceptual system extracts and nests rhythmic regularities to create a hierarchical metrical structure that enables us to predict the timing of the next events. Foot tapping and head bobbing to musical rhythms are observable evidence of this process. In the special case of polyrhythms, at least two metrical structures compete to become the reference for these temporal regularities, rendering several possible beats with which we can synchronize our movements. While there is general agreement that tempo, pitch, and loudness influence beat perception in polyrhythms, we focused on the yet neglected influence of beat subdivisions, i.e., the least common denominator of a polyrhythm ratio. In three online experiments, 300 participants listened to a range of polyrhythms and tapped their index fingers in time with the perceived beat. The polyrhythms consisted of two simultaneously presented isochronous pulse trains with different ratios (2:3, 2:5, 3:4, 3:5, 4:5, 5:6) and different tempi. For ratios 2:3 and 3:4, we additionally manipulated the pitch of the pulse trains. Results showed a highly robust influence of subdivision grouping on beat perception. This was manifested as a propensity towards beats that are subdivided into two or four equally spaced units, as opposed to beats with three or more complex groupings of subdivisions. Additionally, lower pitched pulse trains were more often perceived as the beat. Our findings suggest that subdivisions, not beats, are the basic unit of beat perception, and that the principle underlying the binary grouping of subdivisions reflects a propensity towards simplicity. This preference for simple grouping is widely applicable to human perception and cognition of time.
Affiliation(s)
- Cecilie Møller
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Jan Stupacher
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Alexandre Celma-Miralles
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg, Aarhus C, Denmark
31
Varlet M, Nozaradan S, Trainor L, Keller PE. Dynamic Modulation of Beta Band Cortico-Muscular Coupling Induced by Audio-Visual Rhythms. Cereb Cortex Commun 2021; 1:tgaa043. [PMID: 34296112 PMCID: PMC8263089 DOI: 10.1093/texcom/tgaa043] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2020] [Revised: 07/27/2020] [Accepted: 07/28/2020] [Indexed: 12/18/2022] Open
Abstract
Human movements often spontaneously fall into synchrony with auditory and visual environmental rhythms. Related behavioral studies have shown that motor responses are automatically and unintentionally coupled with external rhythmic stimuli. However, the neurophysiological processes underlying such motor entrainment remain largely unknown. Here, we investigated with electroencephalography (EEG) and electromyography (EMG) the modulation of neural and muscular activity induced by periodic audio and/or visual sequences. The sequences were presented at either 1 or 2 Hz, while participants maintained constant finger pressure on a force sensor. The results revealed that although there was no change of amplitude in participants' EMG in response to the sequences, the synchronization between EMG and EEG recorded over motor areas in the beta (12-40 Hz) frequency band was dynamically modulated, with maximal coherence occurring about 100 ms before each stimulus. These modulations in beta EEG-EMG motor coherence were found for the 2-Hz audio-visual sequences, confirming at a neurophysiological level the enhancement of motor entrainment with multimodal rhythms that fall within preferred perceptual and movement frequency ranges. Our findings identify beta band cortico-muscular coupling as a potential underlying mechanism of motor entrainment, further elucidating the nature of the link between sensory and motor systems in humans.
Affiliation(s)
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Laurel Trainor
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
32
Schirmer A, Wijaya M, Chiu MH, Maess B, Gunter TC. Musical rhythm effects on visual attention are non-rhythmical: evidence against metrical entrainment. Soc Cogn Affect Neurosci 2021; 16:58-71. [PMID: 32507877 PMCID: PMC7812633 DOI: 10.1093/scan/nsaa077] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/22/2020] [Revised: 05/26/2020] [Accepted: 06/02/2020] [Indexed: 12/11/2022] Open
Abstract
The idea that external rhythms synchronize attention cross-modally has attracted much interest and scientific inquiry. Yet, whether the associated attentional modulations are indeed rhythmical, in that they spring from and map onto an underlying meter, has not been clearly established. Here we tested this idea while addressing the shortcomings of previous work associated with confounding (i) metricality and regularity, (ii) rhythmic and temporal expectations, or (iii) global and local temporal effects. We designed sound sequences that varied orthogonally (high/low) in metricality and regularity and presented them as task-irrelevant auditory background in four separate blocks. The participants' task was to detect rare visual targets occurring at a silent metrically aligned or misaligned temporal position. We found that target timing was irrelevant for reaction times and visual event-related potentials. High background regularity, and to a lesser extent metricality, facilitated target processing across metrically aligned and misaligned positions. Additionally, high regularity modulated auditory background frequencies in the EEG recorded over occipital cortex. We conclude that external rhythms, rather than synchronizing attention cross-modally, confer general, nontemporal benefits. Their predictability conserves processing resources that then benefit stimulus representations in other modalities.
Affiliation(s)
- Annett Schirmer
- Department of Psychology, The Chinese University of Hong Kong, 3rd Floor, Sino Building, Shatin, N.T., Hong Kong
- Maria Wijaya
- Department of Psychology, The Chinese University of Hong Kong, Shatin, Hong Kong SAR
- Man Hey Chiu
- Department of Psychology, The Chinese University of Hong Kong, Shatin, Hong Kong SAR
- Burkhard Maess
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig 04103, Germany
- Thomas C Gunter
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig 04103, Germany
33
Rosso M, Leman M, Moumdjian L. Neural Entrainment Meets Behavior: The Stability Index as a Neural Outcome Measure of Auditory-Motor Coupling. Front Hum Neurosci 2021; 15:668918. [PMID: 34177492 PMCID: PMC8219856 DOI: 10.3389/fnhum.2021.668918] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/17/2021] [Accepted: 04/29/2021] [Indexed: 01/23/2023] Open
Abstract
Understanding rhythmic behavior in the context of coupled auditory and motor systems has been of interest to neurological rehabilitation, in particular, to facilitate walking. Recent work based on behavioral measures revealed an entrainment effect of auditory rhythms on motor rhythms. In this study, we propose a method to compute the neural component of such a process from an electroencephalographic (EEG) signal. A simple auditory-motor synchronization paradigm was used, where 28 healthy participants were instructed to synchronize their finger-tapping with a metronome. The computation of the neural outcome measure was carried out in two blocks. In the first block, we used Generalized Eigendecomposition (GED) to reduce the data dimensionality to the component which maximally entrained to the metronome frequency. The scalp topography pointed at brain activity over contralateral sensorimotor regions. In the second block, we computed instantaneous frequency from the analytic signal of the extracted component. This returned a time-varying measure of frequency fluctuations, whose standard deviation provided our "stability index" as a neural outcome measure of auditory-motor coupling. Finally, the proposed neural measure was validated by conducting a correlation analysis with a set of behavioral outcomes from the synchronization task: resultant vector length, relative phase angle, mean asynchrony, and tempo matching. Significant moderate negative correlations were found with the first three measures, suggesting that the stability index provided a quantifiable neural outcome measure of entrainment, with selectivity towards phase-correction mechanisms. We discuss further adoption of the proposed approach, especially with populations where sensorimotor abilities are compromised by an underlying pathological condition. The stability index can potentially be used as an outcome measure to assess rehabilitation protocols, and may provide further insight into neuropathological models of auditory-motor coupling.
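The second block of the pipeline, instantaneous frequency from the analytic signal, can be sketched with standard tools (the GED spatial-filtering step is omitted here, and the signals, 2 Hz tempo, and jitter level are illustrative assumptions):

```python
import numpy as np
from scipy.signal import hilbert

fs = 250                          # sampling rate in Hz (assumed)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)

# Two toy "neural components" near a 2 Hz tapping tempo: one with a stable
# frequency, one whose instantaneous frequency is jittered (values illustrative)
stable = np.sin(2 * np.pi * 2.0 * t)
phase_jit = np.cumsum(2.0 + 0.3 * rng.standard_normal(t.size)) / fs
unstable = np.sin(2 * np.pi * phase_jit)

def stability_index(x, fs):
    """Std (Hz) of instantaneous frequency from the analytic signal,
    trimming 1 s at each edge to avoid Hilbert end effects."""
    phase = np.unwrap(np.angle(hilbert(x)))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)
    return np.std(inst_freq[fs:-fs])

si_stable = stability_index(stable, fs)
si_unstable = stability_index(unstable, fs)
```

A lower value indicates steadier entrained oscillation; in the study this scalar is what correlates (negatively) with the behavioral synchronization outcomes.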
Affiliation(s)
- Mattia Rosso
- Institute of Psychoacoustics and Electronic Music (IPEM), Faculty of Arts and Philosophy, Ghent University, Ghent, Belgium
- Marc Leman
- Institute of Psychoacoustics and Electronic Music (IPEM), Faculty of Arts and Philosophy, Ghent University, Ghent, Belgium
- Lousin Moumdjian
- Institute of Psychoacoustics and Electronic Music (IPEM), Faculty of Arts and Philosophy, Ghent University, Ghent, Belgium
- UMSC Hasselt-Pelt, Limburg, Belgium
- REVAL Rehabilitation Research Center, Faculty of Rehabilitation Sciences, Limburg, Belgium
34
Cannon J. Expectancy-based rhythmic entrainment as continuous Bayesian inference. PLoS Comput Biol 2021; 17:e1009025. [PMID: 34106918 PMCID: PMC8216548 DOI: 10.1371/journal.pcbi.1009025] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/06/2020] [Revised: 06/21/2021] [Accepted: 04/29/2021] [Indexed: 11/18/2022] Open
Abstract
When presented with complex rhythmic auditory stimuli, humans are able to track underlying temporal structure (e.g., a "beat"), both covertly and with their movements. This capacity goes far beyond that of a simple entrained oscillator, drawing on contextual and enculturated timing expectations and adjusting rapidly to perturbations in event timing, phase, and tempo. Previous modeling work has described how entrainment to rhythms may be shaped by event timing expectations, but sheds little light on any underlying computational principles that could unify the phenomenon of expectation-based entrainment with other brain processes. Inspired by the predictive processing framework, we propose that the problem of rhythm tracking is naturally characterized as a problem of continuously estimating an underlying phase and tempo based on precise event times and their correspondence to timing expectations. We present two inference problems formalizing this insight: PIPPET (Phase Inference from Point Process Event Timing) and PATIPPET (Phase and Tempo Inference). Variational solutions to these inference problems resemble previous "Dynamic Attending" models of perceptual entrainment, but introduce new terms representing the dynamics of uncertainty and the influence of expectations in the absence of sensory events. These terms allow us to model multiple characteristics of covert and motor human rhythm tracking not addressed by other models, including sensitivity of error corrections to inter-event interval and perceived tempo changes induced by event omissions. We show that positing these novel influences in human entrainment yields a range of testable behavioral predictions. Guided by recent neurophysiological observations, we attempt to align the phase inference framework with a specific brain implementation. We also explore the potential of this normative framework to guide the interpretation of experimental data and serve as building blocks for even richer predictive processing and active inference models of timing.
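The core idea of the abstract, an internal phase that advances at an expected tempo and is corrected at observed event times, can be illustrated with a deliberately simplified toy model. This is not PIPPET itself (it has no uncertainty dynamics or variational inference); all parameter values are hypothetical:

```python
import numpy as np

def track(event_times, tempo=2.0, gain=0.7, dt=0.001, t_end=5.0):
    """Toy tracker: internal phase (in cycles) advances at `tempo` (Hz) and is
    partially corrected toward the expected event phase (0) at each event.
    Returns the absolute wrapped phase error observed at each event."""
    phase, t, errors = 0.0, 0.0, []
    events = list(event_times)
    i = 0
    while t < t_end:
        phase += tempo * dt
        if i < len(events) and t >= events[i]:
            err = (phase + 0.5) % 1.0 - 0.5   # wrapped deviation from expected phase
            phase -= gain * err               # partial correction toward expectation
            errors.append(abs(err))
            i += 1
        t += dt
    return errors

# Metronome slightly faster (0.48 s period) than the internal tempo (0.5 s):
# with correction the error stays bounded; without it the error accumulates.
metronome = np.arange(0.5, 5.0, 0.48)
locked = track(metronome, gain=0.7)
drifting = track(metronome, gain=0.0)
```

The full PIPPET/PATIPPET models replace the fixed `gain` with an uncertainty-dependent weighting and additionally update the tempo estimate, which is what yields the interval-sensitive corrections and omission-induced tempo changes described above.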
Affiliation(s)
- Jonathan Cannon
- Department of Brain and Cognitive Science, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
35
Nijhuis P, Keller PE, Nozaradan S, Varlet M. Dynamic modulation of cortico-muscular coupling during real and imagined sensorimotor synchronisation. Neuroimage 2021; 238:118209. [PMID: 34051354 DOI: 10.1016/j.neuroimage.2021.118209] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2020] [Revised: 04/19/2021] [Accepted: 05/10/2021] [Indexed: 12/20/2022] Open
Abstract
People have a natural and intrinsic ability to coordinate body movements with rhythms surrounding them, known as sensorimotor synchronisation. This can be observed in daily environments, when dancing or singing along with music, or spontaneously walking, talking or applauding in synchrony with one another. However, the neurophysiological mechanisms underlying accurately synchronised movement with selected rhythms in the environment remain unclear. Here we studied real and imagined sensorimotor synchronisation with interleaved auditory and visual rhythms using cortico-muscular coherence (CMC) to better understand the processes underlying the preparation and execution of synchronised movement. Electroencephalography (EEG), electromyography (EMG) from the finger flexors, and continuous force signals were recorded in 20 participants during tapping and imagined tapping with discrete stimulus sequences consisting of alternating auditory beeps and visual flashes. The results show that the synchronisation between cortical and muscular activity in the beta (14-38 Hz) frequency band becomes time-locked to the taps executed in synchrony with the visual and auditory stimuli. Dynamic modulation in CMC also occurred when participants imagined tapping with the visual stimuli, but with lower amplitude and a different temporal profile compared to real tapping. These results suggest that CMC does not only reflect changes related to the production of the synchronised movement, but also to its preparation, which appears heightened under higher attentional demands imposed when synchronising with the visual stimuli. These findings highlight a critical role of beta band neural oscillations in the cortical-muscular coupling underlying sensorimotor synchronisation.
Affiliation(s)
- Patti Nijhuis
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia; School of Psychology, Western Sydney University, Sydney, Australia
36
Gilmore SA, Russo FA. Neural and Behavioral Evidence for Vibrotactile Beat Perception and Bimodal Enhancement. J Cogn Neurosci 2021; 33:635-650. [PMID: 33475449] [DOI: 10.1162/jocn_a_01673]
Abstract
The ability to synchronize movements to a rhythmic stimulus, referred to as sensorimotor synchronization (SMS), is a behavioral measure of beat perception. Although SMS is generally superior when rhythms are presented in the auditory modality, recent research has demonstrated near-equivalent SMS for vibrotactile presentations of isochronous rhythms [Ammirante, P., Patel, A. D., & Russo, F. A. Synchronizing to auditory and tactile metronomes: A test of the auditory-motor enhancement hypothesis. Psychonomic Bulletin & Review, 23, 1882-1890, 2016]. The current study aimed to replicate and extend these findings by incorporating a neural measure of beat perception. Nonmusicians were asked to tap to rhythms or to listen passively while EEG data were collected. Rhythmic complexity (isochronous, nonisochronous) and presentation modality (auditory, vibrotactile, bimodal) were fully crossed. Tapping data were consistent with those observed by Ammirante et al. (2016), revealing near-equivalent SMS for isochronous rhythms across modality conditions and a drop-off in SMS for nonisochronous rhythms, especially in the vibrotactile condition. EEG data revealed a greater degree of neural entrainment for isochronous compared to nonisochronous trials as well as for auditory and bimodal compared to vibrotactile trials. These findings led us to three main conclusions. First, isochronous rhythms lead to higher levels of beat perception than nonisochronous rhythms across modalities. Second, beat perception is generally enhanced for auditory presentations of rhythm but still possible under vibrotactile presentation conditions. Finally, exploratory analysis of neural entrainment at harmonic frequencies suggests that beat perception may be enhanced for bimodal presentations of rhythm.
37
Lu L, Sheng J, Liu Z, Gao JH. Neural representations of imagined speech revealed by frequency-tagged magnetoencephalography responses. Neuroimage 2021; 229:117724. [PMID: 33421593] [DOI: 10.1016/j.neuroimage.2021.117724]
Abstract
Speech mental imagery is a quasi-perceptual experience that occurs in the absence of real speech stimulation. How imagined speech with higher-order structures such as words, phrases and sentences is rapidly organized and internally constructed remains elusive. To address this issue, subjects were tasked with imagining and perceiving poems along with a sequence of reference sounds presented at a rate of 4 Hz while magnetoencephalography (MEG) recordings were conducted. Given that a sentence in a traditional Chinese poem comprises five syllables, a sentential rhythm was generated at a distinctive frequency of 0.8 Hz. Using frequency tagging, we concurrently tracked the neural processing timescales of the top-down generation of rhythmic constructs embedded in speech mental imagery and of the bottom-up sensory-driven activity, which were precisely tagged at the sentence-level rate of 0.8 Hz and the stimulus-level rate of 4 Hz, respectively. We found similar neural responses induced by the internal construction of sentences from syllables with both imagined and perceived poems, and further revealed shared and distinct cohorts of cortical areas corresponding to the sentence-level rhythm in imagery and perception. This study supports the view of a common mechanism between imagery and perception by illustrating the neural representations of higher-order rhythmic structures embedded in imagined and perceived speech.
Affiliation(s)
- Lingxi Lu
- PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing, 100871 China; Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, 100871 China; Center for the Cognitive Science of Language, Beijing Language and Culture University, Beijing, 100083 China
- Jingwei Sheng
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, 100871 China; Beijing Quanmag Healthcare, Beijing, 100195 China
- Zhaowei Liu
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, 100871 China; Center for Excellence in Brain Science and Intelligence Technology (Institute of Neuroscience), Chinese Academy of Science, Shanghai, 200031 China
- Jia-Hong Gao
- PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing, 100871 China; Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, 100871 China; Beijing City Key Lab for Medical Physics and Engineering, Institute of Heavy Ion Physics, School of Physics, Peking University, Beijing, 100871 China
38
Bouvet CJ, Bardy BG, Keller PE, Dalla Bella S, Nozaradan S, Varlet M. Accent-induced Modulation of Neural and Movement Patterns during Spontaneous Synchronization to Auditory Rhythms. J Cogn Neurosci 2020; 32:2260-2271. [DOI: 10.1162/jocn_a_01605]
Abstract
Human rhythmic movements spontaneously synchronize with auditory rhythms at various frequency ratios. The emergence of more complex relationships—for instance, frequency ratios of 1:2 and 1:3—is enhanced by adding a congruent accentuation pattern (binary for 1:2 and ternary for 1:3), resulting in a 1:1 movement–accentuation relationship. However, this benefit of accentuation on movement synchronization appears to be stronger for the ternary pattern than for the binary pattern. Here, we investigated whether this difference in accent-induced movement synchronization may be related to a difference in the neural tracking of these accentuation profiles. Accented and control unaccented auditory sequences were presented to participants who concurrently produced finger taps at their preferred frequency, and spontaneous movement synchronization was measured. EEG was recorded during passive listening to each auditory sequence. The results revealed that enhanced movement synchronization with ternary accentuation was accompanied by enhanced neural tracking of this pattern. Larger EEG responses at the accentuation frequency were found for the ternary pattern compared with the binary pattern. Moreover, the amplitude of accent-induced EEG responses was positively correlated with the magnitude of accent-induced movement synchronization across participants. Altogether, these findings show that the dynamics of spontaneous auditory–motor synchronization are strongly driven by the multi-time-scale sensory processing of auditory rhythms, highlighting the importance of considering neural responses to rhythmic sequences for understanding and enhancing synchronization performance.
Affiliation(s)
- Simone Dalla Bella
- Université Montpellier
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
- University of Montreal
- University of Economics and Human Sciences in Warsaw
- Sylvie Nozaradan
- Western Sydney University
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
- Université Catholique de Louvain
39
Prefrontal High Gamma in ECoG Tags Periodicity of Musical Rhythms in Perception and Imagination. eNeuro 2020; 7:ENEURO.0413-19.2020. [PMID: 32586843] [PMCID: PMC7405071] [DOI: 10.1523/eneuro.0413-19.2020]
Abstract
Rhythmic auditory stimuli are known to elicit matching activity patterns in neural populations. Furthermore, recent research has established the particular importance of high-gamma brain activity in auditory processing by showing its involvement in auditory phrase segmentation and envelope tracking. Here, we used electrocorticographic (ECoG) recordings from eight human listeners to examine whether periodicities in high-gamma activity track the periodicities in the envelope of musical rhythms during rhythm perception and imagination. Rhythm imagination was elicited by instructing participants to imagine the rhythm continuing during pauses of several repetitions. To identify electrodes whose periodicities in high-gamma activity track the periodicities in the musical rhythms, we computed the correlation between the autocorrelations (ACCs) of both the musical rhythms and the neural signals. A condition in which participants listened to white noise was used to establish a baseline. High-gamma autocorrelations in auditory areas in the superior temporal gyrus and in frontal areas on both hemispheres significantly matched the autocorrelations of the musical rhythms. Overall, numerous significant electrodes were observed on the right hemisphere. Of particular interest is a large cluster of electrodes in the right prefrontal cortex that was active during both rhythm perception and imagination, indicating conscious processing of the rhythms' structure as opposed to mere auditory phenomena. The autocorrelation approach clearly highlights that high-gamma activity measured from cortical electrodes tracks both attended and imagined rhythms.
40
Lenc T, Keller PE, Varlet M, Nozaradan S. Neural and Behavioral Evidence for Frequency-Selective Context Effects in Rhythm Processing in Humans. Cereb Cortex Commun 2020; 1:tgaa037. [PMID: 34296106] [PMCID: PMC8152888] [DOI: 10.1093/texcom/tgaa037]
Abstract
When listening to music, people often perceive and move along with a periodic meter. However, the dynamics of mapping between meter perception and the acoustic cues to meter periodicities in the sensory input remain largely unknown. To capture these dynamics, we recorded electroencephalography (EEG) while nonmusician and musician participants listened to nonrepeating rhythmic sequences, where acoustic cues to meter frequencies either gradually decreased (from regular to degraded) or increased (from degraded to regular). The results revealed greater neural activity selectively elicited at meter frequencies when the sequence gradually changed from regular to degraded compared with the opposite. Importantly, this effect was unlikely to arise from overall gain, or low-level auditory processing, as revealed by physiological modeling. Moreover, the context effect was more pronounced in nonmusicians, who also demonstrated facilitated sensory-motor synchronization with the meter for sequences that started as regular. In contrast, musicians showed weaker effects of recent context in their neural responses and robust ability to move along with the meter irrespective of stimulus degradation. Together, our results demonstrate that brain activity elicited by rhythm reflects not only passive tracking of stimulus features but also continuous integration of sensory input with recent context.
Affiliation(s)
- Tomas Lenc
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Manuel Varlet
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- School of Psychology, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Sylvie Nozaradan
- MARCS Institute for Brain, Behaviour, and Development, Western Sydney University, Penrith, Sydney, NSW 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal QC H3C 3J7, Canada
41
Music as a scaffold for listening to speech: Better neural phase-locking to song than speech. Neuroimage 2020; 214:116767. [DOI: 10.1016/j.neuroimage.2020.116767]
42
Rajendran VG, Harper NS, Schnupp JWH. Auditory cortical representation of music favours the perceived beat. R Soc Open Sci 2020; 7:191194. [PMID: 32269783] [PMCID: PMC7137933] [DOI: 10.1098/rsos.191194]
Abstract
Previous research has shown that musical beat perception is a surprisingly complex phenomenon involving widespread neural coordination across higher-order sensory, motor and cognitive areas. However, the question of how low-level auditory processing must necessarily shape these dynamics, and therefore perception, is not well understood. Here, we present evidence that the auditory cortical representation of music, even in the absence of motor or top-down activations, already favours the beat that will be perceived. Extracellular firing rates in the rat auditory cortex were recorded in response to 20 musical excerpts diverse in tempo and genre, for which musical beat perception had been characterized by the tapping behaviour of 40 human listeners. We found that firing rates in the rat auditory cortex were on average higher on the beat than off the beat. This 'neural emphasis' distinguished the beat that was perceived from other possible interpretations of the beat, was predictive of the degree of tapping consensus across human listeners, and was accounted for by a spectrotemporal receptive field model. These findings strongly suggest that the 'bottom-up' processing of music performed by the auditory system predisposes the timing and clarity of the perceived musical beat.
Affiliation(s)
- Vani G. Rajendran
- Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Department of Biomedical Sciences, City University of Hong Kong, Kowloon Tong, Hong Kong
- Nicol S. Harper
- Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Jan W. H. Schnupp
- Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Department of Biomedical Sciences, City University of Hong Kong, Kowloon Tong, Hong Kong
|
43
|
Varlet M, Nozaradan S, Nijhuis P, Keller PE. Neural tracking and integration of ‘self’ and ‘other’ in improvised interpersonal coordination. Neuroimage 2020; 206:116303. [DOI: 10.1016/j.neuroimage.2019.116303]
44
Kaneshiro B, Nguyen DT, Norcia AM, Dmochowski JP, Berger J. Natural music evokes correlated EEG responses reflecting temporal structure and beat. Neuroimage 2020; 214:116559. [PMID: 31978543] [DOI: 10.1016/j.neuroimage.2020.116559]
Abstract
The brain activity of multiple subjects has been shown to synchronize during salient moments of natural stimuli, suggesting that correlation of neural responses indexes a brain state operationally termed 'engagement'. While past electroencephalography (EEG) studies have considered both auditory and visual stimuli, the extent to which these results generalize to music, a temporally structured stimulus for which the brain has evolved specialized circuitry, is less understood. Here we investigated neural correlation during natural music listening by recording EEG responses from N=48 adult listeners as they heard real-world musical works, some of which were temporally disrupted through shuffling of short-term segments (measures), reversal, or randomization of phase spectra. We measured correlation between multiple neural responses (inter-subject correlation) and between neural responses and stimulus envelope fluctuations (stimulus-response correlation) in the time and frequency domains. Stimuli retaining basic musical features, such as rhythm and melody, elicited significantly higher behavioral ratings and neural correlation than did phase-scrambled controls. However, while unedited songs were self-reported as most pleasant, time-domain correlations were highest during measure-shuffled versions. Frequency-domain measures of correlation (coherence) peaked at frequencies related to the musical beat, although the magnitudes of these spectral peaks did not explain the observed temporal correlations. Our findings show that natural music evokes significant inter-subject and stimulus-response correlations, and suggest that the neural correlates of musical 'engagement' may be distinct from those of enjoyment.
Affiliation(s)
- Blair Kaneshiro
- Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA, USA; Center for the Study of Language and Information, Stanford University, Stanford, CA, USA; Department of Otolaryngology Head & Neck Surgery, Stanford University School of Medicine, Palo Alto, CA, USA
- Duc T Nguyen
- Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA, USA; Center for the Study of Language and Information, Stanford University, Stanford, CA, USA; Department of Biomedical Engineering, City College of New York, New York, NY, USA
- Anthony M Norcia
- Department of Psychology, Stanford University, Stanford, CA, USA
- Jacek P Dmochowski
- Department of Biomedical Engineering, City College of New York, New York, NY, USA; Department of Psychology, Stanford University, Stanford, CA, USA
- Jonathan Berger
- Center for Computer Research in Music and Acoustics, Stanford University, Stanford, CA, USA
45
Niesen M, Vander Ghinst M, Bourguignon M, Wens V, Bertels J, Goldman S, Choufani G, Hassid S, De Tiège X. Tracking the Effects of Top-Down Attention on Word Discrimination Using Frequency-tagged Neuromagnetic Responses. J Cogn Neurosci 2020; 32:877-888. [PMID: 31933439] [DOI: 10.1162/jocn_a_01522]
Abstract
Discrimination of words from nonspeech sounds is essential in communication. Still, how selective attention can influence this early step of speech processing remains elusive. To answer that question, brain activity was recorded with magnetoencephalography in 12 healthy adults while they listened to two sequences of auditory stimuli presented at 2.17 Hz, consisting of successions of one randomized word (tagging frequency = 0.54 Hz) and three acoustically matched nonverbal stimuli. Participants were instructed to focus their attention on the occurrence of a predefined word in the verbal attention condition and on a nonverbal stimulus in the nonverbal attention condition. Steady-state neuromagnetic responses were identified with spectral analysis at sensor and source levels. Significant sensor responses peaked at 0.54 and 2.17 Hz in both conditions. Sources at 0.54 Hz were reconstructed in supratemporal auditory cortex, left superior temporal gyrus (STG), left middle temporal gyrus, and left inferior frontal gyrus. Sources at 2.17 Hz were reconstructed in supratemporal auditory cortex and STG. Crucially, source strength in the left STG at 0.54 Hz was significantly higher in verbal attention than in nonverbal attention condition. This study demonstrates speech-sensitive responses at primary auditory and speech-related neocortical areas. Critically, it highlights that, during word discrimination, top-down attention modulates activity within the left STG. This area therefore appears to play a crucial role in selective verbal attentional processes for this early step of speech processing.
46
Lu L, Wang Q, Sheng J, Liu Z, Qin L, Li L, Gao JH. Neural tracking of speech mental imagery during rhythmic inner counting. eLife 2019; 8:48971. [PMID: 31635693] [PMCID: PMC6805153] [DOI: 10.7554/elife.48971]
Abstract
The subjective inner experience of mental imagery is among the most ubiquitous human experiences in daily life. Elucidating the neural implementation underpinning the dynamic construction of mental imagery is critical to understanding high-order cognitive function in the human brain. Here, we applied a frequency-tagging method to isolate the top-down process of speech mental imagery from bottom-up sensory-driven activities and concurrently tracked the neural processing time scales corresponding to the two processes in human subjects. Notably, by estimating the source of the magnetoencephalography (MEG) signals, we identified isolated brain networks activated at the imagery-rate frequency. In contrast, more extensive brain regions in the auditory temporal cortex were activated at the stimulus-rate frequency. Furthermore, intracranial stereotactic electroencephalogram (sEEG) evidence confirmed the participation of the inferior frontal gyrus in generating speech mental imagery. Our results indicate that a disassociated neural network underlies the dynamic construction of speech mental imagery independent of auditory perception.
Affiliation(s)
- Lingxi Lu
- PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing, China; Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China
- Qian Wang
- Department of Clinical Neuropsychology, Sanbo Brain Hospital, Capital Medical University, Beijing, China
- Jingwei Sheng
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China
- Zhaowei Liu
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China
- Lang Qin
- Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China; Department of Linguistics, The University of Hong Kong, Hong Kong, China
- Liang Li
- Speech and Hearing Research Center, School of Psychological and Cognitive Sciences, Peking University, Beijing, China
- Jia-Hong Gao
- PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing, China; Center for MRI Research, Academy for Advanced Interdisciplinary Studies, Peking University, Beijing, China; Beijing City Key Lab for Medical Physics and Engineering, Institution of Heavy Ion Physics, School of Physics, Peking University, Beijing, China
47
Takehana A, Uehara T, Sakaguchi Y. Audiovisual synchrony perception in observing human motion to music. PLoS One 2019; 14:e0221584. [PMID: 31454393] [PMCID: PMC6711538] [DOI: 10.1371/journal.pone.0221584]
Abstract
To examine how individuals perceive synchrony between music and body motion, we investigated the characteristics of synchrony perception during observation of a Japanese Radio Calisthenics routine. We used the constant stimuli method to present video clips of an individual performing an exercise routine. We generated stimuli with a range of temporal shifts between the visual and auditory streams, and asked participants to make synchrony judgments. We then examined which movement-feature points agreed with music beats when the participants perceived synchrony. We found that extremities (e.g., hands and feet) reached the movement endpoint or moved through the lowest position at music beats associated with synchrony. Movement onsets never agreed with music beats. To investigate whether visual information about the feature points was necessary for synchrony perception, we conducted a second experiment where only limited portions of video clips were presented to the participants. Participants consistently judged synchrony even when the video image did not contain the critical feature points, suggesting that a prediction mechanism contributes to synchrony perception. To discuss the meaning of these feature points with respect to synchrony perception, we examined the temporal relationship between the motion of body parts and the ground reaction force (GRF) of exercise performers, which reflected the total force acting on the performer. Interestingly, vertical GRF showed local peaks consistently synchronized with music beats for most exercises, with timing that was closely correlated with the timing of movement feature points. This result suggests that synchrony perception in humans is based on some global variable anticipated from visual information, instead of the feature points found in the motion of individual body parts. In summary, the present results indicate that synchrony perception during observation of human motion to music depends largely on spatiotemporal prediction of the performer's motion.
Affiliation(s)
- Akira Takehana
- Department of Mechanical Engineering and Intelligent Systems, Graduate School of Informatics and Engineering, University of Electro-Communications, Chofu, Tokyo, Japan
- Tsukasa Uehara
- Department of Mechanical Engineering and Intelligent Systems, Graduate School of Informatics and Engineering, University of Electro-Communications, Chofu, Tokyo, Japan
- Yutaka Sakaguchi
- Department of Mechanical Engineering and Intelligent Systems, Graduate School of Informatics and Engineering, University of Electro-Communications, Chofu, Tokyo, Japan
- Research Center for Performance Art Science, University of Electro-Communications, Chofu, Tokyo, Japan
48
Zelic G, Nijhuis P, Charaf SA, Keller PE, Davis C, Kim J, Varlet M. The influence of pacer-movement continuity and pattern matching on auditory-motor synchronisation. Exp Brain Res 2019; 237:2705-2713. [DOI: 10.1007/s00221-019-05625-9]
49
Doelling KB, Assaneo MF, Bevilacqua D, Pesaran B, Poeppel D. An oscillator model better predicts cortical entrainment to music. Proc Natl Acad Sci U S A 2019; 116:10113-10121. [PMID: 31019082] [PMCID: PMC6525506] [DOI: 10.1073/pnas.1816414116]
Abstract
A body of research demonstrates convincingly a role for synchronization of auditory cortex to rhythmic structure in sounds including speech and music. Some studies hypothesize that an oscillator in auditory cortex could underlie important temporal processes such as segmentation and prediction. An important critique of these findings raises the plausible concern that what is measured is perhaps not an oscillator but is instead a sequence of evoked responses. The two distinct mechanisms could look very similar in the case of rhythmic input, but an oscillator might better provide the computational roles mentioned above (i.e., segmentation and prediction). We advance an approach to adjudicate between the two models: analyzing the phase lag between stimulus and neural signal across different stimulation rates. We ran numerical simulations of evoked and oscillatory computational models, showing that in the evoked case, phase lag is heavily rate-dependent, while the oscillatory model displays marked phase concentration across stimulation rates. Next, we compared these model predictions with magnetoencephalography data recorded while participants listened to music of varying note rates. Our results show that the phase concentration of the experimental data is more in line with the oscillatory model than with the evoked model. This finding supports an auditory cortical signal that (i) contains components of both bottom-up evoked responses and internal oscillatory synchronization whose strengths are weighted by their appropriateness for particular stimulus types and (ii) cannot be explained by evoked responses alone.
Affiliation(s)
- Keith B Doelling
- Department of Psychology, New York University, New York, NY 10003
- Dana Bevilacqua
- Department of Psychology, New York University, New York, NY 10003
- Bijan Pesaran
- Center for Neural Science, New York University, New York, NY 10003
- David Poeppel
- Department of Psychology, New York University, New York, NY 10003
- Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, 60322 Frankfurt am Main, Germany
50
Hove MJ, Vuust P, Stupacher J. Increased levels of bass in popular music recordings 1955-2016 and their relation to loudness. J Acoust Soc Am 2019; 145:2247. [PMID: 31046334] [DOI: 10.1121/1.5097587]
Abstract
The sound of recorded music has changed over time. These changes can be captured by different audio features. Over the past decades, popular songs have shown clear increases in root-mean-square (RMS) energy and loudness, but far less attention has been paid to whether this upward trend is more prevalent in specific frequency bands, such as the bass. Bass frequencies are especially important for movement induction, such as foot tapping or dancing, and might offer competitive advantages of capturing attention and increasing engagement. Here, the authors examined the evolution of audio features, such as RMS energy, loudness, and spectral fluctuations (changes in the audio signal's frequency content) in ten frequency bands from songs on the Billboard Hot 100 charts from 1955 to 2016. Over time, RMS energy and loudness increased while dynamic range decreased. The largest increases were found in the bass range: Spectral flux increased most strongly in the lowest frequency bands (0-100 Hz), and when controlling for overall RMS, only the lowest frequency bands showed an increase over time. The upward trend of bass could reflect changes in technology and style; but based on links between bass and movement, it is likely a widespread technique to increase engagement and contribute to chart success.
Affiliation(s)
- Michael J Hove
- Fitchburg State University, Fitchburg, Massachusetts 01420, USA
- Peter Vuust
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Denmark
- Jan Stupacher
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Denmark