1. Lv X, Wang Y, Zhang Y, Ma S, Liu J, Ye K, Wu Y, Voon V, Sun B. Auditory entrainment coordinates cortical-BNST-NAc triple time locking to alleviate the depressive disorder. Cell Rep 2024; 43:114474. [PMID: 39127041] [DOI: 10.1016/j.celrep.2024.114474]
Abstract
Listening to music is a promising and accessible intervention for alleviating symptoms of major depressive disorder. However, the neural mechanisms underlying its antidepressant effects remain unclear. In this study on patients with depression, we used auditory entrainment to evaluate intracranial recordings in the bed nucleus of the stria terminalis (BNST) and nucleus accumbens (NAc), along with temporal scalp electroencephalogram (EEG). We highlight music-induced synchronization across this circuit. The synchronization initiates with temporal theta oscillations, subsequently inducing local gamma oscillations in the BNST-NAc circuit. Critically, the incorporated external entrainment induced a modulatory effect from the auditory cortex to the BNST-NAc circuit, activating the antidepressant response and highlighting the causal role of physiological entrainment in enhancing the antidepressant response. Our study explores the pivotal role of the auditory cortex and proposes a neural oscillation triple time-locking model, emphasizing the capacity of the auditory cortex to access the BNST-NAc circuit.
Affiliation(s)
- Xin Lv
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yuhan Wang
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yingying Zhang
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Neural and Intelligence Engineering Centre, Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Shuo Ma
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Jie Liu
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Kuanghao Ye
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yunhao Wu
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Valerie Voon
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Neural and Intelligence Engineering Centre, Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China; Department of Psychiatry, Addenbrookes Hospital, University of Cambridge, CB2 0QQ Cambridge, UK
- Bomin Sun
- Center of Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China; Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
2. Fram NR, Berger J. Syncopation as Probabilistic Expectation: Conceptual, Computational, and Experimental Evidence. Cogn Sci 2023; 47:e13390. [PMID: 38043104] [DOI: 10.1111/cogs.13390]
Abstract
Definitions of syncopation share two characteristics: the presence of a meter or analogous hierarchical rhythmic structure and a displacement or contradiction of that structure. These attributes are translated in terms of a Bayesian theory of syncopation, where the syncopation of a rhythm is inferred based on a hierarchical structure that is, in turn, learned from the ongoing musical stimulus. Several experiments tested its simplest possible implementation, with equally weighted priors associated with different meters and independence of auditory events, which can be decomposed into two terms representing note density and deviation from a metric hierarchy. A computational simulation demonstrated that extant measures of syncopation fall into two distinct factors analogous to the terms in the simple Bayesian model. Next, a series of behavioral experiments found that perceived syncopation is significantly related to both terms, offering support for the general Bayesian construction of syncopation. However, we also found that the prior expectations associated with different metric structures are not equal across meters and that there is an interaction between density and hierarchical deviation, implying that auditory events are not independent from each other. Together, these findings provide evidence that syncopation is a manifestation of a form of temporal expectation that can be directly represented in Bayesian terms and offer a complementary, feature-driven approach to recent Bayesian models of temporal prediction.
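The two-term decomposition described above can be made concrete with a small numerical sketch. This is not the authors' model: the metric-hierarchy weights, the equal weighting of the two terms, and the helper names (syncopation_terms, syncopation_score) are illustrative assumptions; only the idea of scoring a rhythm by its note density plus its deviation from a metric hierarchy comes from the abstract.

```python
import numpy as np

# Metric-hierarchy weights for a 16-position 4/4 grid (higher = metrically stronger
# position). These particular weights are an illustrative assumption.
HIERARCHY = np.array([4, 0, 1, 0, 2, 0, 1, 0, 3, 0, 1, 0, 2, 0, 1, 0], dtype=float)

def syncopation_terms(onsets, weights=HIERARCHY):
    """Return (density, hierarchy_deviation) for a binary onset pattern on the grid."""
    onsets = np.asarray(onsets, dtype=float)
    density = onsets.mean()                      # proportion of grid positions with an onset
    if onsets.sum() == 0:
        return density, 0.0
    weakness = weights.max() - weights           # 0 on the strongest positions, large off them
    hierarchy_deviation = (onsets * weakness).sum() / onsets.sum()
    return density, hierarchy_deviation

def syncopation_score(onsets, alpha=1.0, beta=1.0):
    """Combine the two terms; the equal weighting (alpha = beta = 1) is an assumption."""
    density, deviation = syncopation_terms(onsets)
    return alpha * density + beta * deviation

# A clave-like pattern scores higher than a strictly on-beat pattern.
straight = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
clave    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0]
print(syncopation_score(straight), syncopation_score(clave))
```

With these assumed weights, the clave-like pattern scores higher than the straight on-beat pattern, matching the intuition that onsets on metrically weak positions read as more syncopated.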
Affiliation(s)
- Noah R Fram
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University
- Department of Otolaryngology, Vanderbilt University Medical Center
- Jonathan Berger
- Center for Computer Research in Music and Acoustics, Department of Music, Stanford University
3. Lenc T, Peter V, Hooper C, Keller PE, Burnham D, Nozaradan S. Infants show enhanced neural responses to musical meter frequencies beyond low-level features. Dev Sci 2023; 26:e13353. [PMID: 36415027] [DOI: 10.1111/desc.13353]
Abstract
Music listening often entails spontaneous perception and body movement to a periodic pulse-like meter. There is increasing evidence that this cross-cultural ability relates to neural processes that selectively enhance metric periodicities, even when these periodicities are not prominent in the acoustic stimulus. However, whether these neural processes emerge early in development remains largely unknown. Here, we recorded the electroencephalogram (EEG) of 20 healthy 5- to 6-month-old infants, while they were exposed to two rhythms known to induce the perception of meter consistently across Western adults. One rhythm contained prominent acoustic periodicities corresponding to the meter, whereas the other rhythm did not. Infants showed significantly enhanced representations of meter periodicities in their EEG responses to both rhythms. This effect is unlikely to reflect the tracking of salient acoustic features in the stimulus, as it was observed irrespective of the prominence of meter periodicities in the audio signals. Moreover, as previously observed in adults, the neural enhancement of meter was greater when the rhythm was delivered by low-pitched sounds. Together, these findings indicate that the endogenous enhancement of metric periodicities beyond low-level acoustic features is a neural property that is already present soon after birth. These high-level neural processes could set the stage for internal representations of musical meter that are critical for human movement coordination during rhythmic musical behavior.
Research highlights:
- 5- to 6-month-old infants were presented with auditory rhythms that induce the perception of a periodic pulse-like meter in adults.
- Infants showed selective enhancement of EEG activity at meter-related frequencies irrespective of the prominence of these frequencies in the stimulus.
- Responses at meter-related frequencies were boosted when the rhythm was conveyed by bass sounds.
- High-level neural processes that transform rhythmic auditory stimuli into internal meter templates emerge early after birth.
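The analysis logic implied here, quantifying EEG amplitude at meter-related versus meter-unrelated frequencies of a repeating rhythm, is commonly implemented as frequency tagging with a local noise-floor correction. The sketch below illustrates that generic logic, not this paper's exact pipeline; the sampling rate, target frequencies, noise-bin settings, and the random stand-in data are all assumptions.

```python
import numpy as np

def frequency_tagged_amplitudes(eeg, fs, target_freqs, noise_bins=(2, 5)):
    """Amplitude spectrum of a trial-averaged EEG epoch, with the local noise floor
    (mean of neighbouring bins) subtracted at each target frequency.

    eeg          -- 1-D array, trial-averaged response to the repeating rhythm
    fs           -- sampling rate in Hz
    target_freqs -- frequencies of interest (meter-related vs unrelated), in Hz
    noise_bins   -- (skip, extent): neighbouring bins used for the noise-floor estimate
    """
    n = len(eeg)
    amp = np.abs(np.fft.rfft(eeg)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    skip, extent = noise_bins
    out = {}
    for f in target_freqs:
        i = int(np.argmin(np.abs(freqs - f)))
        neigh = np.r_[amp[i - skip - extent:i - skip], amp[i + skip + 1:i + skip + 1 + extent]]
        out[f] = amp[i] - neigh.mean()           # signal relative to the local noise floor
    return out

# Usage sketch (placeholder values): compare meter-related vs unrelated frequencies.
fs = 512
t = np.arange(0, 60, 1 / fs)
eeg = np.random.randn(t.size)                    # stand-in for a real averaged EEG epoch
meter_related = [1.25, 2.5, 5.0]                 # e.g. beat rate and harmonics, in Hz
meter_unrelated = [0.75, 1.75, 3.25]
print(frequency_tagged_amplitudes(eeg, fs, meter_related + meter_unrelated))
```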
Affiliation(s)
- Tomas Lenc
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Varghese Peter
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- School of Health and Behavioural Sciences, University of the Sunshine Coast, Queensland, Australia
- Caitlin Hooper
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Peter E Keller
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Center for Music in the Brain & Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Denis Burnham
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Brussels, Belgium
- MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, Canada
4. Huang JK, Yin B. Phylogenic evolution of beat perception and synchronization: a comparative neuroscience perspective. Front Syst Neurosci 2023; 17:1169918. [PMID: 37325439] [PMCID: PMC10264645] [DOI: 10.3389/fnsys.2023.1169918]
Abstract
The study of music has long been of interest to researchers from various disciplines, and scholars have put forth numerous hypotheses regarding the evolution of music. With the rise of cross-species research on music cognition, researchers hope to gain a deeper understanding of the phylogenic evolution, behavioral manifestation, and physiological limitations of the biological ability behind music, known as musicality. This paper reviews progress in cross-species research on beat perception and synchronization (BPS) and surveys the differing views on the hypotheses relevant to BPS. The BPS ability observed in rats and other mammals, together with recent neurobiological findings, presents a significant challenge to the vocal learning and rhythm synchronization hypothesis if taken literally. An integrative neural-circuit model of BPS is proposed to accommodate these findings. In future research, greater consideration should be given to the social attributes of musicality and to the behavioral and physiological changes that occur across different species in response to particular characteristics of music.
Affiliation(s)
- Jin-Kun Huang
- Laboratory for Learning and Behavioral Sciences, School of Psychology, Fujian Normal University, Fuzhou, Fujian, China
- Bin Yin
- Laboratory for Learning and Behavioral Sciences, School of Psychology, Fujian Normal University, Fuzhou, Fujian, China
- Department of Applied Psychology, School of Psychology, Fujian Normal University, Fuzhou, Fujian, China
5. Ito Y, Shiramatsu TI, Ishida N, Oshima K, Magami K, Takahashi H. Spontaneous beat synchronization in rats: Neural dynamics and motor entrainment. Sci Adv 2022; 8:eabo7019. [PMID: 36367945] [PMCID: PMC9651867] [DOI: 10.1126/sciadv.abo7019]
Abstract
Beat perception and synchronization within 120 to 140 beats/min (BPM) are common in humans and frequently used in music composition. Why beat synchronization is uncommon in some species and the mechanism determining the optimal tempo are unclear. Here, we examined physical movements and neural activities in rats to determine their beat sensitivity. Close inspection of head movements and neural recordings revealed that rats displayed prominent beat synchronization and activities in the auditory cortex within 120 to 140 BPM. Mathematical modeling suggests that short-term adaptation underlies this beat tuning. Our results support the hypothesis that the optimal tempo for beat synchronization is determined by the time constant of neural dynamics conserved across species, rather than the species-specific time constant of physical movements. Thus, latent neural propensity for auditory motor entrainment may provide a basis for human entrainment that is much more widespread than currently thought. Further studies comparing humans and animals will offer insights into the origins of music and dancing.
6. Zuk NJ, Murphy JW, Reilly RB, Lalor EC. Envelope reconstruction of speech and music highlights stronger tracking of speech at low frequencies. PLoS Comput Biol 2021; 17:e1009358. [PMID: 34534211] [PMCID: PMC8480853] [DOI: 10.1371/journal.pcbi.1009358]
Abstract
The human brain tracks amplitude fluctuations of both speech and music, which reflects acoustic processing in addition to the encoding of higher-order features and one's cognitive state. Comparing neural tracking of speech and music envelopes can elucidate stimulus-general mechanisms, but direct comparisons are confounded by differences in their envelope spectra. Here, we use a novel method of frequency-constrained reconstruction of stimulus envelopes using EEG recorded during passive listening. We expected to see music reconstruction match speech in a narrow range of frequencies, but instead we found that speech was reconstructed better than music for all frequencies we examined. Additionally, models trained on all stimulus types performed as well or better than the stimulus-specific models at higher modulation frequencies, suggesting a common neural mechanism for tracking speech and music. However, speech envelope tracking at low frequencies, below 1 Hz, was associated with increased weighting over parietal channels, which was not present for the other stimuli. Our results highlight the importance of low-frequency speech tracking and suggest an origin from speech-specific processing in the brain.
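Envelope reconstruction of this kind is typically done with a backward (stimulus-reconstruction) model: a regularized linear mapping from time-lagged, multichannel EEG onto the stimulus envelope, evaluated by the correlation between reconstructed and actual envelopes on held-out data. The sketch below illustrates that generic approach, not the authors' frequency-constrained method; the lag range, ridge parameter, and random stand-in data are assumptions.

```python
import numpy as np

def lagged_design(eeg, lags):
    """Design matrix whose row at time t holds EEG(t + lag) for each lag and channel
    (the neural response follows the stimulus, so positive lags look forward in time)."""
    n, c = eeg.shape
    X = np.zeros((n, c * len(lags)))
    for k, lag in enumerate(lags):
        shifted = np.roll(eeg, -lag, axis=0)
        if lag > 0:
            shifted[-lag:] = 0                   # zero out the wrapped-around samples
        X[:, k * c:(k + 1) * c] = shifted
    return X

def train_decoder(eeg, envelope, lags, lam=1e3):
    """Ridge-regression backward model mapping lagged EEG onto the stimulus envelope."""
    X = lagged_design(eeg, lags)
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ envelope)

def reconstruct(eeg, weights, lags):
    return lagged_design(eeg, lags) @ weights

# Usage sketch with random stand-ins for real data (fs = 128 Hz, decoder lags 0-250 ms).
fs = 128
lags = list(range(0, int(0.25 * fs)))
eeg_train, env_train = np.random.randn(fs * 60, 32), np.random.randn(fs * 60)
eeg_test, env_test = np.random.randn(fs * 30, 32), np.random.randn(fs * 30)
w = train_decoder(eeg_train, env_train, lags)
r = np.corrcoef(reconstruct(eeg_test, w, lags), env_test)[0, 1]
print(f"held-out reconstruction accuracy r = {r:.3f}")
```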
Affiliation(s)
- Nathaniel J. Zuk
- Department of Electronic & Electrical Engineering, Trinity College, The University of Dublin, Dublin, Ireland
- Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College, The University of Dublin, Dublin, Ireland
- Trinity College Institute of Neuroscience, Trinity College, The University of Dublin, Dublin, Ireland
- Department of Biomedical Engineering, University of Rochester, Rochester, New York, United States of America
- Del Monte Institute of Neuroscience, University of Rochester Medical Center, Rochester, New York, United States of America
- Jeremy W. Murphy
- Department of Electronic & Electrical Engineering, Trinity College, The University of Dublin, Dublin, Ireland
- Richard B. Reilly
- Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College, The University of Dublin, Dublin, Ireland
- Trinity College Institute of Neuroscience, Trinity College, The University of Dublin, Dublin, Ireland
- Trinity Centre for Biomedical Engineering, Trinity College, The University of Dublin, Dublin, Ireland
- Edmund C. Lalor
- Department of Electronic & Electrical Engineering, Trinity College, The University of Dublin, Dublin, Ireland
- Department of Biomedical Engineering, University of Rochester, Rochester, New York, United States of America
- Del Monte Institute of Neuroscience, University of Rochester Medical Center, Rochester, New York, United States of America
7. Liu Y, Lian W, Zhao X, Tang Q, Liu G. Spatial Connectivity and Temporal Dynamic Functional Network Connectivity of Musical Emotions Evoked by Dynamically Changing Tempo. Front Neurosci 2021; 15:700154. [PMID: 34421523] [PMCID: PMC8375772] [DOI: 10.3389/fnins.2021.700154]
Abstract
Music tempo is closely connected to listeners' musical emotion and to neural activity across multiple functional networks. Music with increasing tempo evokes stronger emotional responses, whereas music with decreasing tempo enhances relaxation. However, the neural substrate of emotion evoked by dynamically changing tempo is still unclear. To investigate the spatial connectivity and temporal dynamic functional network connectivity (dFNC) of musical emotion evoked by dynamically changing tempo, we collected dynamic emotion ratings and applied group independent component analysis (ICA), sliding time window correlations, and k-means clustering to assess the FNC of emotion evoked by music with decreasing tempo (180-65 bpm) and increasing tempo (60-180 bpm). Music with decreasing tempo (which yielded more stable dynamic valence ratings) evoked higher valence than music with increasing tempo, together with stronger independent components (ICs) in the default mode network (DMN) and sensorimotor network (SMN). The dFNC analysis showed that, against a background of time-decreasing FNC across the whole brain, emotion evoked by decreasing-tempo music was associated with strong spatial connectivity within the DMN and SMN, as well as strong FNC between the DMN and the frontoparietal network (FPN) and between the DMN and the cingulo-opercular network (CON). A paired t-test showed that music with a decreasing tempo evoked stronger activation of ICs within the DMN and SMN than music with an increasing tempo, indicating that music that begins fast can still enhance listeners' emotions and engage multiple functional networks even as its tempo slows. Music with an increasing tempo, against a background of time-increasing whole-brain FNC, was associated with strong connectivity within the FPN, whereas time-decreasing connectivity within the CON, SMN, and visual (VIS) networks and between the CON and SMN may explain its less stable valence during dynamic rating. Overall, dFNC analysis can help uncover the spatial and temporal neural substrates of musical emotions evoked by dynamically changing tempi.
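The sliding-time-window correlation and k-means clustering steps named above follow a standard dynamic FNC recipe: correlate independent-component time courses within a moving window, then cluster the resulting connectivity matrices into recurring "states". The sketch below illustrates that generic recipe rather than the exact parameters of this study; the window length, step, number of states, and random stand-in time courses are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def sliding_window_fnc(tc, win_len=30, step=1):
    """Sliding-window functional network connectivity.

    tc      -- (timepoints, components) IC time courses from group ICA
    win_len -- window length in TRs (an assumed value)
    returns -- (n_windows, n_pairs) matrix of vectorized upper-triangle correlations
    """
    t, c = tc.shape
    iu = np.triu_indices(c, k=1)
    fnc = []
    for start in range(0, t - win_len + 1, step):
        r = np.corrcoef(tc[start:start + win_len].T)
        fnc.append(r[iu])
    return np.array(fnc)

def dfnc_states(tc, n_states=4, **kw):
    """Cluster windowed FNC matrices into recurring connectivity 'states' with k-means."""
    fnc = sliding_window_fnc(tc, **kw)
    labels = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(fnc)
    return fnc, labels

# Usage sketch with a random stand-in for real IC time courses (200 TRs, 20 components).
tc = np.random.randn(200, 20)
fnc, labels = dfnc_states(tc, n_states=4, win_len=30, step=1)
print(fnc.shape, np.bincount(labels))   # windows x component-pairs, state occupancy
```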
Affiliation(s)
- Ying Liu
- School of Mathematics and Statistics, Southwest University, Chongqing, China
- School of Music, Southwest University, Chongqing, China
- Weili Lian
- College of Preschool Education, Chongqing Youth Vocational and Technical College, Chongqing, China
- Xingcong Zhao
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Qingting Tang
- Faculty of Psychology, Southwest University, Chongqing, China
- Guangyuan Liu
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
8. Liu Y, Zhao X, Tang Q, Li W, Liu G. Dynamic functional network connectivity associated with musical emotions evoked by different tempi. Brain Connect 2021; 12:584-597. [PMID: 34309409] [DOI: 10.1089/brain.2021.0069]
Abstract
Background: Music tempo is easy to manipulate in clinical settings and has positive emotional effects in music therapy; it can directly evoke multiple emotions and dynamic neural changes across the whole brain. However, the precise relationship between music tempo and its emotional effects remains unclear. The present study aimed to investigate the dynamic functional network connectivity (dFNC) associated with emotions elicited by music at different tempi. Methods: We obtained emotion ratings of fast- (155-170 bpm), middle- (90 bpm), and slow-tempo (50-60 bpm) piano music from 40 participants both during and after functional magnetic resonance imaging (fMRI). Group independent component analysis (ICA), sliding time window correlations, and k-means clustering were used to assess the dFNC of the fMRI data, and paired t-tests were used to compare differences between neural networks. Results: (1) Fast music was associated with higher ratings of emotional valence and arousal, accompanied by increasing dFNC between the somatomotor (SM) and cingulo-opercular (CO) networks and decreasing dFNC between the fronto-parietal and SM networks. (2) Despite stronger activation in the auditory (AUD) network, slow music was associated with weaker emotion than fast music, with decreasing FNC across the brain and the participation of the default mode (DM) network. (3) Middle-tempo music elicited moderate emotional activation with the most stable dFNC across the whole brain. Conclusion: Faster music increases neural activity in the SM and CO regions, increasing the intensity of the emotional experience. In contrast, slower music was associated with decreasing engagement of the AUD network and stable engagement of the DM network, resulting in a weak emotional experience. These findings suggest that the time-varying aspects of functional connectivity can help uncover the dynamic neural substrates of tempo-evoked emotion while listening to music.
Affiliation(s)
- Ying Liu
- School of Mathematics and Statistics, Southwest University, Chongqing, China
- School of Music, Southwest University, Chongqing, China
- Xingcong Zhao
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Qingting Tang
- Faculty of Psychology, Southwest University, Chongqing, China
- Wenhui Li
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
- Guangyuan Liu
- School of Electronic and Information Engineering, Southwest University, Chongqing, China
9. Zuk NJ, Teoh ES, Lalor EC. EEG-based classification of natural sounds reveals specialized responses to speech and music. Neuroimage 2020; 210:116558. [DOI: 10.1016/j.neuroimage.2020.116558]
10. Rajendran VG, Harper NS, Schnupp JWH. Auditory cortical representation of music favours the perceived beat. R Soc Open Sci 2020; 7:191194. [PMID: 32269783] [PMCID: PMC7137933] [DOI: 10.1098/rsos.191194]
Abstract
Previous research has shown that musical beat perception is a surprisingly complex phenomenon involving widespread neural coordination across higher-order sensory, motor and cognitive areas. However, the question of how low-level auditory processing must necessarily shape these dynamics, and therefore perception, is not well understood. Here, we present evidence that the auditory cortical representation of music, even in the absence of motor or top-down activations, already favours the beat that will be perceived. Extracellular firing rates in the rat auditory cortex were recorded in response to 20 musical excerpts diverse in tempo and genre, for which musical beat perception had been characterized by the tapping behaviour of 40 human listeners. We found that firing rates in the rat auditory cortex were on average higher on the beat than off the beat. This 'neural emphasis' distinguished the beat that was perceived from other possible interpretations of the beat, was predictive of the degree of tapping consensus across human listeners, and was accounted for by a spectrotemporal receptive field model. These findings strongly suggest that the 'bottom-up' processing of music performed by the auditory system predisposes the timing and clarity of the perceived musical beat.
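The core comparison reported here, firing rates on versus off the perceived beat, reduces to counting spikes in windows centred on the beat times and on the inter-beat midpoints. The sketch below shows that comparison in its simplest form; the window width, the synthetic spike train, and the function name beat_rate_contrast are assumptions for illustration, not the study's analysis code.

```python
import numpy as np

def beat_rate_contrast(spike_times, beat_times, half_win=0.05):
    """Mean firing rate (Hz) in windows centred on the beats vs midway between beats.
    half_win is the half-width of the counting window in seconds (an assumed value)."""
    spike_times = np.sort(np.asarray(spike_times))
    beat_times = np.asarray(beat_times)
    off_times = beat_times[:-1] + np.diff(beat_times) / 2   # inter-beat midpoints

    def mean_rate(centres):
        counts = [np.searchsorted(spike_times, c + half_win)
                  - np.searchsorted(spike_times, c - half_win) for c in centres]
        return float(np.mean(counts)) / (2 * half_win)

    return mean_rate(beat_times), mean_rate(off_times)

# Usage sketch with synthetic data: a 120 BPM beat and spikes weakly locked to it.
rng = np.random.default_rng(0)
beats = np.arange(0, 60, 0.5)                                # 120 BPM for 60 s
spikes = np.concatenate([rng.uniform(0, 60, 3000),           # background firing
                         rng.normal(beats, 0.02)])           # beat-locked spikes
on_rate, off_rate = beat_rate_contrast(spikes, beats)
print(f"on-beat: {on_rate:.1f} Hz, off-beat: {off_rate:.1f} Hz")
```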
Affiliation(s)
- Vani G. Rajendran
- Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Department of Biomedical Sciences, City University of Hong Kong, Kowloon Tong, Hong Kong
- Nicol S. Harper
- Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Jan W. H. Schnupp
- Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Department of Biomedical Sciences, City University of Hong Kong, Kowloon Tong, Hong Kong