1
Morucci P, Nara S, Lizarazu M, Martin C, Molinaro N. Language experience shapes predictive coding of rhythmic sound sequences. eLife 2024; 12:RP91636. [PMID: 39268817] [PMCID: PMC11398862] [DOI: 10.7554/elife.91636]
Abstract
Perceptual systems heavily rely on prior knowledge and predictions to make sense of the environment. Predictions can originate from multiple sources of information, including contextual short-term priors, based on isolated temporal situations, and context-independent long-term priors, arising from extended exposure to statistical regularities. While the effects of short-term predictions on auditory perception have been well-documented, how long-term predictions shape early auditory processing is poorly understood. To address this, we recorded magnetoencephalography data from native speakers of two languages with different word orders (Spanish: functor-initial vs Basque: functor-final) listening to simple sequences of binary sounds alternating in duration with occasional omissions. We hypothesized that, together with contextual transition probabilities, the auditory system uses the characteristic prosodic cues (duration) associated with the native language's word order as an internal model to generate long-term predictions about incoming non-linguistic sounds. Consistent with our hypothesis, we found that the amplitude of the mismatch negativity elicited by sound omissions varied orthogonally depending on the speaker's linguistic background and was most pronounced in the left auditory cortex. Importantly, listening to binary sounds alternating in pitch instead of duration did not yield group differences, confirming that the above results were driven by the hypothesized long-term 'duration' prior. These findings show that experience with a given language can shape a fundamental aspect of human perception - the neural processing of rhythmic sounds - and provide direct evidence for a long-term predictive coding system in the auditory cortex that uses auditory schemes learned over a lifetime to process incoming sound sequences.
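The omission paradigm described in this abstract can be sketched in code. This is a minimal illustration, not the study's stimulus script: the tone durations and omission probability below are invented values.

```python
import random

def make_duration_sequence(n_tones=200, p_omit=0.1, seed=0):
    """Binary sequence of tones strictly alternating short/long in duration,
    with occasional omissions at which an omission MMN could be measured.
    Durations (ms) and p_omit are illustrative, not the study's values."""
    rng = random.Random(seed)
    durations = [80, 160]  # hypothetical short/long durations
    sequence = []
    for i in range(n_tones):
        sequence.append({
            "pos": i,
            "duration_ms": durations[i % 2],   # strict alternation
            "omitted": rng.random() < p_omit,  # occasional omission
        })
    return sequence

seq = make_duration_sequence()
omission_positions = [t["pos"] for t in seq if t["omitted"]]
```

A listener who has internalized the alternation (and, per the hypothesis, a language-specific duration prior) can predict each tone's duration, so an omission violates a concrete expectation.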
Affiliation(s)
- Piermatteo Morucci
- Department of Fundamental Neurosciences, University of Geneva, Geneva, Switzerland
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastian, Spain
- Sanjeev Nara
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastian, Spain
- Mathematical Institute, Department of Mathematics and Computer Science, Physics, Geography, Justus-Liebig-Universität Gießen, Gießen, Germany
- Mikel Lizarazu
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastian, Spain
- Clara Martin
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastian, Spain
- Ikerbasque, Basque Foundation for Science, Bilbao, Spain
- Nicola Molinaro
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastian, Spain
- Ikerbasque, Basque Foundation for Science, Bilbao, Spain
2
Bonnet P, Bonnefond M, Kösem A. What is a Rhythm for the Brain? The Impact of Contextual Temporal Variability on Auditory Perception. J Cogn 2024; 7:15. [PMID: 38250558] [PMCID: PMC10798173] [DOI: 10.5334/joc.344]
Abstract
Temporal predictions can be formed and impact perception when sensory timing is fully predictable: for instance, the discrimination of a target sound is enhanced if it is presented on the beat of an isochronous rhythm. However, natural sensory stimuli, like speech or music, are not entirely predictable, but still possess statistical temporal regularities. We investigated whether temporal expectations can be formed in non-fully predictable contexts, and how the temporal variability of sensory contexts affects auditory perception. Specifically, we asked how "rhythmic" an auditory stimulation needs to be in order to observe temporal prediction effects on auditory discrimination performance. In this behavioral auditory oddball experiment, participants listened to auditory sound sequences where the temporal interval between each sound was drawn from Gaussian distributions with distinct standard deviations. Participants were asked to discriminate sounds with a deviant pitch in the sequences. Auditory discrimination performance, as measured with deviant sound discrimination accuracy and response times, progressively declined as the temporal variability of the sound sequence increased. Moreover, both global and local temporal statistics impacted auditory perception, suggesting that temporal statistics are promptly integrated to optimize perception. Altogether, these results suggest that temporal predictions can be set up quickly based on the temporal statistics of past sensory events and are robust to a certain amount of temporal variability. Therefore, temporal predictions can be built on sensory stimulations that are not purely periodic nor temporally deterministic.
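The Gaussian-jittered sequences can be illustrated as follows. The mean inter-onset interval and the standard deviations are placeholder values, not the experiment's parameters:

```python
import random

def jittered_onsets(n=50, mean_ioi=0.6, sd=0.05, seed=1):
    """Sound onset times (s) whose inter-onset intervals are drawn from a
    Gaussian with the given standard deviation; sd=0 yields an isochronous
    (fully predictable) sequence. All parameter values are illustrative."""
    rng = random.Random(seed)
    onsets, t = [], 0.0
    for _ in range(n):
        onsets.append(t)
        t += max(0.05, rng.gauss(mean_ioi, sd))  # floor keeps intervals positive
    return onsets

isochronous = jittered_onsets(sd=0.0)  # zero temporal variability
variable = jittered_onsets(sd=0.1)     # high temporal variability
```

Varying `sd` across blocks is one way to operationalize the paper's question of "how rhythmic" a sequence must be for prediction effects to survive.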
Affiliation(s)
- Pierre Bonnet
- Lyon Neuroscience Research Center (CRNL), Computation, Cognition and Neurophysiology team (Cophy), Inserm U1028, Université Claude Bernard Lyon 1, CNRS UMR 5292, 69000 Lyon, France
- Mathilde Bonnefond
- Lyon Neuroscience Research Center (CRNL), Computation, Cognition and Neurophysiology team (Cophy), Inserm U1028, Université Claude Bernard Lyon 1, CNRS UMR 5292, 69000 Lyon, France
- Anne Kösem
- Lyon Neuroscience Research Center (CRNL), Computation, Cognition and Neurophysiology team (Cophy), Inserm U1028, Université Claude Bernard Lyon 1, CNRS UMR 5292, 69000 Lyon, France
3
Bouwer FL, Háden GP, Honing H. Probing Beat Perception with Event-Related Potentials (ERPs) in Human Adults, Newborns, and Nonhuman Primates. Adv Exp Med Biol 2024; 1455:227-256. [PMID: 38918355] [DOI: 10.1007/978-3-031-60183-5_13]
Abstract
The aim of this chapter is to give an overview of how the perception of rhythmic temporal regularity such as a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). First, we discuss different aspects of temporal structure in general, and musical rhythm in particular, and we discuss the possible mechanisms underlying the perception of regularity (e.g., a beat) in rhythm. Additionally, we highlight the importance of dissociating beat perception from the perception of other types of structure in rhythm, such as predictable sequences of temporal intervals, ordinal structure, and rhythmic grouping. In the second section of the chapter, we start with a discussion of auditory ERPs elicited by infrequent and frequent sounds: ERP responses to regularity violations, such as mismatch negativity (MMN), N2b, and P3, as well as early sensory responses to sounds, such as P1 and N1, have been shown to be instrumental in probing beat perception. Subsequently, we discuss how beat perception can be probed by comparing ERP responses to sounds in regular and irregular sequences, and by comparing ERP responses to sounds in different metrical positions in a rhythm, such as on and off the beat or on strong and weak beats. Finally, we discuss previous research that has used the aforementioned ERPs and paradigms to study beat perception in human adults, human newborns, and nonhuman primates. In doing so, we consider the possible pitfalls and prospects of the technique, as well as future perspectives.
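The core ERP comparison the chapter describes can be sketched as a difference wave: average epochs time-locked to two event types, then subtract sample by sample. The toy "epochs" below are synthetic numbers, not real EEG:

```python
def average_epochs(epochs):
    """Pointwise mean across equally long epochs (lists of samples)."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def difference_wave(deviant_epochs, standard_epochs):
    """ERP difference wave (e.g., an MMN estimate): deviant average
    minus standard average, sample by sample."""
    dev = average_epochs(deviant_epochs)
    std = average_epochs(standard_epochs)
    return [d - s for d, s in zip(dev, std)]

# Synthetic example: deviants carry an extra negativity at sample 2.
standards = [[0.0, 1.0, 0.5, 0.0], [0.0, 1.0, 0.5, 0.0]]
deviants = [[0.0, 1.0, -0.5, 0.0], [0.0, 1.0, -0.7, 0.0]]
mmn = difference_wave(deviants, standards)
```

The same subtraction logic applies whether the contrast is deviant vs standard, regular vs irregular sequence, or on-beat vs off-beat position.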
Affiliation(s)
- Fleur L Bouwer
- Cognitive Psychology Unit, Institute of Psychology, Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands
- Department of Psychology, Brain & Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Gábor P Háden
- Institute of Cognitive Neuroscience and Psychology, Budapest, Hungary
- Department of Telecommunications and Media Informatics, Faculty of Electrical Engineering and Informatics, Budapest University of Technology and Economics, Budapest, Hungary
- Henkjan Honing
- Music Cognition group (MCG), Institute for Logic, Language and Computation (ILLC), Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands
4
Benítez-Burraco A, Nikolsky A. The (Co)Evolution of Language and Music Under Human Self-Domestication. Hum Nat 2023; 34:229-275. [PMID: 37097428] [PMCID: PMC10354115] [DOI: 10.1007/s12110-023-09447-1]
Abstract
Together with language, music is perhaps the most distinctive behavioral trait of the human species. Different hypotheses have been proposed to explain why only humans perform music and how this ability might have evolved in our species. In this paper, we advance a new model of music evolution that builds on the self-domestication view of human evolution, according to which the human phenotype is, at least in part, the outcome of a process similar to domestication in other mammals, triggered by the reduction in reactive aggression responses to environmental changes. We specifically argue that self-domestication can account for some of the cognitive changes, and particularly for the behaviors conducive to the complexification of music through a cultural mechanism. We hypothesize four stages in the evolution of music under self-domestication forces: (1) collective protomusic; (2) private, timbre-oriented music; (3) small-group, pitch-oriented music; and (4) collective, tonally organized music. This line of development encompasses the worldwide diversity of music types and genres and parallels what has been hypothesized for languages. Overall, music diversity might have emerged in a gradual fashion under the effects of the enhanced cultural niche construction as shaped by the progressive decrease in reactive (i.e., impulsive, triggered by fear or anger) aggression and the increase in proactive (i.e., premeditated, goal-directed) aggression.
Affiliation(s)
- Antonio Benítez-Burraco
- Department of Spanish Language, Linguistics and Literary Theory (Linguistics), Faculty of Philology, University of Seville, C/ Palos de la Frontera s/n, 41007 Seville, Spain
5
MacIntyre AD, Lo HYJ, Cross I, Scott S. Task-irrelevant auditory metre shapes visuomotor sequential learning. Psychol Res 2023; 87:872-893. [PMID: 35690927] [PMCID: PMC10017598] [DOI: 10.1007/s00426-022-01690-y]
Abstract
The ability to learn and reproduce sequences is fundamental to everyday life, and deficits in sequential learning are associated with developmental disorders such as specific language impairment. Individual differences in sequential learning are usually investigated using the serial reaction time task (SRTT), wherein a participant responds to a series of regularly timed, seemingly random visual cues that in fact follow a repeating deterministic structure. Although manipulating inter-cue interval timing has been shown to adversely affect sequential learning, the role of metre (the patterning of salience across time) remains unexplored within the regularly timed, visual SRTT. The current experiment consists of an SRTT adapted to include task-irrelevant auditory rhythms conferring a sense of metre. We predicted that (1) participants' (n = 41) reaction times would reflect the auditory metric structure; (2) that disrupting the correspondence between the learned visual sequence and auditory metre would impede performance; and (3) that individual differences in sensitivity to rhythm would predict the magnitude of these effects. Altering the relationship via a phase shift between the trained visual sequence and auditory metre slowed reaction times. Sensitivity to rhythm was predictive of reaction times overall. In an exploratory analysis, we, moreover, found that approximately half of participants made systematically different responses to visual cues on the basis of the cues' position within the auditory metre. We demonstrate the influence of auditory temporal structures on visuomotor sequential learning in a widely used task where metre and timing are rarely considered. The current results indicate sensitivity to metre as a possible latent factor underpinning individual differences in SRTT performance.
Affiliation(s)
- Alexis Deighton MacIntyre
- Institute of Cognitive Neuroscience, University College London, London, UK
- Centre for Music and Science, University of Cambridge, Cambridge, UK
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, UK
- Ian Cross
- Centre for Music and Science, University of Cambridge, Cambridge, UK
- Sophie Scott
- Institute of Cognitive Neuroscience, University College London, London, UK
6
Kasdan A, Gordon RL, Lense MD. Neurophysiological Correlates of Dynamic Beat Tracking in Individuals With Williams Syndrome. Biol Psychiatry Cogn Neurosci Neuroimaging 2022; 7:1183-1191. [PMID: 33419711] [PMCID: PMC8060366] [DOI: 10.1016/j.bpsc.2020.10.003]
Abstract
BACKGROUND: Williams syndrome (WS) is a neurodevelopmental disorder characterized by hypersociability, heightened auditory sensitivities, attention deficits, and strong musical interests despite differences in musical skills. Behavioral studies have reported that individuals with WS exhibit variable beat and rhythm perception skills.
METHODS: We sought to investigate the neural basis of beat tracking in individuals with WS using electroencephalography. Twenty-seven adults with WS and 16 age-matched, typically developing control subjects passively listened to musical rhythms with accents on either the first or second tone of the repeating pattern, leading to distinct beat percepts.
RESULTS: Consistent with the role of beta and gamma oscillations in rhythm processing, individuals with WS and typically developing control subjects showed strong evoked neural activity in both the beta (13-30 Hz) and gamma (31-55 Hz) frequency bands in response to beat onsets. This neural response was somewhat more distributed across the scalp for individuals with WS. Compared with typically developing control subjects, individuals with WS exhibited significantly greater amplitude of auditory evoked potentials (P1-N1-P2 complex) and modulations in evoked alpha (8-12 Hz) activity, reflective of sensory and attentional processes. Individuals with WS also exhibited markedly stable neural responses over the course of the experiment, and these responses were significantly more stable than those of control subjects.
CONCLUSIONS: These results provide neurophysiological evidence for dynamic beat tracking in WS and coincide with the atypical auditory phenotype and attentional difficulties seen in this population.
Affiliation(s)
- Anna Kasdan
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee; Curb Center for Art, Enterprise, and Public Policy, Nashville, Tennessee
- Reyna L Gordon
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Otolaryngology-Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Curb Center for Art, Enterprise, and Public Policy, Nashville, Tennessee
- Miriam D Lense
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Otolaryngology-Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Curb Center for Art, Enterprise, and Public Policy, Nashville, Tennessee
7
Bouwer FL, Nityananda V, Rouse AA, ten Cate C. Rhythmic abilities in humans and non-human animals: a review and recommendations from a methodological perspective. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200335. [PMID: 34420380] [PMCID: PMC8380979] [DOI: 10.1098/rstb.2020.0335]
Abstract
Rhythmic behaviour is ubiquitous in both human and non-human animals, but it is unclear whether the cognitive mechanisms underlying the specific rhythmic behaviours observed in different species are related. Laboratory experiments combined with highly controlled stimuli and tasks can be very effective in probing the cognitive architecture underlying rhythmic abilities. Rhythmic abilities have been examined in the laboratory with explicit and implicit perception tasks, and with production tasks, such as sensorimotor synchronization, with stimuli ranging from isochronous sequences of artificial sounds to human music. Here, we provide an overview of experimental findings on rhythmic abilities in human and non-human animals, while critically considering the wide variety of paradigms used. We identify several gaps in what is known about rhythmic abilities. Many bird species have been tested on rhythm perception, but research on rhythm production abilities in the same birds is lacking. By contrast, research in mammals has primarily focused on rhythm production rather than perception. Many experiments also do not differentiate between possible components of rhythmic abilities, such as processing of single temporal intervals, rhythmic patterns, a regular beat or hierarchical metrical structures. For future research, we suggest a careful choice of paradigm to aid cross-species comparisons, and a critical consideration of the multifaceted abilities that underlie rhythmic behaviour. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Fleur L. Bouwer
- Department of Experimental and Applied Psychology, Vrije Universiteit Amsterdam, Van der Boechorststraat 7, 1081 BT Amsterdam, The Netherlands
- Institute for Logic, Language and Computation (ILLC), University of Amsterdam, PO Box 94242, 1090 CE Amsterdam, The Netherlands
- Department of Psychology, University of Amsterdam, PO Box 15900, 1001 NK Amsterdam, The Netherlands
- Vivek Nityananda
- Biosciences Institute, Faculty of Medical Sciences, Newcastle University, Henry Wellcome Building, Framlington Place, Newcastle upon Tyne NE2 4HH, UK
- Andrew A. Rouse
- Department of Psychology, Tufts University, Medford, MA 02155, USA
- Carel ten Cate
- Institute of Biology Leiden (IBL), Leiden Institute for Brain and Cognition (LIBC), Leiden University, PO Box 9505, 2300 RA Leiden, The Netherlands
8
Kondoh S, Okanoya K, Tachibana RO. Switching perception of musical meters by listening to different acoustic cues of biphasic sound stimulus. PLoS One 2021; 16:e0256712. [PMID: 34460855] [PMCID: PMC8405023] [DOI: 10.1371/journal.pone.0256712]
Abstract
Meter is one of the core features of music perception. It is the cognitive grouping of regular sound sequences, typically for every 2, 3, or 4 beats. Previous studies have suggested that one can not only passively perceive the meter from acoustic cues such as loudness, pitch, and duration of sound elements, but also actively perceive it by paying attention to isochronous sound events without any acoustic cues. Studying the interaction of top-down and bottom-up processing in meter perception leads to understanding the cognitive system’s ability to perceive the entire structure of music. The present study aimed to demonstrate that meter perception requires the top-down process (which maintains and switches attention between cues) as well as the bottom-up process for discriminating acoustic cues. We created a “biphasic” sound stimulus, which consists of successive tone sequences designed to provide cues for both the triple and quadruple meters in two different sound attributes: frequency and duration. Participants were asked to focus on either frequency or duration of the stimulus, and to answer how they perceived meters on a five-point scale (ranging from “strongly triple” to “strongly quadruple”). As a result, we found that participants perceived different meters by switching their attention to specific cues. This result adds evidence to the idea that meter perception involves the interaction between top-down and bottom-up processes.
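The biphasic-stimulus idea can be sketched as tones carrying two independent accent cycles. The frequencies and durations below are invented, not the paper's stimulus values:

```python
def biphasic_sequence(n_tones=12, base_freq=440.0, accent_freq=660.0,
                      base_dur=0.2, accent_dur=0.4):
    """Every 3rd tone carries a frequency accent (a triple-meter cue) and
    every 4th tone a duration accent (a quadruple-meter cue), so the felt
    meter depends on which attribute the listener attends to.
    All parameter values are illustrative."""
    tones = []
    for i in range(n_tones):
        tones.append((
            accent_freq if i % 3 == 0 else base_freq,  # pitch cue: period 3
            accent_dur if i % 4 == 0 else base_dur,    # duration cue: period 4
        ))
    return tones

tones = biphasic_sequence()
```

Attending to the frequency cue groups the sequence in threes; attending to the duration cue groups the same tones in fours.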
Affiliation(s)
- Sotaro Kondoh
- Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
- Kazuo Okanoya
- Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
- Center for Evolutionary Cognitive Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
- RIKEN Center for Brain Science, Saitama, Japan
- Ryosuke O. Tachibana
- Center for Evolutionary Cognitive Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
9
Nijhuis P, Keller PE, Nozaradan S, Varlet M. Dynamic modulation of cortico-muscular coupling during real and imagined sensorimotor synchronisation. Neuroimage 2021; 238:118209. [PMID: 34051354] [DOI: 10.1016/j.neuroimage.2021.118209]
Abstract
People have a natural and intrinsic ability to coordinate body movements with rhythms surrounding them, known as sensorimotor synchronisation. This can be observed in daily environments, when dancing or singing along with music, or spontaneously walking, talking or applauding in synchrony with one another. However, the neurophysiological mechanisms underlying accurately synchronised movement with selected rhythms in the environment remain unclear. Here we studied real and imagined sensorimotor synchronisation with interleaved auditory and visual rhythms using cortico-muscular coherence (CMC) to better understand the processes underlying the preparation and execution of synchronised movement. Electroencephalography (EEG), electromyography (EMG) from the finger flexors, and continuous force signals were recorded in 20 participants during tapping and imagined tapping with discrete stimulus sequences consisting of alternating auditory beeps and visual flashes. The results show that the synchronisation between cortical and muscular activity in the beta (14-38 Hz) frequency band becomes time-locked to the taps executed in synchrony with the visual and auditory stimuli. Dynamic modulation in CMC also occurred when participants imagined tapping with the visual stimuli, but with lower amplitude and a different temporal profile compared to real tapping. These results suggest that CMC does not only reflect changes related to the production of the synchronised movement, but also to its preparation, which appears heightened under higher attentional demands imposed when synchronising with the visual stimuli. These findings highlight a critical role of beta band neural oscillations in the cortical-muscular coupling underlying sensorimotor synchronisation.
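Cortico-muscular coherence quantifies frequency-specific coupling between EEG and EMG. The sketch below estimates magnitude-squared coherence across epochs on synthetic signals sharing a 20 Hz (beta-band) component; it illustrates the measure only, not the paper's analysis pipeline, and all signal parameters are invented.

```python
import cmath, math, random

def dft_bin(x, k):
    """Single DFT coefficient of signal x at frequency bin k."""
    n = len(x)
    return sum(v * cmath.exp(-2j * math.pi * k * t / n) for t, v in enumerate(x))

def msc(epochs_x, epochs_y, k):
    """Magnitude-squared coherence between two signals at bin k,
    estimated by averaging cross- and auto-spectra across epochs."""
    cross = 0 + 0j
    px = py = 0.0
    for x, y in zip(epochs_x, epochs_y):
        fx, fy = dft_bin(x, k), dft_bin(y, k)
        cross += fx * fy.conjugate()
        px += abs(fx) ** 2
        py += abs(fy) ** 2
    return abs(cross) ** 2 / (px * py)

# Toy EEG/EMG: both contain a shared 20 Hz component plus independent noise.
# Sampling rate 200 Hz, 1 s epochs, so DFT bin 20 corresponds to 20 Hz.
rng = random.Random(0)
fs, n, n_epochs = 200, 200, 30
eeg, emg = [], []
for _ in range(n_epochs):
    phase = rng.uniform(0, 2 * math.pi)  # phase common to both signals
    common = [math.cos(2 * math.pi * 20 * t / fs + phase) for t in range(n)]
    eeg.append([c + rng.gauss(0, 0.3) for c in common])
    emg.append([c + rng.gauss(0, 0.3) for c in common])

beta_cmc = msc(eeg, emg, 20)     # high: shared beta-band component
control_cmc = msc(eeg, emg, 35)  # low: only independent noise at this bin
```

In the study, time-resolved changes in such beta-band coupling around taps (real or imagined) are the quantity of interest.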
Affiliation(s)
- Patti Nijhuis
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IoNS), Université catholique de Louvain (UCL), Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Sydney, Australia; School of Psychology, Western Sydney University, Sydney, Australia
10
Nikolsky A. The Pastoral Origin of Semiotically Functional Tonal Organization of Music. Front Psychol 2020; 11:1358. [PMID: 32848961] [PMCID: PMC7396614] [DOI: 10.3389/fpsyg.2020.01358]
Abstract
This paper presents a new line of inquiry into when and how music as a semiotic system was born. Eleven principal expressive aspects of music each contain specific structural patterns whose configuration signifies a certain affective state. This distinguishes the tonal organization of music from the phonetic and prosodic organization of natural languages and animal communication. The question of music’s origin can therefore be answered by establishing the point in human history at which all eleven expressive aspects might have been abstracted from the instinct-driven primate calls and used to express human psycho-emotional states. Etic analysis of acoustic parameters is the prime means of cross-examination of the typical patterns of expression of the basic emotions in human music versus animal vocal communication. A new method of such analysis is proposed here. Formation of such expressive aspects as meter, tempo, melodic intervals, and articulation can be explained by the influence of bipedal locomotion, breathing cycle, and heartbeat, long before Homo sapiens. However, two aspects, rhythm and melodic contour, most crucial for music as we know it, lack proxies in the Paleolithic lifestyle. The available ethnographic and developmental data leads one to believe that rhythmic and directional patterns of melody became involved in conveying emotion-related information in the process of frequent switching from one call-type to another within the limited repertory of calls. Such calls are usually adopted for the ongoing caretaking of human youngsters and domestic animals. The efficacy of rhythm and pitch contour in affective communication must have been spontaneously discovered in new important cultural activities.
The most likely scenario for music to have become fully semiotically functional and to have spread wide enough to avoid extinctions is the formation of cross-specific communication between humans and domesticated animals during the Neolithic demographic explosion and the subsequent cultural revolution. Changes in distance during such communication must have promoted the integration between different expressive aspects and generated the basic musical grammar. The model of such communication can be found in the surviving tradition of Scandinavian pastoral music - kulning. This article discusses the most likely ways in which such music evolved.
11
Bouwer FL, Honing H, Slagter HA. Beat-based and Memory-based Temporal Expectations in Rhythm: Similar Perceptual Effects, Different Underlying Mechanisms. J Cogn Neurosci 2020; 32:1221-1241. [PMID: 31933432] [DOI: 10.1162/jocn_a_01529]
Abstract
Predicting the timing of incoming information allows the brain to optimize information processing in dynamic environments. Behaviorally, temporal expectations have been shown to facilitate processing of events at expected time points, such as sounds that coincide with the beat in musical rhythm. Yet, temporal expectations can develop based on different forms of structure in the environment, not just the regularity afforded by a musical beat. Little is still known about how different types of temporal expectations are neurally implemented and affect performance. Here, we orthogonally manipulated the periodicity and predictability of rhythmic sequences to examine the mechanisms underlying beat-based and memory-based temporal expectations, respectively. Behaviorally and using EEG, we looked at the effects of beat-based and memory-based expectations on auditory processing when rhythms were task-relevant or task-irrelevant. At expected time points, both beat-based and memory-based expectations facilitated target detection and led to attenuation of P1 and N1 responses, even when expectations were task-irrelevant (unattended). For beat-based expectations, we additionally found reduced target detection and enhanced N1 responses for events at unexpected time points (e.g., off-beat), regardless of the presence of memory-based expectations or task relevance. This latter finding supports the notion that periodicity selectively induces rhythmic fluctuations in neural excitability and furthermore indicates that, although beat-based and memory-based expectations may similarly affect auditory processing of expected events, their underlying neural mechanisms may be different.
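The orthogonal manipulation of periodicity and predictability can be illustrated with three kinds of interval sequences. The interval values and pattern below are made up for illustration; they are not the study's stimuli:

```python
import random

def make_ioi_sequence(kind, n=24, seed=0):
    """Illustrative inter-onset-interval sequences dissociating periodicity
    from predictability. 'periodic': isochronous (supports beat-based
    expectations); 'patterned': a repeating non-isochronous pattern
    (predictable but aperiodic, supporting memory-based expectations);
    'random': the same intervals in shuffled order (neither).
    All interval values (s) are hypothetical."""
    rng = random.Random(seed)
    if kind == "periodic":
        return [0.6] * n
    pattern = [0.4, 0.8, 0.6, 0.2, 0.4, 1.2]  # hypothetical 6-interval pattern
    iois = (pattern * (n // len(pattern) + 1))[:n]
    if kind == "patterned":
        return iois
    rng.shuffle(iois)  # same intervals, unpredictable order
    return iois

periodic = make_ioi_sequence("periodic")
patterned = make_ioi_sequence("patterned")
scrambled = make_ioi_sequence("random")
```

Because the 'patterned' and 'random' sequences contain identical intervals, any behavioral or ERP difference between them isolates predictability from low-level interval content.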
12
Abstract
In typical Western music, important pitches occur disproportionately often on important beats, referred to as the tonal-metric hierarchy (Prince & Schmuckler, 2014, Music Perception, 31, 254-270). We tested whether listeners are sensitive to this alignment of pitch and temporal structure. In Experiment 1, the stimuli were 200 artificial melodies with random pitch contours; all melodies had both a regular beat and a pitch class distribution that favored one musical key, but had either high or low agreement with the tonal-metric hierarchy. Thirty-two listeners rated the goodness of each melody, and another 41 listeners rated the melodies' metric clarity (how clear the beat was). The tonal-metric hierarchy did not affect either rating type, likely because the melodies may have only weakly (at best) established a musical key. In Experiment 2, we shuffled the pitches in 60 composed melodies (scrambling pitch contour, but not rhythm) to generate versions with high and low agreement with the tonal-metric hierarchy. Both ratings of goodness (N = 40) and metric clarity (N = 40) revealed strong evidence of the tonal-metric hierarchy influencing ratings; there was no effect of musical training. In Experiment 3, we phase-shifted, rather than shuffled, the pitches from the composed melodies, thus preserving pitch contour. Both rating types (goodness N = 43, metric clarity N = 32) replicated the results of Experiment 2. These findings establish the psychological reality of the tonal-metric hierarchy.
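The two pitch manipulations described (Experiment 2's shuffling and Experiment 3's phase-shifting) can be sketched as list operations on a melody's pitches, with rhythm held separately and untouched. The melody below is an invented example:

```python
import random

def shuffle_pitches(pitches, seed=0):
    """Experiment-2-style manipulation: scramble the order of pitches
    (destroying contour) while the rhythm, stored separately, is unchanged."""
    rng = random.Random(seed)
    out = list(pitches)
    rng.shuffle(out)
    return out

def phase_shift_pitches(pitches, shift):
    """Experiment-3-style manipulation: rotate the pitches against their
    metrical positions, preserving contour (up to wraparound) while changing
    which pitches fall on which beats."""
    shift %= len(pitches)
    return pitches[shift:] + pitches[:shift]

melody = [60, 62, 64, 65, 67, 65, 64, 62]  # illustrative MIDI pitch numbers
```

Selecting the shuffle or shift that maximizes or minimizes tonal-metric agreement then yields the high- and low-agreement versions rated by listeners.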
13
Sidiras C, Iliadou VV, Nimatoudis I, Grube M, Griffiths T, Bamiou DE. Deficits in Auditory Rhythm Perception in Children With Auditory Processing Disorder Are Unrelated to Attention. Front Neurosci 2019; 13:953. [PMID: 31551701 PMCID: PMC6743378 DOI: 10.3389/fnins.2019.00953]
Abstract
Auditory processing disorder (APD) is defined as a specific deficit in the processing of auditory information along the central auditory nervous system, including bottom-up and top-down neural connectivity. Even though music comprises a big part of audition, testing music perception in the APD population has not yet gained wide attention in research. This work tests the hypothesis that deficits in rhythm perception occur in a group of subjects with APD. The primary focus of this study is to measure perception of a simple auditory rhythm, i.e., short isochronous sequences of beats, in APD children and to compare their performance to age-matched normal controls. The secondary question is to study the relationship between cognition and auditory processing in rhythm perception. We tested 39 APD children and 25 control children aged between 6 and 12 years via (a) clinical APD tests, including a monaural speech-in-noise test, (b) an isochrony task, a test measuring the detection of small deviations from perfect isochrony in an isochronous beat sequence, and (c) two cognitive tests (auditory memory and auditory attention). APD children scored worse on the isochrony task than the age-matched control group. In the APD group, neither measure of cognition (attention nor memory) correlated with performance on the isochrony task. Left (but not right) speech-in-noise performance correlated with performance on the isochrony task. In the control group, a large correlation (r = -0.701, p = 0.001) was observed between the isochrony task and attention, but not memory. The results demonstrate a deficit in the perception of regularly timed sequences in APD that is relevant to the perception of speech in noise, a ubiquitous complaint in this condition. Our results suggest (a) the existence of a non-attention-related rhythm perception deficit in APD children and (b) differential effects of attention on task performance in normal vs. APD children. The potential benefit of music/rhythm training for rehabilitation in APD children remains to be explored.
Affiliation(s)
- Christos Sidiras: Clinical Psychoacoustics Lab, Third Department of Psychiatry, Neuroscience Sector, Medical School, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Vasiliki Vivian Iliadou: Clinical Psychoacoustics Lab, Third Department of Psychiatry, Neuroscience Sector, Medical School, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Ioannis Nimatoudis: Clinical Psychoacoustics Lab, Third Department of Psychiatry, Neuroscience Sector, Medical School, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Manon Grube: Auditory Group, Medical School, Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, United Kingdom
- Tim Griffiths: Auditory Group, Medical School, Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, United Kingdom
- Doris-Eva Bamiou: Faculty of Brain Sciences, UCL Ear Institute, University College London, London, United Kingdom; Hearing and Deafness Biomedical Research Centre, National Institute for Health Research, London, United Kingdom
14
Haumann NT, Vuust P, Bertelsen F, Garza-Villarreal EA. Influence of Musical Enculturation on Brain Responses to Metric Deviants. Front Neurosci 2018; 12:218. [PMID: 29720932 PMCID: PMC5915898 DOI: 10.3389/fnins.2018.00218]
Abstract
The ability to recognize metric accents is fundamental in both music and language perception. It has been suggested that music listeners prefer rhythms that follow simple binary meters, which are common in Western music. This means that listeners expect odd-numbered beats to be strong and even-numbered beats to be weak. In support of this, studies have shown that listeners exposed to Western music show stronger novelty and incongruity related P3 and irregularity detection related mismatch negativity (MMN) brain responses to attenuated odd- than attenuated even-numbered metric positions. Furthermore, behavioral evidence suggests that music listeners' preferences can be changed by long-term exposure to non-Western rhythms and meters, e.g., by listening to African or Balkan music. In our study, we investigated whether it might be possible to measure effects of music enculturation on neural responses to attenuated tones on specific metric positions. We compared the magnetic mismatch negativity (MMNm) to attenuated beats in a “Western group” of listeners (n = 12) mainly exposed to Western music and a “Bicultural group” of listeners (n = 13) exposed for at least 1 year to Sub-Saharan African music in addition to Western music. We found that in the “Western group” the MMNm was higher in amplitude to deviant tones on odd compared to even metric positions, but not in the “Bicultural group.” In support of this finding, there was also a trend for the “Western group” to rate omitted beats as more surprising on odd than even metric positions, whereas the “Bicultural group” seemed to discriminate less between metric positions in terms of surprise ratings. Also, we observed that the overall latency of the MMNm was significantly shorter in the Bicultural group compared to the Western group. These effects were not biased by possible differences in rhythm perception ability or music training, measured with the Musical Ear Test (MET). Furthermore, source localization analyses suggest that auditory, inferior temporal, sensory-motor, superior frontal, and parahippocampal regions might be involved in eliciting the MMNm to the metric deviants. These findings suggest that effects of music enculturation can be measured on MMNm responses to attenuated tones on specific metric positions.
Affiliation(s)
- Niels T Haumann: Department of Aesthetics and Communication (Musicology), Faculty of Arts, Aarhus University, Aarhus, Denmark; Department of Clinical Medicine, Center for Music in the Brain, Royal Academy of Music, Aarhus University, Aarhus, Denmark
- Peter Vuust: Department of Clinical Medicine, Center for Music in the Brain, Royal Academy of Music, Aarhus University, Aarhus, Denmark
- Freja Bertelsen: Center of Functionally Integrative Neuroscience, Aarhus University, Aarhus, Denmark; Department of Nuclear Medicine and PET Centre, Aarhus University Hospital, Aarhus, Denmark
- Eduardo A Garza-Villarreal: Department of Clinical Medicine, Center for Music in the Brain, Royal Academy of Music, Aarhus University, Aarhus, Denmark; Clinical Research Division, Instituto Nacional de Psiquiatría Ramón de la Fuente Muñiz (INPRFM), Mexico City, Mexico; Department of Neurology, Faculty of Medicine and University Hospital, Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
15
Bouwer FL, Burgoyne JA, Odijk D, Honing H, Grahn JA. What makes a rhythm complex? The influence of musical training and accent type on beat perception. PLoS One 2018; 13:e0190322. [PMID: 29320533 PMCID: PMC5761885 DOI: 10.1371/journal.pone.0190322]
Abstract
Perception of a regular beat in music is inferred from different types of accents. For example, increases in loudness cause intensity accents, and the grouping of time intervals in a rhythm creates temporal accents. Accents are expected to occur on the beat: when accents are "missing" on the beat, the beat is more difficult to find. However, it is unclear whether accents occurring off the beat alter beat perception similarly to missing accents on the beat. Moreover, no one has examined whether intensity accents influence beat perception more or less strongly than temporal accents, nor how musical expertise affects sensitivity to each type of accent. In two experiments, we obtained ratings of difficulty in finding the beat in rhythms with either temporal or intensity accents, and which varied in the number of accents on the beat as well as the number of accents off the beat. In both experiments, the occurrence of accents on the beat facilitated beat detection more in musical experts than in musical novices. In addition, the number of accents on the beat affected beat finding more in rhythms with temporal accents than in rhythms with intensity accents. The effect of accents off the beat was much weaker than the effect of accents on the beat and appeared to depend on musical expertise, as well as on the number of accents on the beat: when many accents on the beat are missing, beat perception is quite difficult, and adding accents off the beat may not reduce beat perception further. Overall, the different types of accents were processed qualitatively differently, depending on musical expertise. Therefore, these findings indicate the importance of designing ecologically valid stimuli when testing beat perception in musical novices, who may need different types of accent information than musical experts to be able to find a beat. Furthermore, our findings stress the importance of carefully designing rhythms for social and clinical applications of beat perception, as not all listeners treat all rhythms alike.
Affiliation(s)
- Fleur L. Bouwer: Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- J. Ashley Burgoyne: Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Daan Odijk: Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands
- Henkjan Honing: Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands; Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Jessica A. Grahn: Brain and Mind Institute, Department of Psychology, University of Western Ontario, London (ON), Canada
16
Rajendran VG, Harper NS, Garcia-Lazaro JA, Lesica NA, Schnupp JWH. Midbrain adaptation may set the stage for the perception of musical beat. Proc Biol Sci 2017; 284:20171455. [PMID: 29118141 PMCID: PMC5698641 DOI: 10.1098/rspb.2017.1455]
Abstract
The ability to spontaneously feel a beat in music is a phenomenon widely believed to be unique to humans. Though beat perception involves the coordinated engagement of sensory, motor and cognitive processes in humans, the contribution of low-level auditory processing to the activation of these networks in a beat-specific manner is poorly understood. Here, we present evidence from a rodent model that midbrain preprocessing of sounds may already be shaping where the beat is ultimately felt. For the tested set of musical rhythms, on-beat sounds on average evoked higher firing rates than off-beat sounds, and this difference was a defining feature of the set of beat interpretations most commonly perceived by human listeners over others. Basic firing rate adaptation provided a sufficient explanation for these results. Our findings suggest that midbrain adaptation, by encoding the temporal context of sounds, creates points of neural emphasis that may influence the perceptual emergence of a beat.
Affiliation(s)
- Vani G Rajendran: Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Nicol S Harper: Auditory Neuroscience Group, Department of Physiology, Anatomy, and Genetics, University of Oxford, Oxford, UK
- Nicholas A Lesica: UCL Ear Institute, 332 Grays Inn Rd, Kings Cross, London WC1X 8EE, UK
- Jan W H Schnupp: Department of Biomedical Sciences, City University of Hong Kong, 1/F, Block 1, To Yuen Building, 31 To Yuen Street, Hong Kong
17
Neural processing of musical meter in musicians and non-musicians. Neuropsychologia 2017; 106:289-297. [DOI: 10.1016/j.neuropsychologia.2017.10.007]
18
Okawa H, Suefusa K, Tanaka T. Neural Entrainment to Auditory Imagery of Rhythms. Front Hum Neurosci 2017; 11:493. [PMID: 29081742 PMCID: PMC5645537 DOI: 10.3389/fnhum.2017.00493]
Abstract
A method of reconstructing perceived or imagined music by analyzing brain activity has not yet been established. As a first step toward developing such a method, we aimed to reconstruct the imagery of rhythm, which is one element of music. It has been reported that a periodic electroencephalogram (EEG) response is elicited while a human imagines a binary or ternary meter on a musical beat. However, it is not clear whether or not brain activity synchronizes with fully imagined beat and meter without auditory stimuli. To investigate neural entrainment to imagined rhythm during auditory imagery of beat and meter, we recorded EEG while nine participants (eight males and one female) imagined three types of rhythm without auditory stimuli but with visual timing, and then we analyzed the amplitude spectra of the EEG. We also recorded EEG while the participants only gazed at the visual timing as a control condition to confirm the visual effect. Furthermore, we derived features of the EEG using canonical correlation analysis (CCA) and conducted an experiment to individually classify the three types of imagined rhythm from the EEG. The results showed that classification accuracies exceeded the chance level in all participants. These results suggest that auditory imagery of meter elicits a periodic EEG response that changes at the imagined beat and meter frequency even in the fully imagined conditions. This study represents the first step toward the realization of a method for reconstructing the imagined music from brain activity.
Affiliation(s)
- Haruki Okawa: Department of Electrical and Electronic Engineering, Tokyo University of Agriculture and Technology, Tokyo, Japan
- Kaori Suefusa: Department of Electrical and Information Engineering, Tokyo University of Agriculture and Technology, Tokyo, Japan
- Toshihisa Tanaka: Department of Electrical and Electronic Engineering, Tokyo University of Agriculture and Technology, Tokyo, Japan; RIKEN Brain Science Institute, Saitama, Japan
19
Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing. J Neurosci 2017; 36:9572-9. [PMID: 27629709 DOI: 10.1523/jneurosci.1041-16.2016]
Abstract
Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing.
SIGNIFICANCE STATEMENT: Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our results demonstrate that the memory traces underlying cortical deviance detection form a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing.
20
Sidiras C, Iliadou V, Nimatoudis I, Reichenbach T, Bamiou DE. Spoken Word Recognition Enhancement Due to Preceding Synchronized Beats Compared to Unsynchronized or Unrhythmic Beats. Front Neurosci 2017; 11:415. [PMID: 28769752 PMCID: PMC5513984 DOI: 10.3389/fnins.2017.00415]
Abstract
The relation between rhythm and language has been investigated over recent decades, with evidence that the two share overlapping perceptual mechanisms emerging from several strands of research. Dynamic Attending Theory posits that neural entrainment to musical rhythm results in synchronized oscillations in attention, enhancing perception of other events occurring at the same rate. In this study, this prediction was tested in 10-year-old children by means of a psychoacoustic speech-recognition-in-babble paradigm. It was hypothesized that rhythm effects evoked via a short isochronous sequence of beats would yield optimal word recognition in babble when beats and word are in sync. We compared speech-recognition-in-babble performance in the presence of an isochronous, in-sync sequence of beats vs. a non-isochronous or out-of-sync sequence. Results showed that (a) word recognition was best when rhythm and word were in sync, and (b) the effect was not uniform across syllables and gender of subjects. Our results suggest that pure-tone beats affect speech recognition at early levels of sensory or phonemic processing.
Affiliation(s)
- Christos Sidiras: Clinical Psychoacoustics Laboratory, Neuroscience Division, 3rd Psychiatric Department, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Vasiliki Iliadou: Clinical Psychoacoustics Laboratory, Neuroscience Division, 3rd Psychiatric Department, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Ioannis Nimatoudis: Clinical Psychoacoustics Laboratory, Neuroscience Division, 3rd Psychiatric Department, Aristotle University of Thessaloniki, Thessaloniki, Greece
- Tobias Reichenbach: Department of Bioengineering, Imperial College London, London, United Kingdom
- Doris-Eva Bamiou: Faculty of Brain Sciences, UCL Ear Institute, University College London, London, United Kingdom
21
Neural Entrainment to the Beat: The "Missing-Pulse" Phenomenon. J Neurosci 2017; 37:6331-6341. [PMID: 28559379 DOI: 10.1523/jneurosci.2500-16.2017]
Abstract
Most humans have a near-automatic inclination to tap, clap, or move to the beat of music. The capacity to extract a periodic beat from a complex musical segment is remarkable, as it requires abstraction from the temporal structure of the stimulus. It has been suggested that nonlinear interactions in neural networks result in cortical oscillations at the beat frequency, and that such entrained oscillations give rise to the percept of a beat or a pulse. Here we tested this neural resonance theory using MEG recordings as female and male individuals listened to 30 s sequences of complex syncopated drumbeats designed so that they contain no net energy at the pulse frequency when measured using linear analysis. We analyzed the spectrum of the neural activity while listening and compared it to the modulation spectrum of the stimuli. We found enhanced neural response in the auditory cortex at the pulse frequency. We also showed phase locking at the times of the missing pulse, even though the pulse was absent from the stimulus itself. Moreover, the strength of this pulse response correlated with individuals' speed in finding the pulse of these stimuli, as tested in a follow-up session. These findings demonstrate that neural activity at the pulse frequency in the auditory cortex is internally generated rather than stimulus-driven. The current results are both consistent with neural resonance theory and with models based on nonlinear response of the brain to rhythmic stimuli. The results thus help narrow the search for valid models of beat perception.
SIGNIFICANCE STATEMENT: Humans perceive music as having a regular pulse marking equally spaced points in time, within which musical notes are temporally organized. Neural resonance theory (NRT) provides a theoretical model explaining how an internal periodic representation of a pulse may emerge through nonlinear coupling between oscillating neural systems. After testing key falsifiable predictions of NRT using MEG recordings, we demonstrate the emergence of neural oscillations at the pulse frequency, which can be related to pulse perception. These findings rule out alternative explanations for neural entrainment and provide evidence linking neural synchronization to the perception of pulse, a widely debated topic in recent years.
22
Rhythmic entrainment as a musical affect induction mechanism. Neuropsychologia 2017; 96:96-110. [DOI: 10.1016/j.neuropsychologia.2017.01.004]
23
Lehmann A, Arias DJ, Schönwiesner M. Tracing the neural basis of auditory entrainment. Neuroscience 2016; 337:306-314. [PMID: 27667358 DOI: 10.1016/j.neuroscience.2016.09.011]
Abstract
Neurons in the auditory cortex synchronize their responses to temporal regularities in sound input. This coupling or "entrainment" is thought to facilitate beat extraction and rhythm perception in temporally structured sounds, such as music. As a consequence of such entrainment, the auditory cortex responds to an omitted (silent) sound in a regular sequence. Although previous studies suggest that the auditory brainstem frequency-following response (FFR) exhibits some of the beat-related effects found in the cortex, it is unknown whether omissions of sounds evoke a brainstem response. We simultaneously recorded cortical and brainstem responses to isochronous and irregular sequences of consonant-vowel syllable /da/ that contained sporadic omissions. The auditory cortex responded strongly to omissions, but we found no evidence of evoked responses to omitted stimuli from the auditory brainstem. However, auditory brainstem responses in the isochronous sound sequence were more consistent across trials than in the irregular sequence. These results indicate that the auditory brainstem faithfully encodes short-term acoustic properties of a stimulus and is sensitive to sequence regularity, but does not entrain to isochronous sequences sufficiently to generate overt omission responses, even for sequences that evoke such responses in the cortex. These findings add to our understanding of the processing of sound regularities, which is an important aspect of human cognitive abilities like rhythm, music and speech perception.
Affiliation(s)
- Alexandre Lehmann: International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Department of Psychology, University of Montreal, Montreal, QC, Canada; Department of Otolaryngology Head & Neck Surgery, McGill University, Montreal, QC, Canada
- Diana Jimena Arias: University of Quebec at Montreal, Montreal, QC, Canada; International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada
- Marc Schönwiesner: International Laboratory for Brain, Music and Sound Research (BRAMS), Montreal, QC, Canada; Center for Research on Brain, Language and Music (CRBLM), Montreal, QC, Canada; Department of Psychology, University of Montreal, Montreal, QC, Canada
24
Su YH. Sensorimotor Synchronization with Different Metrical Levels of Point-Light Dance Movements. Front Hum Neurosci 2016; 10:186. [PMID: 27199709 PMCID: PMC4846664 DOI: 10.3389/fnhum.2016.00186]
Abstract
Rhythm perception and synchronization have been extensively investigated in the auditory domain, as they underlie means of human communication such as music and speech. Although recent studies suggest comparable mechanisms for synchronizing with periodically moving visual objects, the extent to which it applies to ecologically relevant information, such as the rhythm of complex biological motion, remains unknown. The present study addressed this issue by linking rhythm of music and dance in the framework of action-perception coupling. As a previous study showed that observers perceived multiple metrical periodicities in dance movements that embodied this structure, the present study examined whether sensorimotor synchronization (SMS) to dance movements resembles what is known of auditory SMS. Participants watched a point-light figure performing two basic steps of Swing dance cyclically, in which the trunk bounced at every beat and the limbs moved at every second beat, forming two metrical periodicities. Participants tapped synchronously to the bounce of the trunk with or without the limbs moving in the stimuli (Experiment 1), or tapped synchronously to the leg movements with or without the trunk bouncing simultaneously (Experiment 2). Results showed that, while synchronization with the bounce (lower-level pulse) was not influenced by the presence or absence of limb movements (metrical accent), synchronization with the legs (beat) was improved by the presence of the bounce (metrical subdivision) across different movement types. The latter finding parallels the “subdivision benefit” often demonstrated in auditory tasks, suggesting common sensorimotor mechanisms for visual rhythms in dance and auditory rhythms in music.
Affiliation(s)
- Yi-Huang Su: Department of Movement Science, Faculty of Sport and Health Sciences, Technical University of Munich, Munich, Germany
25
Bouwer FL, Werner CM, Knetemann M, Honing H. Disentangling beat perception from sequential learning and examining the influence of attention and musical abilities on ERP responses to rhythm. Neuropsychologia 2016; 85:80-90. [PMID: 26972966 DOI: 10.1016/j.neuropsychologia.2016.02.018]
Abstract
Beat perception is the ability to perceive temporal regularity in musical rhythm. When a beat is perceived, predictions about upcoming events can be generated. These predictions can influence processing of subsequent rhythmic events. However, statistical learning of the order of sounds in a sequence can also affect processing of rhythmic events and must be differentiated from beat perception. In the current study, using EEG, we examined the effects of attention and musical abilities on beat perception. To ensure we measured beat perception and not absolute perception of temporal intervals, we used alternating loud and soft tones to create a rhythm with two hierarchical metrical levels. To control for sequential learning of the order of the different sounds, we used temporally regular (isochronous) and jittered rhythmic sequences. The order of sounds was identical in both conditions, but only the regular condition allowed for the perception of a beat. Unexpected intensity decrements were introduced on the beat and offbeat. In the regular condition, both beat perception and sequential learning were expected to enhance detection of these deviants on the beat. In the jittered condition, only sequential learning was expected to affect processing of the deviants. ERP responses to deviants were larger on the beat than offbeat in both conditions. Importantly, this difference was larger in the regular condition than in the jittered condition, suggesting that beat perception influenced responses to rhythmic events in addition to sequential learning. The influence of beat perception was present both with and without attention directed at the rhythm. Moreover, beat perception as measured with ERPs correlated with musical abilities, but only when attention was directed at the stimuli. Our study shows that beat perception is possible when attention is not directed at a rhythm. In addition, our results suggest that attention may mediate the influence of musical abilities on beat perception.
Affiliation(s)
- Fleur L Bouwer
- Institute for Logic, Language and Computation, Amsterdam Brain and Cognition (ABC), University of Amsterdam, P.O. Box 94242, 1090 GE Amsterdam, The Netherlands.
- Carola M Werner
- Institute for Logic, Language and Computation, Amsterdam Brain and Cognition (ABC), University of Amsterdam, P.O. Box 94242, 1090 GE Amsterdam, The Netherlands.
- Myrthe Knetemann
- Institute for Logic, Language and Computation, Amsterdam Brain and Cognition (ABC), University of Amsterdam, P.O. Box 94242, 1090 GE Amsterdam, The Netherlands.
- Henkjan Honing
- Institute for Logic, Language and Computation, Amsterdam Brain and Cognition (ABC), University of Amsterdam, P.O. Box 94242, 1090 GE Amsterdam, The Netherlands.
26
Beta-Band Oscillations Represent Auditory Beat and Its Metrical Hierarchy in Perception and Imagery. J Neurosci 2015; 35:15187-98. [PMID: 26558788] [DOI: 10.1523/jneurosci.2397-15.2015]
Abstract
Dancing to music involves synchronized movements, which can be at the basic beat level or higher hierarchical metrical levels, as in a march (groups of two basic beats, one-two-one-two …) or waltz (groups of three basic beats, one-two-three-one-two-three …). Our previous human magnetoencephalography studies revealed that the subjective sense of meter influences auditory evoked responses phase locked to the stimulus. Moreover, the timing of metronome clicks was represented in periodic modulation of induced (non-phase locked) β-band (13-30 Hz) oscillation in bilateral auditory and sensorimotor cortices. Here, we further examine whether acoustically accented and subjectively imagined metric processing in march and waltz contexts during listening to isochronous beats were reflected in neuromagnetic β-band activity recorded from young adult musicians. First, we replicated previous findings of beat-related β-power decrease at 200 ms after the beat followed by a predictive increase toward the onset of the next beat. Second, we showed that the β decrease was significantly influenced by the metrical structure, as reflected by differences across beat type for both perception and imagery conditions. Specifically, the β-power decrease associated with imagined downbeats (the count "one") was larger than that for both the upbeat (preceding the count "one") in the march, and for the middle beat in the waltz. Moreover, beamformer source analysis for the whole brain volume revealed that the metric contrasts involved auditory and sensorimotor cortices; frontal, parietal, and inferior temporal lobes; and cerebellum. We suggest that the observed β-band activities reflect a translation of timing information to auditory-motor coordination. SIGNIFICANCE STATEMENT With magnetoencephalography, we examined β-band oscillatory activities around 20 Hz while participants listened to metronome beats and imagined musical meters such as a march and waltz. 
We demonstrated that β-band event-related desynchronization in the auditory cortex differentiates between beat positions, specifically between downbeats and the following beat. This is the first demonstration of β-band oscillations related to hierarchical and internalized timing information. Moreover, the meter representation in the β oscillations was widespread across the brain, including sensorimotor and premotor cortices, parietal lobe, and cerebellum. The results extend current understanding of the role of β oscillations in neural processing of predictive timing.
27
Abstract
The neural resonance theory of musical meter explains musical beat tracking as the result of entrainment of neural oscillations to the beat frequency and its higher harmonics. This theory has gained empirical support from experiments using simple, abstract stimuli. However, to date there has been no empirical evidence for a role of neural entrainment in the perception of the beat of ecologically valid music. Here we presented participants with a single pop song with a superimposed bassoon sound. This stimulus was either lined up with the beat of the music or shifted away from the beat by 25% of the average interbeat interval. Both conditions elicited a neural response at the beat frequency. However, although the on-the-beat condition elicited a clear response at the first harmonic of the beat, this frequency was absent in the neural response to the off-the-beat condition. These results support a role for neural entrainment in tracking the metrical structure of real music and show that neural meter tracking can be disrupted by the presentation of contradictory rhythmic cues.
28
Fitzroy AB, Sanders LD. Musical Meter Modulates the Allocation of Attention across Time. J Cogn Neurosci 2015; 27:2339-51. [PMID: 26284995] [DOI: 10.1162/jocn_a_00862]
Abstract
Dynamic attending theory predicts that attention is allocated hierarchically across time during processing of hierarchical rhythmic structures such as musical meter. ERP research demonstrates that attention to a moment in time modulates early auditory processing as evidenced by the amplitude of the first negative peak (N1) approximately 100 msec after sound onset. ERPs elicited by tones presented at times of high and low metric strength in short melodies were compared to test the hypothesis that hierarchically structured rhythms direct attention in a manner that modulates early perceptual processing. A more negative N1 was observed for metrically strong beats compared with metrically weak beats; this result provides electrophysiological evidence that hierarchical rhythms direct attention to metrically strong times during engaged listening. The N1 effect was observed only on fast tempo trials, suggesting that listeners more consistently invoke selective processing based on hierarchical rhythms when sounds are presented rapidly. The N1 effect was not modulated by musical expertise, indicating that the allocation of attention to metrically strong times is not dependent on extensive training. Additionally, changes in P2 amplitude and a late negativity were associated with metric strength under some conditions, indicating that multiple cognitive processes are associated with metric perception.
Affiliation(s)
- Ahren B Fitzroy
- Northwestern University; University of Massachusetts, Amherst
29
Bouwer FL, Honing H. Temporal attending and prediction influence the perception of metrical rhythm: evidence from reaction times and ERPs. Front Psychol 2015; 6:1094. [PMID: 26284015] [PMCID: PMC4518143] [DOI: 10.3389/fpsyg.2015.01094]
Abstract
The processing of rhythmic events in music is influenced by the induced metrical structure. Two mechanisms underlying this may be temporal attending and temporal prediction. Temporal fluctuations in attentional resources may influence the processing of rhythmic events by heightening sensitivity at metrically strong positions. Temporal predictions may attenuate responses to events that are highly expected within a metrical structure. In the current study we aimed to disentangle these two mechanisms by examining responses to unexpected sounds, using intensity increments and decrements as deviants. Temporal attending was hypothesized to lead to better detection of deviants in metrically strong (on the beat) than weak (offbeat) positions due to heightened sensitivity on the beat. Temporal prediction was hypothesized to lead to best detection of increments in offbeat positions and decrements on the beat, as they would be most unexpected in these positions. We used a speeded detection task to measure detectability of the deviants under attended conditions (Experiment 1). Under unattended conditions (Experiment 2), we used EEG to measure the mismatch negativity (MMN), an ERP component known to index the detectability of unexpected auditory events. Furthermore, we examined the amplitude of the auditory evoked P1 and N1 responses, which are known to be sensitive to both attention and prediction. We found better detection of small increments in offbeat positions than on the beat, consistent with the influence of temporal prediction (Experiment 1). In addition, we found faster detection of large increments on the beat as opposed to offbeat (Experiment 1), and larger amplitude P1 responses on the beat as compared to offbeat, both in support of temporal attending (Experiment 2). As such, we showed that both temporal attending and temporal prediction shape our processing of metrical rhythm.
Affiliation(s)
- Fleur L Bouwer
- Amsterdam Brain and Cognition, Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Henkjan Honing
- Amsterdam Brain and Cognition, Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
30
Paul BT, Sederberg PB, Feth LL. Imagined Temporal Groupings Tune Oscillatory Neural Activity for Processing Rhythmic Sounds. Timing Time Percept 2015. [DOI: 10.1163/22134468-03002042]
Abstract
Temporal patterns within complex sound signals, such as music, are not merely processed after they are heard. We also focus attention on upcoming points in time to aid perception, contingent upon regularities we perceive in the sounds' inherent rhythms. Such organized predictions are endogenously maintained as meter: the patterning of sounds into hierarchical timing levels that manifest as strong and weak events. Models of neural oscillations provide potential means for how meter could arise in the brain, but little evidence of dynamic neural activity has been offered. To this end, we conducted a study instructing participants to imagine two-based or three-based metric patterns over identical, equally-spaced sounds while we recorded the electroencephalogram (EEG). In the three-based metric pattern, multivariate analysis of the EEG showed contrasting patterns of neural oscillations between strong and weak events in the delta (2–4 Hz) and alpha (9–14 Hz) frequency bands, while theta (4–9 Hz) and beta (16–24 Hz) bands contrasted two hierarchically weaker events. In two-based metric patterns, neural activity did not drastically differ between strong and weak events. We suggest the findings reflect patterns of neural activation and suppression responsible for shaping perception through time.
31
Escoffier N, Herrmann CS, Schirmer A. Auditory rhythms entrain visual processes in the human brain: Evidence from evoked oscillations and event-related potentials. Neuroimage 2015; 111:267-76. [DOI: 10.1016/j.neuroimage.2015.02.024]
32
Fujioka T, Fidali BC, Ross B. Neural correlates of intentional switching from ternary to binary meter in a musical hemiola pattern. Front Psychol 2014; 5:1257. [PMID: 25429274] [PMCID: PMC4228837] [DOI: 10.3389/fpsyg.2014.01257]
Abstract
Musical rhythms are often perceived and interpreted within a metrical framework that integrates timing information hierarchically based on interval ratios. Endogenous timing processes facilitate this metrical integration and allow us to use the sensory context for predicting when an expected sensory event will happen ("predictive timing"). Previously, we showed that listening to metronomes and subjectively imagining the two different meters of march and waltz modulated the resulting auditory evoked responses in the temporal lobe and motor-related brain areas such as the motor cortex, basal ganglia, and cerebellum. Here we further explored the intentional transitions between the two metrical contexts, known as hemiola in Western classical music and dating back to the sixteenth century. We examined MEG from 12 musicians while they repeatedly listened to a sequence of 12 unaccented clicks with an interval of 390 ms, and tapped to them with the right hand according to a 3 + 3 + 2 + 2 + 2 hemiola accent pattern. While participants listened to the same metronome sequence and imagined the accents, their pattern of brain responses significantly changed just before the "pivot" point of metric transition from ternary to binary meter. Until 100 ms before the pivot point, brain activities were more similar to those in the simple ternary meter than those in the simple binary meter, but the pattern was reversed afterwards. A similar transition was also observed at the downbeat after the pivot. Brain areas related to the metric transition were identified from source reconstruction of the MEG using a beamformer and included auditory cortices, sensorimotor and premotor cortices, cerebellum, inferior/middle frontal gyrus, parahippocampal gyrus, inferior parietal lobule, cingulate cortex, and precuneus. The results strongly support that predictive timing processes related to auditory-motor, fronto-parietal, and medial limbic systems underlie metrical representation and its transitions.
Affiliation(s)
- Takako Fujioka
- Rotman Research Institute, Baycrest Centre, Toronto, ON, Canada; Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, CA, USA
- Brian C Fidali
- Rotman Research Institute, Baycrest Centre, Toronto, ON, Canada; Brain and Mind Research Institute, Weill Cornell Medical College, New York, NY, USA
- Bernhard Ross
- Rotman Research Institute, Baycrest Centre, Toronto, ON, Canada; Department of Medical Biophysics, University of Toronto, Toronto, ON, Canada
33
Honing H, Bouwer FL, Háden GP. Perceiving temporal regularity in music: the role of auditory event-related potentials (ERPs) in probing beat perception. Adv Exp Med Biol 2014; 829:305-23. [PMID: 25358717] [DOI: 10.1007/978-1-4939-1782-2_16]
Abstract
The aim of this chapter is to give an overview of how the perception of a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). Next to a review of the recent literature on the perception of temporal regularity in music, we will discuss to what extent ERPs, and especially the component called mismatch negativity (MMN), can be instrumental in probing beat perception. We conclude with a discussion of the pitfalls and prospects of using ERPs to probe the perception of a regular beat, in which we present possible constraints on stimulus design and discuss future perspectives.
Affiliation(s)
- Henkjan Honing
- Institute for Logic, Language and Computation (ILLC), Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands.
34
Tierney A, Kraus N. Neural responses to sounds presented on and off the beat of ecologically valid music. Front Syst Neurosci 2013; 7:14. [PMID: 23717268] [PMCID: PMC3650712] [DOI: 10.3389/fnsys.2013.00014]
Abstract
The tracking of rhythmic structure is a vital component of speech and music perception. It is known that sequences of identical sounds can give rise to the percept of alternating strong and weak sounds, and that this percept is linked to enhanced cortical and oscillatory responses. The neural correlates of the perception of rhythm elicited by ecologically valid, complex stimuli, however, remain unexplored. Here we report the effects of a stimulus' alignment with the beat on the brain's processing of sound. Human subjects listened to short popular music pieces while simultaneously hearing a target sound. Cortical and brainstem electrophysiological onset responses to the sound were enhanced when it was presented on the beat of the music, as opposed to shifted away from it. Moreover, the size of the effect of alignment with the beat on the cortical response correlated strongly with the ability to tap to a beat, suggesting that the ability to synchronize to the beat of simple isochronous stimuli and the ability to track the beat of complex, ecologically valid stimuli may rely on overlapping neural resources. These results suggest that the perception of musical rhythm may have robust effects on processing throughout the auditory system.
Affiliation(s)
- Adam Tierney
- Auditory Neuroscience Laboratory, Department of Communication Sciences, Northwestern University, Evanston, IL, USA
36
Foley JA, Della Sala S. Do shorter Cortex papers have greater impact? Cortex 2011; 47:635-42. [PMID: 21463860] [DOI: 10.1016/j.cortex.2011.03.008]
37
Kung SJ, Tzeng OJL, Hung DL, Wu DH. Dynamic allocation of attention to metrical and grouping accents in rhythmic sequences. Exp Brain Res 2011; 210:269-82. [PMID: 21442222] [DOI: 10.1007/s00221-011-2630-2]
Abstract
Most people find it easy to perform rhythmic movements in synchrony with music, which reflects their ability to perceive the temporal periodicity and to allocate attention in time accordingly. Musicians and non-musicians were tested in a click localization paradigm in order to investigate how grouping and metrical accents in metrical rhythms influence attention allocation, and to reveal the effect of musical expertise on such processing. We performed two experiments in which the participants were required to listen to isochronous metrical rhythms containing superimposed clicks and then to localize the click on graphical and ruler-like representations with and without grouping structure information, respectively. Both experiments revealed metrical and grouping influences on click localization. Musical expertise improved the precision of click localization, especially when the click coincided with a metrically strong beat. Critically, although all participants located the click accurately at the beginning of an intensity group, only musicians located it precisely when it coincided with a strong beat at the end of the group. Removal of the visual cue of grouping structures enhanced these effects in musicians and reduced them in non-musicians. These results indicate that musical expertise not only enhances attention to metrical accents but also heightens sensitivity to perceptual grouping.
Affiliation(s)
- Shu-Jen Kung
- Institute of Neuroscience, National Yang-Ming University, Taipei, Taiwan
38
Schaefer RS, Vlek RJ, Desain P. Decomposing rhythm processing: electroencephalography of perceived and self-imposed rhythmic patterns. Psychol Res 2010; 75:95-106. [PMID: 20574661] [PMCID: PMC3036830] [DOI: 10.1007/s00426-010-0293-4]
Abstract
Perceiving musical rhythms can be considered a process of attentional chunking over time, driven by accent patterns. A rhythmic structure can also be generated internally, by placing a subjective accent pattern on an isochronous stimulus train. Here, we investigate the event-related potential (ERP) signature of actual and subjective accents, thus disentangling low-level perceptual processes from the cognitive aspects of rhythm processing. The results show differences between accented and unaccented events, but also show that different types of unaccented events can be distinguished, revealing additional structure within the rhythmic pattern. This structure is further investigated by decomposing the ERP into subcomponents, using principal component analysis. In this way, the processes that are common for perceiving a pattern and self-generating it are isolated, and can be visualized for the tasks separately. The results suggest that top-down processes have a substantial role in the cerebral mechanisms of rhythm processing, independent of an externally presented stimulus.
Affiliation(s)
- Rebecca S Schaefer
- Donders Institute for Brain, Cognition and Behaviour, Centre for Cognition, Radboud University, Montessorilaan 3, 6525 HE Nijmegen, The Netherlands.
40
Foley JA, Della Sala S. Geographical distribution of Cortex publications. Cortex 2010; 46:410-9. [DOI: 10.1016/j.cortex.2009.11.010]
41
Abecasis D, Brochard R, Del Río D, Dufour A, Ortiz T. Brain lateralization of metrical accenting in musicians. Ann N Y Acad Sci 2009; 1169:74-8. [PMID: 19673756] [DOI: 10.1111/j.1749-6632.2009.04766.x]
Abstract
The perception of meter, or the alternation of strong and weak beats, was assessed in musically trained listeners through magnetoencephalography. Metrical accents were examined with no temporal disruption of the serial grouping of tones. Results showed an effect of metrical processing among identical standard tones in the left hemisphere, with larger responses on strong than on weak beats. Moreover, processing of occasional increases in intensity (phenomenal accents) varied as a function of metrical position in the left hemisphere, but not in the right. Our findings support the view of a relatively early, left-hemispheric effect of metrical processing in musicians.
Affiliation(s)
- Donna Abecasis
- Department of the Arts, Faculty of Humanities and Social Sciences, Ben-Gurion University of the Negev, Beer-Sheva, Israel.
43
Brochard R, Touzalin P, Després O, Dufour A. Evidence of beat perception via purely tactile stimulation. Brain Res 2008; 1223:59-64. [PMID: 18590909] [DOI: 10.1016/j.brainres.2008.05.050]
Abstract
Humans can easily tap in synchrony with an auditory beat but not with an equivalent visual rhythmic sequence, suggesting that the sensation of meter (i.e. of an underlying regular pulse) may be inherently auditory. We assessed whether the perception of meter could also be felt with tactile sensory inputs. We found that, when participants were presented with identical rhythmic sequences filled with either short tones or hand stimulations, they could more efficiently tap in synchrony with strongly rather than weakly metric sequences. These observations suggest that non-musician adults can extract the metric structure of purely tactile rhythms and use it to tap regularly with the beat induced by such sequences. This finding represents a challenge for present models of rhythm processing.
Affiliation(s)
- Renaud Brochard
- Laboratoire SMPS, Université de Bourgogne, POLE AAFE, Esplanade Erasme, Dijon, France.