1
Kroger C, Kagerer FA, McAuley JD. Interdependence of movement amplitude and tempo during self-paced finger tapping: evaluation of a preferred velocity hypothesis. Exp Brain Res 2024; 242:1025-1036. [PMID: 38451320] [DOI: 10.1007/s00221-024-06814-x]
Abstract
This study examined the relation between movement amplitude and tempo during self-paced rhythmic finger tapping to test a preferred velocity account of the preferred tempo construct. Preferred tempo refers to the concept that individuals have preferences for the pace of actions or events in their environment (e.g., the desired pace of walking or tempo of music). The preferred velocity hypothesis proposes that assessments of preferred tempo do not represent a pure time preference independent of spatial movement characteristics, but rather reflect a preference for an average movement velocity, predicting that preferred tempo will depend on movement amplitude. We tested this by having participants first perform a novel spontaneous motor amplitude (SMA) task in which they repetitively tapped their finger at their preferred amplitude without instructions about tapping tempo. Next, participants completed the spontaneous motor tempo (SMT) task in which they tapped their finger at their preferred tempo without instructions about tapping amplitude. Finally, participants completed a target amplitude version of the SMT task in which they tapped at their preferred tempo at three target amplitudes (low, medium, and high). Participants (1) produced similar amplitudes and tempi regardless of instructions to produce either their preferred amplitude or preferred tempo, maintaining the same average movement velocity across the SMA and SMT tasks, and (2) altered their preferred tempo for different target amplitudes in the direction predicted by their estimated preferred velocity from the SMA and SMT tasks. Overall, the results show the interdependence of movement amplitude and tempo in tapping assessments of preferred tempo.
Affiliation(s)
- Carolyn Kroger
- Department of Psychology, Michigan State University, East Lansing, USA.
- Kresge Hearing Research Institute, University of Michigan, Ann Arbor, MI, 48109, USA.
- Florian A Kagerer
- Department of Kinesiology, Michigan State University, East Lansing, USA
- Neuroscience Program, Michigan State University, East Lansing, USA
- J Devin McAuley
- Department of Psychology, Michigan State University, East Lansing, USA
- Neuroscience Program, Michigan State University, East Lansing, USA
2
Bouwer FL, Háden GP, Honing H. Probing Beat Perception with Event-Related Potentials (ERPs) in Human Adults, Newborns, and Nonhuman Primates. Adv Exp Med Biol 2024; 1455:227-256. [PMID: 38918355] [DOI: 10.1007/978-3-031-60183-5_13]
Abstract
The aim of this chapter is to give an overview of how the perception of rhythmic temporal regularity, such as a regular beat in music, can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). First, we discuss different aspects of temporal structure in general, and musical rhythm in particular, and we discuss the possible mechanisms underlying the perception of regularity (e.g., a beat) in rhythm. Additionally, we highlight the importance of dissociating beat perception from the perception of other types of structure in rhythm, such as predictable sequences of temporal intervals, ordinal structure, and rhythmic grouping. In the second section of the chapter, we start with a discussion of auditory ERPs elicited by infrequent and frequent sounds: ERP responses to regularity violations, such as mismatch negativity (MMN), N2b, and P3, as well as early sensory responses to sounds, such as P1 and N1, have been shown to be instrumental in probing beat perception. Subsequently, we discuss how beat perception can be probed by comparing ERP responses to sounds in regular and irregular sequences, and by comparing ERP responses to sounds in different metrical positions in a rhythm, such as on and off the beat or on strong and weak beats. Finally, we discuss previous research that has used the aforementioned ERPs and paradigms to study beat perception in human adults, human newborns, and nonhuman primates. In doing so, we consider the possible pitfalls and prospects of the technique, as well as future perspectives.
Affiliation(s)
- Fleur L Bouwer
- Cognitive Psychology Unit, Institute of Psychology, Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands.
- Department of Psychology, Brain & Cognition, University of Amsterdam, Amsterdam, The Netherlands.
- Gábor P Háden
- Institute of Cognitive Neuroscience and Psychology, Budapest, Hungary
- Department of Telecommunications and Media Informatics, Faculty of Electrical Engineering and Informatics, Budapest University of Technology and Economics, Budapest, Hungary
- Henkjan Honing
- Music Cognition group (MCG), Institute for Logic, Language and Computation (ILLC), Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands
3
Noel GD, Mugno LE, Andres DS. From signals to music: a bottom-up approach to the structure of neuronal activity. Front Syst Neurosci 2023; 17:1171984. [PMID: 37637704] [PMCID: PMC10450627] [DOI: 10.3389/fnsys.2023.1171984]
Abstract
Introduction
The search for the "neural code" has been a fundamental quest in neuroscience, concerned with the way neurons and neuronal systems process and transmit information. However, the term "code" has mostly been used as a metaphor, seldom acknowledging the formal definitions introduced by information theory, and not at all the contributions of linguistics and semiotics. The heuristic potential of the latter was suggested by structuralism, which turned the methods and findings of linguistics to other fields of knowledge. For the study of complex communication systems, such as human language and music, the necessity of an approach that considers a multilayered, nested, structured organization of symbols becomes evident. We work under the hypothesis that the neural code might be as complex as these human-made codes. To test this, we propose a bottom-up approach, constructing a symbolic logic to translate neuronal signals into music scores.
Methods
We recorded single-cell activity from the rat's globus pallidus pars interna under conditions of full alertness, blindfolding, and environmental silence. We analyzed the signals with statistical, spectral, and complexity methods, including the Fast Fourier Transform, the Hurst exponent, and recurrence plot analysis.
Results
The results indicated complex behavior, with recurrence graphs consistent with fractality and a Hurst exponent >0.5, evidencing temporal persistence. On the whole, these features point toward complex behavior of the analyzed time series, also present in classical music, which upholds the hypothesis of structural similarities between music and neuronal activity. A direct comparison between music and raw neuronal activity pointed to the same conclusion, showing the structures of music and neuronal activity to be homologous. The scores were not only spontaneously tonal, but also exhibited structure and features normally present in human-made musical creations.
Discussion
The hypothesis of a structural homology between the neural code and the code of music holds, suggesting that some of the insights introduced by linguistic and semiotic theory might be a useful methodological resource to go beyond the limits set by metaphoric notions of "code."
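The Hurst exponent analysis mentioned in this abstract can be illustrated with a minimal rescaled-range (R/S) estimator. This is a generic sketch in Python, not the study's actual pipeline; the function name, window sizes, and parameters are illustrative. A slope above 0.5 on the log-log R/S plot indicates temporal persistence, as reported for the recorded neuronal signals.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series via rescaled-range (R/S) analysis.

    Uncorrelated noise yields H near 0.5; persistent (trend-reinforcing)
    series, such as a random walk, yield H closer to 1.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_per_chunk = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviation from the chunk mean
            r = dev.max() - dev.min()              # range of the cumulative deviations
            s = chunk.std()                        # standard deviation of the chunk
            if s > 0:
                rs_per_chunk.append(r / s)
        if rs_per_chunk:
            sizes.append(size)
            rs_vals.append(np.mean(rs_per_chunk))
        size *= 2
    # The Hurst exponent is the slope of log(R/S) against log(window size)
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h
```

For example, `hurst_rs` applied to white noise gives an estimate near 0.5, while the cumulative sum of that same noise (a random walk) gives an estimate well above 0.5, the persistent regime the abstract describes.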
Affiliation(s)
- Gabriel D. Noel
- College of Interdisciplinary and Advanced Studies in the Social Sciences, National University of San Martin (UNSAM), San Martín, Argentina
- National Scientific and Research Council, National University of San Martin (UNSAM), Buenos Aires, Argentina
- Lionel E. Mugno
- School of Music of the Department of General San Martin “Alfredo Luis Schiuma”, San Martín, Argentina
- Daniela S. Andres
- Institute of Emergent Technologies and Applied Science, San Martín, Argentina
- Science and Technology School, National University of San Martin (UNSAM), San Martín, Argentina
4
Tervaniemi M. The neuroscience of music – towards ecological validity. Trends Neurosci 2023; 46:355-364. [PMID: 37012175] [DOI: 10.1016/j.tins.2023.03.001]
Abstract
Studies in the neuroscience of music gained momentum in the 1990s as an integrated part of the well-controlled experimental research tradition. However, during the past two decades, these studies have moved toward more naturalistic, ecologically valid paradigms. Here, I introduce this move in three frameworks: (i) sound stimulation and empirical paradigms, (ii) study participants, and (iii) methods and contexts of data acquisition. I wish to provide a narrative historical overview of the development of the field and, in parallel, to stimulate innovative thinking to further advance the ecological validity of the studies without overlooking experimental rigor.
Affiliation(s)
- Mari Tervaniemi
- Centre of Excellence in Music, Mind, Body, and Brain, Faculty of Educational Sciences, University of Helsinki, Helsinki, Finland; Cognitive Brain Research Unit, Department of Psychology and Logopedics, Faculty of Medicine, University of Helsinki, Helsinki, Finland.
5
Kasdan A, Gordon RL, Lense MD. Neurophysiological Correlates of Dynamic Beat Tracking in Individuals With Williams Syndrome. Biol Psychiatry Cogn Neurosci Neuroimaging 2022; 7:1183-1191. [PMID: 33419711] [PMCID: PMC8060366] [DOI: 10.1016/j.bpsc.2020.10.003]
Abstract
BACKGROUND
Williams syndrome (WS) is a neurodevelopmental disorder characterized by hypersociability, heightened auditory sensitivities, attention deficits, and strong musical interests despite differences in musical skills. Behavioral studies have reported that individuals with WS exhibit variable beat and rhythm perception skills.
METHODS
We sought to investigate the neural basis of beat tracking in individuals with WS using electroencephalography. Twenty-seven adults with WS and 16 age-matched, typically developing control subjects passively listened to musical rhythms with accents on either the first or second tone of the repeating pattern, leading to distinct beat percepts.
RESULTS
Consistent with the role of beta and gamma oscillations in rhythm processing, individuals with WS and typically developing control subjects showed strong evoked neural activity in both the beta (13-30 Hz) and gamma (31-55 Hz) frequency bands in response to beat onsets. This neural response was somewhat more distributed across the scalp for individuals with WS. Compared with typically developing control subjects, individuals with WS exhibited significantly greater amplitude of auditory evoked potentials (P1-N1-P2 complex) and modulations in evoked alpha (8-12 Hz) activity, reflective of sensory and attentional processes. Individuals with WS also exhibited markedly stable neural responses over the course of the experiment, and these responses were significantly more stable than those of control subjects.
CONCLUSIONS
These results provide neurophysiological evidence for dynamic beat tracking in WS and coincide with the atypical auditory phenotype and attentional difficulties seen in this population.
Affiliation(s)
- Anna Kasdan
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee; Curb Center for Art, Enterprise, and Public Policy, Nashville, Tennessee.
- Reyna L Gordon
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Otolaryngology-Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Curb Center for Art, Enterprise, and Public Policy, Nashville, Tennessee
- Miriam D Lense
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Otolaryngology-Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Curb Center for Art, Enterprise, and Public Policy, Nashville, Tennessee
6
Lenc T, Merchant H, Keller PE, Honing H, Varlet M, Nozaradan S. Mapping between sound, brain and behaviour: four-level framework for understanding rhythm processing in humans and non-human primates. Philos Trans R Soc Lond B Biol Sci 2021; 376:20200325. [PMID: 34420381] [PMCID: PMC8380981] [DOI: 10.1098/rstb.2020.0325]
Abstract
Humans perceive and spontaneously move to one or several levels of periodic pulses (a meter, for short) when listening to musical rhythm, even when the sensory input does not provide prominent periodic cues to their temporal location. Here, we review a multi-level framework for understanding how external rhythmic inputs are mapped onto internally represented metric pulses. This mapping is studied using an approach to quantify and directly compare representations of metric pulses in signals corresponding to sensory inputs, neural activity and behaviour (typically body movement). Based on this approach, recent empirical evidence can be drawn together into a conceptual framework that unpacks the phenomenon of meter into four levels. Each level highlights specific functional processes that critically enable and shape the mapping from sensory input to internal meter. We discuss the nature, constraints and neural substrates of these processes, starting with fundamental mechanisms investigated in macaque monkeys that enable basic forms of mapping between simple rhythmic stimuli and internally represented metric pulses. We propose that human evolution has gradually built a robust and flexible system upon these fundamental processes, allowing more complex levels of mapping to emerge in musical behaviours. This approach opens promising avenues to understand the many facets of rhythmic behaviours across individuals and species. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Affiliation(s)
- Tomas Lenc
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
- Hugo Merchant
- Instituto de Neurobiologia, UNAM, Campus Juriquilla, Querétaro 76230, Mexico
- Peter E. Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- Henkjan Honing
- Amsterdam Brain and Cognition (ABC), Institute for Logic, Language and Computation (ILLC), University of Amsterdam, Amsterdam 1090 GE, The Netherlands
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, New South Wales 2751, Australia
- School of Psychology, Western Sydney University, Penrith, New South Wales 2751, Australia
- Sylvie Nozaradan
- Institute of Neuroscience (IONS), Université Catholique de Louvain (UCL), Brussels 1200, Belgium
7
What you hear first, is what you get: Initial metrical cue presentation modulates syllable detection in sentence processing. Atten Percept Psychophys 2021; 83:1861-1877. [PMID: 33709327] [DOI: 10.3758/s13414-021-02251-y]
Abstract
Auditory rhythms create powerful expectations for the listener. Rhythmic cues with the same temporal structure as subsequent sentences enhance processing compared with irregular or mismatched cues. In the present study, we focus on syllable detection following matched rhythmic cues. Cues were aligned with subsequent sentences at the syllable (low-level cue) or the accented syllable (high-level cue) level. A different group of participants performed the task without cues to provide a baseline. We hypothesized that unaccented syllable detection would be faster after low-level cues, and accented syllable detection would be faster after high-level cues. There was no difference in syllable detection depending on whether the sentence was preceded by a high-level or low-level cue. However, the results revealed a priming effect of the cue that participants heard first. Participants who heard a high-level cue first were faster to detect accented than unaccented syllables, and faster to detect accented syllables than participants who heard a low-level cue first. The low-level-first participants showed no difference between detection of accented and unaccented syllables. The baseline experiment confirmed that hearing a low-level cue first removed the benefit of the high-level grouping structure for accented syllables. These results suggest that the initially perceived rhythmic structure influenced subsequent cue perception and its influence on syllable detection. Results are discussed in terms of dynamic attending, temporal context effects, and implications for context effects in neural entrainment.
8
Hickey P, Barnett-Young A, Patel AD, Race E. Environmental rhythms orchestrate neural activity at multiple stages of processing during memory encoding: Evidence from event-related potentials. PLoS One 2020; 15:e0234668. [PMID: 33206657] [PMCID: PMC7673489] [DOI: 10.1371/journal.pone.0234668]
Abstract
Accumulating evidence suggests that rhythmic temporal structures in the environment influence memory formation. For example, stimuli that appear in synchrony with the beat of background, environmental rhythms are better remembered than stimuli that appear out-of-synchrony with the beat. This rhythmic modulation of memory has been linked to entrained neural oscillations which are proposed to act as a mechanism of selective attention that prioritize processing of events that coincide with the beat. However, it is currently unclear whether rhythm influences memory formation by influencing early (sensory) or late (post-perceptual) processing of stimuli. The current study used stimulus-locked event-related potentials (ERPs) to investigate the locus of stimulus processing at which rhythmic temporal cues operate in the service of memory formation. Participants viewed a series of visual objects that either appeared in-synchrony or out-of-synchrony with the beat of background music and made a semantic classification (living/non-living) for each object. Participants’ memory for the objects was then tested (in silence). The timing of stimulus presentation during encoding (in-synchrony or out-of-synchrony with the background beat) influenced later ERPs associated with post-perceptual selection and orienting attention in time rather than earlier ERPs associated with sensory processing. The magnitude of post-perceptual ERPs also differed according to whether or not participants demonstrated a mnemonic benefit for in-synchrony compared to out-of-synchrony stimuli, and was related to the magnitude of the rhythmic modulation of memory performance across participants. These results support two prominent theories in the field, the Dynamic Attending Theory and the Oscillation Selection Hypothesis, which propose that neural responses to rhythm act as a core mechanism of selective attention that optimizes processing at specific moments in time. Furthermore, they reveal that in addition to acting as a mechanism of early attentional selection, rhythm influences later, post-perceptual cognitive processes as events are transformed into memory.
Affiliation(s)
- Paige Hickey
- Department of Psychology, Tufts University, Medford, Massachusetts, United States of America
- Annie Barnett-Young
- Department of Psychology, Tufts University, Medford, Massachusetts, United States of America
- Aniruddh D. Patel
- Department of Psychology, Tufts University, Medford, Massachusetts, United States of America
- Program in Brain, Mind, and Consciousness, Canadian Institute for Advanced Research (CIFAR), Toronto, Ontario, Canada
- Elizabeth Race
- Department of Psychology, Tufts University, Medford, Massachusetts, United States of America
9
Meter enhances the subcortical processing of speech sounds at a strong beat. Sci Rep 2020; 10:15973. [PMID: 32994430] [PMCID: PMC7525485] [DOI: 10.1038/s41598-020-72714-z]
Abstract
The temporal structure of sound such as in music and speech increases the efficiency of auditory processing by providing listeners with a predictable context. Musical meter is a good example of a sound structure that is temporally organized in a hierarchical manner, with recent studies showing that meter optimizes neural processing, particularly for sounds located at a higher metrical position or strong beat. Whereas enhanced cortical auditory processing at times of high metric strength has been studied, there is to date no direct evidence showing metrical modulation of subcortical processing. In this work, we examined the effect of meter on the subcortical encoding of sounds by measuring human auditory frequency-following responses to speech presented at four different metrical positions. Results show that neural encoding of the fundamental frequency of the vowel was enhanced at the strong beat, and also that the neural consistency of the vowel was the highest at the strong beat. When comparing musicians to non-musicians, musicians were found, at the strong beat, to selectively enhance the behaviorally relevant component of the speech sound, namely the formant frequency of the transient part. Our findings indicate that the meter of sound influences subcortical processing, and this metrical modulation differs depending on musical expertise.
10
Hickey P, Merseal H, Patel AD, Race E. Memory in time: Neural tracking of low-frequency rhythm dynamically modulates memory formation. Neuroimage 2020; 213:116693. [DOI: 10.1016/j.neuroimage.2020.116693]
11
Pesnot Lerousseau J, Hidalgo C, Schön D. Musical Training for Auditory Rehabilitation in Hearing Loss. J Clin Med 2020; 9:1058. [PMID: 32276390] [PMCID: PMC7230165] [DOI: 10.3390/jcm9041058]
Abstract
Despite the overall success of cochlear implantation, language outcomes remain suboptimal and subject to large inter-individual variability. Early auditory rehabilitation techniques have mostly focused on low-level sensory abilities. However, a new body of literature suggests that cognitive operations are critical for auditory perception remediation. We argue in this paper that musical training is a particularly appealing candidate for such therapies, as it involves highly relevant cognitive abilities, such as temporal predictions, hierarchical processing, and auditory-motor interactions. We review recent studies demonstrating that music can enhance both language perception and production at multiple levels, from syllable processing to turn-taking in natural conversation.
12
Bouwer FL, Honing H, Slagter HA. Beat-based and Memory-based Temporal Expectations in Rhythm: Similar Perceptual Effects, Different Underlying Mechanisms. J Cogn Neurosci 2020; 32:1221-1241. [PMID: 31933432] [DOI: 10.1162/jocn_a_01529]
Abstract
Predicting the timing of incoming information allows the brain to optimize information processing in dynamic environments. Behaviorally, temporal expectations have been shown to facilitate processing of events at expected time points, such as sounds that coincide with the beat in musical rhythm. Yet, temporal expectations can develop based on different forms of structure in the environment, not just the regularity afforded by a musical beat. Little is still known about how different types of temporal expectations are neurally implemented and affect performance. Here, we orthogonally manipulated the periodicity and predictability of rhythmic sequences to examine the mechanisms underlying beat-based and memory-based temporal expectations, respectively. Behaviorally and using EEG, we looked at the effects of beat-based and memory-based expectations on auditory processing when rhythms were task-relevant or task-irrelevant. At expected time points, both beat-based and memory-based expectations facilitated target detection and led to attenuation of P1 and N1 responses, even when expectations were task-irrelevant (unattended). For beat-based expectations, we additionally found reduced target detection and enhanced N1 responses for events at unexpected time points (e.g., off-beat), regardless of the presence of memory-based expectations or task relevance. This latter finding supports the notion that periodicity selectively induces rhythmic fluctuations in neural excitability and furthermore indicates that, although beat-based and memory-based expectations may similarly affect auditory processing of expected events, their underlying neural mechanisms may be different.
13
Rajendran VG, Teki S, Schnupp JWH. Temporal Processing in Audition: Insights from Music. Neuroscience 2018; 389:4-18. [PMID: 29108832] [PMCID: PMC6371985] [DOI: 10.1016/j.neuroscience.2017.10.041]
Abstract
Music is a curious example of a temporally patterned acoustic stimulus, and a compelling pan-cultural phenomenon. This review strives to bring some insights from decades of music psychology and sensorimotor synchronization (SMS) literature into the mainstream auditory domain, arguing that musical rhythm perception is shaped in important ways by temporal processing mechanisms in the brain. The feature that unites these disparate disciplines is an appreciation of the central importance of timing, sequencing, and anticipation. Perception of musical rhythms relies on an ability to form temporal predictions, a general feature of temporal processing that is equally relevant to auditory scene analysis, pattern detection, and speech perception. By bringing together findings from the music and auditory literature, we hope to inspire researchers to look beyond the conventions of their respective fields and consider the cross-disciplinary implications of studying auditory temporal sequence processing. We begin by highlighting music as an interesting sound stimulus that may provide clues to how temporal patterning in sound drives perception. Next, we review the SMS literature and discuss possible neural substrates for the perception of, and synchronization to, musical beat. We then move away from music to explore the perceptual effects of rhythmic timing in pattern detection, auditory scene analysis, and speech perception. Finally, we review the neurophysiology of general timing processes that may underlie aspects of the perception of rhythmic patterns. We conclude with a brief summary and outlook for future research.
Affiliation(s)
- Vani G Rajendran
- Auditory Neuroscience Group, University of Oxford, Department of Physiology, Anatomy, and Genetics, Oxford, UK
- Sundeep Teki
- Auditory Neuroscience Group, University of Oxford, Department of Physiology, Anatomy, and Genetics, Oxford, UK
- Jan W H Schnupp
- City University of Hong Kong, Department of Biomedical Sciences, 31 To Yuen Street, Kowloon Tong, Hong Kong.
14
Slater JL, Tate MC. Timing Deficits in ADHD: Insights From the Neuroscience of Musical Rhythm. Front Comput Neurosci 2018; 12:51. [PMID: 30034331] [PMCID: PMC6043674] [DOI: 10.3389/fncom.2018.00051]
Abstract
Everyday human behavior relies upon extraordinary feats of coordination within the brain. In this perspective paper, we argue that the rich temporal structure of music provides an informative context in which to investigate how the brain coordinates its complex activities in time, and how that coordination can be disrupted. We bring insights from the neuroscience of musical rhythm to considerations of timing deficits in Attention Deficit/Hyperactivity Disorder (ADHD), highlighting the significant overlap between neural systems involved in processing musical rhythm and those implicated in ADHD. We suggest that timing deficits warrant closer investigation since they could lead to the identification of potentially informative phenotypes, tied to neurobiological and genetic factors. Our novel interdisciplinary approach builds upon recent trends in both fields of research: in the neuroscience of rhythm, an increasingly nuanced understanding of the specific contributions of neural systems to rhythm processing, and in ADHD, an increasing focus on differentiating phenotypes and identifying distinct etiological pathways associated with the disorder. Finally, we consider the impact of musical experience on rhythm processing and the potential value of musical rhythm in therapeutic interventions.
Affiliation(s)
- Jessica L. Slater
- Department of Neurological Surgery, Northwestern University, Chicago, IL, United States
- Matthew C. Tate
- Department of Neurological Surgery, Northwestern University, Chicago, IL, United States
- Department of Neurology, Northwestern University, Chicago, IL, United States
15
Haumann NT, Vuust P, Bertelsen F, Garza-Villarreal EA. Influence of Musical Enculturation on Brain Responses to Metric Deviants. Front Neurosci 2018; 12:218. [PMID: 29720932] [PMCID: PMC5915898] [DOI: 10.3389/fnins.2018.00218]
Abstract
The ability to recognize metric accents is fundamental in both music and language perception. It has been suggested that music listeners prefer rhythms that follow simple binary meters, which are common in Western music. This means that listeners expect odd-numbered beats to be strong and even-numbered beats to be weak. In support of this, studies have shown that listeners exposed to Western music show stronger novelty and incongruity related P3 and irregularity detection related mismatch negativity (MMN) brain responses to attenuated odd- than attenuated even-numbered metric positions. Furthermore, behavioral evidence suggests that music listeners' preferences can be changed by long-term exposure to non-Western rhythms and meters, e.g., by listening to African or Balkan music. In our study, we investigated whether it might be possible to measure effects of music enculturation on neural responses to attenuated tones on specific metric positions. We compared the magnetic mismatch negativity (MMNm) to attenuated beats in a “Western group” of listeners (n = 12) mainly exposed to Western music and a “Bicultural group” of listeners (n = 13) exposed for at least 1 year to Sub-Saharan African music in addition to Western music. We found that in the “Western group” the MMNm was higher in amplitude to deviant tones on odd compared to even metric positions, but not in the “Bicultural group.” In support of this finding, there was also a trend for the “Western group” to rate omitted beats as more surprising on odd than even metric positions, whereas the “Bicultural group” seemed to discriminate less between metric positions in terms of surprise ratings. Also, we observed that the overall latency of the MMNm was significantly shorter in the Bicultural group compared to the Western group. These effects were not biased by possible differences in rhythm perception ability or music training, measured with the Musical Ear Test (MET).
Furthermore, source localization analyses suggest that auditory, inferior temporal, sensory-motor, superior frontal, and parahippocampal regions might be involved in eliciting the MMNm to the metric deviants. These findings suggest that effects of music enculturation can be measured on MMNm responses to attenuated tones on specific metric positions.
Affiliation(s)
- Niels T Haumann
- Department of Aesthetics and Communication (Musicology), Faculty of Arts, Aarhus University, Aarhus, Denmark
- Department of Clinical Medicine, Center for Music in the Brain, Royal Academy of Music, Aarhus University, Aarhus, Denmark
- Peter Vuust
- Department of Clinical Medicine, Center for Music in the Brain, Royal Academy of Music, Aarhus University, Aarhus, Denmark
- Freja Bertelsen
- Center of Functionally Integrative Neuroscience, Aarhus University, Aarhus, Denmark
- Department of Nuclear Medicine and PET Centre, Aarhus University Hospital, Aarhus, Denmark
- Eduardo A Garza-Villarreal
- Department of Clinical Medicine, Center for Music in the Brain, Royal Academy of Music, Aarhus University, Aarhus, Denmark
- Clinical Research Division, Instituto Nacional de Psiquiatría Ramón de la Fuente Muñiz (INPRFM), Mexico City, Mexico
- Department of Neurology, Faculty of Medicine and University Hospital, Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
16
Bouwer FL, Burgoyne JA, Odijk D, Honing H, Grahn JA. What makes a rhythm complex? The influence of musical training and accent type on beat perception. PLoS One 2018; 13:e0190322. [PMID: 29320533] [PMCID: PMC5761885] [DOI: 10.1371/journal.pone.0190322]
Abstract
Perception of a regular beat in music is inferred from different types of accents. For example, increases in loudness cause intensity accents, and the grouping of time intervals in a rhythm creates temporal accents. Accents are expected to occur on the beat: when accents are "missing" on the beat, the beat is more difficult to find. However, it is unclear whether accents occurring off the beat alter beat perception similarly to missing accents on the beat. Moreover, no one has examined whether intensity accents influence beat perception more or less strongly than temporal accents, nor how musical expertise affects sensitivity to each type of accent. In two experiments, we obtained ratings of difficulty in finding the beat in rhythms that contained either temporal or intensity accents and varied in the number of accents on the beat as well as the number of accents off the beat. In both experiments, the occurrence of accents on the beat facilitated beat detection more in musical experts than in musical novices. In addition, the number of accents on the beat affected beat finding more in rhythms with temporal accents than in rhythms with intensity accents. The effect of accents off the beat was much weaker than the effect of accents on the beat and appeared to depend on musical expertise, as well as on the number of accents on the beat: when many accents on the beat are missing, beat perception is quite difficult, and adding accents off the beat may not impair beat perception further. Overall, the different types of accents were processed qualitatively differently, depending on musical expertise. Therefore, these findings indicate the importance of designing ecologically valid stimuli when testing beat perception in musical novices, who may need different types of accent information than musical experts to be able to find a beat.
Furthermore, our findings stress the importance of carefully designing rhythms for social and clinical applications of beat perception, as not all listeners treat all rhythms alike.
Affiliation(s)
- Fleur L. Bouwer
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- J. Ashley Burgoyne
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Daan Odijk
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands
- Henkjan Honing
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Jessica A. Grahn
- Brain and Mind Institute, Department of Psychology, University of Western Ontario, London (ON), Canada
17
Slater J, Ashley R, Tierney A, Kraus N. Got Rhythm? Better Inhibitory Control Is Linked with More Consistent Drumming and Enhanced Neural Tracking of the Musical Beat in Adult Percussionists and Nonpercussionists. J Cogn Neurosci 2017; 30:14-24. [PMID: 28949825] [DOI: 10.1162/jocn_a_01189]
Abstract
Musical rhythm engages motor and reward circuitry that is important for cognitive control, and there is evidence for enhanced inhibitory control in musicians. We recently revealed an inhibitory control advantage in percussionists compared with vocalists, highlighting the potential importance of rhythmic expertise in mediating this advantage. Previous research has shown that better inhibitory control is associated with less variable performance in simple sensorimotor synchronization tasks; however, this relationship has not been examined through the lens of rhythmic expertise. We hypothesize that the development of rhythm skills strengthens inhibitory control in two ways: by fine-tuning motor networks through the precise coordination of movements "in time" and by activating reward-based mechanisms, such as predictive processing and conflict monitoring, which are involved in tracking temporal structure in music. Here, we assess adult percussionists and nonpercussionists on inhibitory control, selective attention, basic drumming skills (self-paced, paced, and continuation drumming), and cortical evoked responses to an auditory stimulus presented on versus off the beat of music. Consistent with our hypotheses, we find that better inhibitory control is correlated with more consistent drumming and enhanced neural tracking of the musical beat. Drumming variability and the neural index of beat alignment each contribute unique predictive power to a regression model, explaining 57% of variance in inhibitory control. These outcomes present the first evidence that enhanced inhibitory control in musicians may be mediated by rhythmic expertise and provide a foundation for future research investigating the potential for rhythm-based training to strengthen cognitive function.
18
Paquette S, Fujii S, Li HC, Schlaug G. The cerebellum's contribution to beat interval discrimination. Neuroimage 2017; 163:177-182. [PMID: 28916178] [DOI: 10.1016/j.neuroimage.2017.09.017]
Abstract
From expert percussionists to individuals who cannot dance, there are widespread differences in people's abilities to perceive and synchronize with a musical beat. The aim of our study was to identify candidate brain regions that might be associated with these abilities. For this purpose, we used voxel-based morphometry to correlate inter-individual differences in performance on the Harvard Beat Assessment Tests (H-BAT) with local inter-individual variations in gray matter volumes across the entire brain space in 60 individuals. Analysis revealed significant co-variations between performances on two perceptual tasks of the Harvard Beat Assessment Tests associated with beat interval change discrimination (faster, slower) and gray matter volume variations in the cerebellum. Participant discrimination thresholds for the Beat Finding Interval Test (quarter note beat) were positively associated with gray matter volume variation in cerebellum lobule IX in the left hemisphere and crus I bilaterally. Discrimination thresholds for the Beat Interval Test (simple series of tones) revealed a tendency toward a positive association with gray matter volume variations in crus I/II of the left cerebellum. Our results demonstrate the importance of the cerebellum in beat interval discrimination skills, as measured by two perceptual tasks of the Harvard Beat Assessment Tests. Current findings, in combination with evidence from patients with cerebellar degeneration and expert dancers, suggest that cerebellar gray matter and overall cerebellar integrity are important for temporal discrimination abilities.
Affiliation(s)
- S Paquette
- Music, Stroke Recovery, and Neuroimaging Laboratory, Neurology Department, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA
- S Fujii
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Kanagawa, Japan
- H C Li
- Music, Stroke Recovery, and Neuroimaging Laboratory, Neurology Department, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA
- G Schlaug
- Music, Stroke Recovery, and Neuroimaging Laboratory, Neurology Department, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA.
19
Familiarity affects electrocortical power spectra during dance imagery, listening to different music genres: independent component analysis of Alpha and Beta rhythms. Sport Sci Health 2017. [DOI: 10.1007/s11332-017-0379-0]
20
Rhythmic entrainment as a musical affect induction mechanism. Neuropsychologia 2017; 96:96-110. [DOI: 10.1016/j.neuropsychologia.2017.01.004]
21
Abstract
The neural resonance theory of musical meter explains musical beat tracking as the result of entrainment of neural oscillations to the beat frequency and its higher harmonics. This theory has gained empirical support from experiments using simple, abstract stimuli. However, to date there has been no empirical evidence for a role of neural entrainment in the perception of the beat of ecologically valid music. Here we presented participants with a single pop song with a superimposed bassoon sound. This stimulus was either lined up with the beat of the music or shifted away from the beat by 25% of the average interbeat interval. Both conditions elicited a neural response at the beat frequency. However, although the on-the-beat condition elicited a clear response at the first harmonic of the beat, this frequency was absent in the neural response to the off-the-beat condition. These results support a role for neural entrainment in tracking the metrical structure of real music and show that neural meter tracking can be disrupted by the presentation of contradictory rhythmic cues.
22
Bouwer FL, Honing H. Temporal attending and prediction influence the perception of metrical rhythm: evidence from reaction times and ERPs. Front Psychol 2015; 6:1094. [PMID: 26284015] [PMCID: PMC4518143] [DOI: 10.3389/fpsyg.2015.01094]
Abstract
The processing of rhythmic events in music is influenced by the induced metrical structure. Two mechanisms underlying this may be temporal attending and temporal prediction. Temporal fluctuations in attentional resources may influence the processing of rhythmic events by heightening sensitivity at metrically strong positions. Temporal predictions may attenuate responses to events that are highly expected within a metrical structure. In the current study we aimed to disentangle these two mechanisms by examining responses to unexpected sounds, using intensity increments and decrements as deviants. Temporal attending was hypothesized to lead to better detection of deviants in metrically strong (on the beat) than weak (offbeat) positions due to heightened sensitivity on the beat. Temporal prediction was hypothesized to lead to best detection of increments in offbeat positions and decrements on the beat, as they would be most unexpected in these positions. We used a speeded detection task to measure detectability of the deviants under attended conditions (Experiment 1). Under unattended conditions (Experiment 2), we used EEG to measure the mismatch negativity (MMN), an ERP component known to index the detectability of unexpected auditory events. Furthermore, we examined the amplitude of the auditory evoked P1 and N1 responses, which are known to be sensitive to both attention and prediction. We found better detection of small increments in offbeat positions than on the beat, consistent with the influence of temporal prediction (Experiment 1). In addition, we found faster detection of large increments on the beat as opposed to offbeat (Experiment 1), and larger amplitude P1 responses on the beat as compared to offbeat, both in support of temporal attending (Experiment 2). As such, we showed that both temporal attending and temporal prediction shape our processing of metrical rhythm.
Affiliation(s)
- Fleur L Bouwer
- Amsterdam Brain and Cognition, Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, Netherlands
- Henkjan Honing
- Amsterdam Brain and Cognition, Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, Netherlands
23
Vuilleumier P, Trost W. Music and emotions: from enchantment to entrainment. Ann N Y Acad Sci 2015; 1337:212-22. [DOI: 10.1111/nyas.12676]
Affiliation(s)
- Patrik Vuilleumier
- Department of Neuroscience, Medical School, University of Geneva, Geneva, Switzerland
- Department of Neurology, University Hospital of Geneva, Geneva, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Wiebke Trost
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
24
Trost W, Frühholz S, Schön D, Labbé C, Pichon S, Grandjean D, Vuilleumier P. Getting the beat: Entrainment of brain activity by musical rhythm and pleasantness. Neuroimage 2014; 103:55-64. [DOI: 10.1016/j.neuroimage.2014.09.009]
25
Bouwer FL, Van Zuijen TL, Honing H. Beat processing is pre-attentive for metrically simple rhythms with clear accents: an ERP study. PLoS One 2014; 9:e97467. [PMID: 24870123] [PMCID: PMC4037171] [DOI: 10.1371/journal.pone.0097467]
Abstract
The perception of a regular beat is fundamental to music processing. Here we examine whether the detection of a regular beat is pre-attentive for metrically simple, acoustically varying stimuli using the mismatch negativity (MMN), an ERP response elicited by violations of acoustic regularity irrespective of whether subjects are attending to the stimuli. Both musicians and non-musicians were presented with a varying rhythm with a clear accent structure in which occasionally a sound was omitted. We compared the MMN response to the omission of identical sounds in different metrical positions. Most importantly, we found that omissions in strong metrical positions, on the beat, elicited higher amplitude MMN responses than omissions in weak metrical positions, not on the beat. This suggests that the detection of a beat is pre-attentive when highly beat inducing stimuli are used. No effects of musical expertise were found. Our results suggest that for metrically simple rhythms with clear accents beat processing does not require attention or musical expertise. In addition, we discuss how the use of acoustically varying stimuli may influence ERP results when studying beat processing.
Affiliation(s)
- Fleur L. Bouwer
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands
- Titia L. Van Zuijen
- Research Institute of Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
- Henkjan Honing
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands
26
Lin YP, Duann JR, Feng W, Chen JH, Jung TP. Revealing spatio-spectral electroencephalographic dynamics of musical mode and tempo perception by independent component analysis. J Neuroeng Rehabil 2014; 11:18. [PMID: 24581119] [PMCID: PMC3941612] [DOI: 10.1186/1743-0003-11-18]
Abstract
Background: Music conveys emotion by manipulating musical structures, particularly musical mode and tempo. The neural correlates of musical mode and tempo perception revealed by electroencephalography (EEG) have not been adequately addressed in the literature.
Method: This study used independent component analysis (ICA) to systematically assess spatio-spectral EEG dynamics associated with changes of musical mode and tempo.
Results: Empirical results showed that major-mode music augmented delta-band activity over the right sensorimotor cortex, suppressed theta activity over the superior parietal cortex, and moderately suppressed beta activity over the medial frontal cortex, compared to minor-mode music, whereas fast-tempo music engaged significant alpha suppression over the right sensorimotor cortex.
Conclusion: The resultant EEG brain sources were comparable with those of previous studies obtained with other neuroimaging modalities, such as functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). In conjunction with advanced dry and mobile EEG technology, these EEG results might facilitate the translation from laboratory-oriented research to real-life applications for music therapy, training, and entertainment in naturalistic environments.
Affiliation(s)
- Tzyy-Ping Jung
- Institute for Neural Computation and Institute of Engineering in Medicine, University of California, San Diego, La Jolla, CA, USA.
27
Kraus N, Nicol T. The Cognitive Auditory System: The Role of Learning in Shaping the Biology of the Auditory System. Perspectives on Auditory Research 2014. [DOI: 10.1007/978-1-4614-9102-6_17]
28
Honing H, Bouwer FL, Háden GP. Perceiving temporal regularity in music: the role of auditory event-related potentials (ERPs) in probing beat perception. Adv Exp Med Biol 2014; 829:305-23. [PMID: 25358717] [DOI: 10.1007/978-1-4939-1782-2_16]
Abstract
The aim of this chapter is to give an overview of how the perception of a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). Following a review of the recent literature on the perception of temporal regularity in music, we discuss to what extent ERPs, and especially the component called mismatch negativity (MMN), can be instrumental in probing beat perception. We conclude with a discussion of the pitfalls and prospects of using ERPs to probe the perception of a regular beat, in which we present possible constraints on stimulus design and discuss future perspectives.
Affiliation(s)
- Henkjan Honing
- Institute for Logic, Language and Computation (ILLC), Amsterdam Brain and Cognition (ABC), University of Amsterdam, Amsterdam, The Netherlands