1. Pesnot Lerousseau J, Hidalgo C, Schön D. Musical Training for Auditory Rehabilitation in Hearing Loss. J Clin Med 2020; 9:jcm9041058. [PMID: 32276390; PMCID: PMC7230165; DOI: 10.3390/jcm9041058]
Abstract
Despite the overall success of cochlear implantation, language outcomes remain suboptimal and subject to large inter-individual variability. Early auditory rehabilitation techniques have mostly focused on low-level sensory abilities. However, a new body of literature suggests that cognitive operations are critical for auditory perception remediation. We argue in this paper that musical training is a particularly appealing candidate for such therapies, as it involves highly relevant cognitive abilities, such as temporal predictions, hierarchical processing, and auditory-motor interactions. We review recent studies demonstrating that music can enhance both language perception and production at multiple levels, from syllable processing to turn-taking in natural conversation.
2. Silva S, Castro SL. Structural meter perception is pre-attentive. Neuropsychologia 2019; 133:107184. [PMID: 31518576; DOI: 10.1016/j.neuropsychologia.2019.107184]
Abstract
A prominent question in timing research is whether meter perception is possible without attention to meter. So far, research has probed attention effects on meter perception with a surface-based approach that may create confounds between meter and rhythm, rather than with a structural approach requiring abstraction from surface patterns. The available findings suggest that different meter dimensions (meter as beat hierarchy vs. meter as regular cycle length) may yield different attention effects: meter as cycle-length regularity may require attention (it is attentive but not pre-attentive), while meter as beat hierarchy may be pre-attentive. However, it is unknown whether this dissociation prevails under structural meter processing. We examined attention effects on the EEG correlates of structural meter processing, considering the two dimensions of meter perception: hierarchy and cycle length. While the results for hierarchy violations were inconclusive, cycle-length violations induced pre-attentive, but not attentive, responses. These pre-attentive responses corresponded to late ERPs (300-600 ms), consistent with deep, structural meter processing. Our findings highlight the importance of pre-attentive processing in meter perception and raise the hypothesis of a dissociation between surface- and structure-based meter processing.
Affiliation(s)
- Susana Silva
- Center for Psychology at University of Porto (CPUP), Porto, Portugal.
- São Luís Castro
- Center for Psychology at University of Porto (CPUP), Porto, Portugal.
3. Multisensory Integration in Short-term Memory: Musicians do Rock. Neuroscience 2018; 389:141-151. [PMID: 28461217; DOI: 10.1016/j.neuroscience.2017.04.031]
Abstract
Demonstrated interactions between seeing and hearing led us to assess the link between music training and short-term memory for auditory, visual and audiovisual sequences of rapidly presented, quasi-random components. Visual sequences' components varied in luminance; auditory sequences' components varied in frequency. Concurrent components in audiovisual sequences were either congruent (the frequency of an auditory item increased monotonically with the luminance of the visual item it accompanied), or incongruent (an item's frequency was uncorrelated with luminance of the item it accompanied). Subjects judged whether the last four items in a sequence replicated its first four items. With audiovisual sequences, subjects were instructed to ignore the sequence's auditory components, basing their judgments solely on the visual input. Subjects with prior instrumental training significantly outperformed their untrained counterparts, with both auditory and visual sequences, and with sequences of correlated auditory and visual items. Reverse correlation showed that the presence of a correlated, concurrent auditory stream altered subjects' reliance on particular visual items in a sequence. Moreover, congruence between auditory and visual items produced performance above what would be predicted from simple summation of information from the two modalities, a result that might reflect a contribution from special-purpose, multimodal neural mechanisms.
4. Bouwer FL, Burgoyne JA, Odijk D, Honing H, Grahn JA. What makes a rhythm complex? The influence of musical training and accent type on beat perception. PLoS One 2018; 13:e0190322. [PMID: 29320533; PMCID: PMC5761885; DOI: 10.1371/journal.pone.0190322]
Abstract
Perception of a regular beat in music is inferred from different types of accents. For example, increases in loudness cause intensity accents, and the grouping of time intervals in a rhythm creates temporal accents. Accents are expected to occur on the beat: when accents are "missing" on the beat, the beat is more difficult to find. However, it is unclear whether accents occurring off the beat alter beat perception similarly to missing accents on the beat. Moreover, no one has examined whether intensity accents influence beat perception more or less strongly than temporal accents, nor how musical expertise affects sensitivity to each type of accent. In two experiments, we obtained ratings of difficulty in finding the beat in rhythms with either temporal or intensity accents, and which varied in the number of accents on the beat as well as the number of accents off the beat. In both experiments, the occurrence of accents on the beat facilitated beat detection more in musical experts than in musical novices. In addition, the number of accents on the beat affected beat finding more in rhythms with temporal accents than in rhythms with intensity accents. The effect of accents off the beat was much weaker than the effect of accents on the beat and appeared to depend on musical expertise, as well as on the number of accents on the beat: when many accents on the beat are missing, beat perception is quite difficult, and adding accents off the beat may not reduce beat perception further. Overall, the different types of accents were processed qualitatively differently, depending on musical expertise. Therefore, these findings indicate the importance of designing ecologically valid stimuli when testing beat perception in musical novices, who may need different types of accent information than musical experts to be able to find a beat. Furthermore, our findings stress the importance of carefully designing rhythms for social and clinical applications of beat perception, as not all listeners treat all rhythms alike.
Affiliation(s)
- Fleur L. Bouwer
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- J. Ashley Burgoyne
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Daan Odijk
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands
- Henkjan Honing
- Institute for Logic, Language and Computation, University of Amsterdam, Amsterdam, The Netherlands
- Amsterdam Brain and Cognition, University of Amsterdam, Amsterdam, The Netherlands
- Jessica A. Grahn
- Brain and Mind Institute, Department of Psychology, University of Western Ontario, London (ON), Canada
5.

6. Burunat I, Tsatsishvili V, Brattico E, Toiviainen P. Coupling of Action-Perception Brain Networks during Musical Pulse Processing: Evidence from Region-of-Interest-Based Independent Component Analysis. Front Hum Neurosci 2017; 11:230. [PMID: 28536514; PMCID: PMC5422442; DOI: 10.3389/fnhum.2017.00230]
Abstract
Our sense of rhythm relies on orchestrated activity of several cerebral and cerebellar structures. Although functional connectivity studies have advanced our understanding of rhythm perception, this phenomenon has not been sufficiently studied as a function of musical training and beyond the General Linear Model (GLM) approach. Here, we studied pulse clarity processing during naturalistic music listening using a data-driven approach (independent component analysis; ICA). Participants' (18 musicians and 18 controls) functional magnetic resonance imaging (fMRI) responses were acquired while listening to music. A targeted region of interest (ROI) related to pulse clarity processing was defined, comprising auditory, somatomotor, basal ganglia, and cerebellar areas. The ICA decomposition was performed under different model orders, i.e., under a varying number of assumed independent sources, to avoid relying on prior model order assumptions. The components best predicted by a measure of the pulse clarity of the music, extracted computationally from the musical stimulus, were identified. Their corresponding spatial maps uncovered a network of auditory (perception) and motor (action) areas in an excitatory-inhibitory relationship at lower model orders, while mainly constrained to the auditory areas at higher model orders. Results revealed (a) a strengthened functional integration of action-perception networks associated with pulse clarity perception hidden from GLM analyses, and (b) group differences between musicians and non-musicians in pulse clarity processing, suggesting lifelong musical training as an important factor that may influence beat processing.
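The component-selection step described here, correlating each ICA time course with a computationally extracted pulse-clarity measure and keeping the best-predicted component, can be sketched in a few lines. This is a toy illustration under assumed array shapes, not the authors' analysis pipeline; `best_predicted_component` and the synthetic data are hypothetical:

```python
import numpy as np

def best_predicted_component(sources, regressor):
    """Return the index and correlation of the component time course
    best predicted by a stimulus regressor (e.g. pulse clarity).

    sources   : (n_components, n_timepoints) ICA component time courses
    regressor : (n_timepoints,) pulse-clarity time series of the stimulus
    """
    r = np.array([np.corrcoef(s, regressor)[0, 1] for s in sources])
    best = int(np.argmax(np.abs(r)))
    return best, r[best]

# Toy data: three synthetic "components", one of which tracks the regressor.
rng = np.random.default_rng(0)
clarity = np.sin(np.linspace(0, 8 * np.pi, 400))  # stand-in pulse-clarity curve
sources = rng.standard_normal((3, 400))
sources[1] += 2.0 * clarity                       # component 1 tracks clarity
idx, r = best_predicted_component(sources, clarity)
print(idx, round(r, 2))                           # component 1, strong positive r
```

In the study itself this selection would be repeated across ICA model orders; the sketch only shows the correlation criterion.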
Affiliation(s)
- Iballa Burunat
- Department of Music, Arts and Culture Studies, Finnish Centre for Interdisciplinary Music Research, University of Jyväskylä, Jyväskylä, Finland
- Valeri Tsatsishvili
- Department of Mathematical Information Technology, University of Jyväskylä, Jyväskylä, Finland
- Elvira Brattico
- Department of Clinical Medicine, Center for Music in the Brain, Aarhus University and The Royal Academy of Music Aarhus/Aalborg, Aarhus, Denmark
- Petri Toiviainen
- Department of Music, Arts and Culture Studies, Finnish Centre for Interdisciplinary Music Research, University of Jyväskylä, Jyväskylä, Finland
7. Celma-Miralles A, de Menezes RF, Toro JM. Look at the Beat, Feel the Meter: Top-Down Effects of Meter Induction on Auditory and Visual Modalities. Front Hum Neurosci 2016; 10:108. [PMID: 27047358; PMCID: PMC4803728; DOI: 10.3389/fnhum.2016.00108]
Abstract
Recent research has demonstrated top-down effects on meter induction in the auditory modality. However, little is known about these effects in the visual domain, especially without the involvement of motor acts such as tapping. In the present study, we aim to assess whether the projection of meter on auditory beats is also present in the visual domain. We asked 16 musicians to internally project binary (i.e., a strong-weak pattern) and ternary (i.e., a strong-weak-weak pattern) meter onto separate, but analog, visual and auditory isochronous stimuli. Participants were presented with sequences of tones or blinking circular shapes (i.e., flashes) at 2.4 Hz while their electrophysiological responses were recorded. A frequency analysis of the elicited steady-state evoked potentials allowed us to compare the frequencies of the beat (2.4 Hz), its first harmonic (4.8 Hz), the binary subharmonic (1.2 Hz), and the ternary subharmonic (0.8 Hz) within and across modalities. Taking the amplitude spectra into account, we observed an enhancement of the amplitude at 0.8 Hz in the ternary condition for both modalities, suggesting meter induction across modalities. There was an interaction between modality and voltage at 2.4 and 4.8 Hz. Looking at the power spectra, we also observed significant differences from zero in the auditory, but not in the visual, binary condition at 1.2 Hz. These findings suggest that meter processing is modulated by top-down mechanisms that interact with our perception of rhythmic events and that such modulation can also be found in the visual domain. The reported cross-modal effects of meter may shed light on the origins of our timing mechanisms, partially developed in primates and allowing humans to synchronize across modalities accurately.
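The frequency analysis behind this design reduces to reading the amplitude spectrum at the beat frequency and its subharmonics. A minimal numpy sketch on synthetic data (the function name, sampling rate, and signal are illustrative assumptions, not the study's pipeline):

```python
import numpy as np

def amplitude_at(signal, fs, freqs):
    """Single-sided amplitude-spectrum values at specific frequencies
    (frequency tagging).

    signal : 1-D EEG-like time series
    fs     : sampling rate in Hz
    freqs  : frequencies of interest, e.g. the beat (2.4 Hz) and its
             binary (1.2 Hz) and ternary (0.8 Hz) subharmonics
    """
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) * 2 / n
    f = np.fft.rfftfreq(n, 1 / fs)
    return {fq: spectrum[np.argmin(np.abs(f - fq))] for fq in freqs}

# Toy steady-state response: a 2.4 Hz beat plus a weaker 0.8 Hz ternary
# component, as if ternary meter induction enhanced the subharmonic.
fs, dur = 250, 50                                  # Hz, seconds
t = np.arange(fs * dur) / fs
eeg = np.sin(2 * np.pi * 2.4 * t) + 0.5 * np.sin(2 * np.pi * 0.8 * t)
amps = amplitude_at(eeg, fs, [0.8, 1.2, 2.4, 4.8])
# amps shows energy at 2.4 Hz and 0.8 Hz, and essentially none at 1.2 Hz
```

With a 50 s window the frequency resolution is 0.02 Hz, so all four tagged frequencies fall on exact FFT bins and no spectral leakage blurs the comparison.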
Affiliation(s)
- Alexandre Celma-Miralles
- Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group - Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Robert F de Menezes
- Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group - Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain
- Juan M Toro
- Information and Communication Technologies Engineering (ETIC), Language and Comparative Cognition Group - Center for Brain and Cognition, Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats, Barcelona, Spain
8. Bodeck S, Lappe C, Evers S. Tic-reducing effects of music in patients with Tourette's syndrome: Self-reported and objective analysis. J Neurol Sci 2015; 352:41-7. [PMID: 25805454; DOI: 10.1016/j.jns.2015.03.016]
Abstract
BACKGROUND Self-reports by musicians affected with Tourette's syndrome and other anecdotal evidence suggest that tics stop when subjects are involved in musical activity. For the first time, we studied this effect systematically, using a questionnaire design to investigate the subjectively assessed impact of musical activity on tic frequency (study 1) and an experimental design to confirm these results (study 2). METHODS A questionnaire was sent to 29 patients assessing whether listening to music and musical performance led to a reduction or increase in tic frequency. Then, a within-subject repeated-measures design was conducted with eight patients. Five experimental conditions were tested: baseline, musical performance, the short period after musical performance, listening to music, and music imagery. Tics were counted from videotapes. RESULTS Analysis of the self-reports (study 1) yielded a significant tic reduction for both listening to music and musical performance. In study 2, musical performance, listening to music, and mental imagery of musical performance each reduced tic frequency significantly. The largest reduction occurred during musical performance, when tics almost completely stopped. Furthermore, we found a short-term tic-decreasing effect after musical performance. CONCLUSIONS Self-report assessment revealed that active and passive participation in musical activity can significantly reduce tic frequency, and experimental testing confirmed patients' perception. Both forms of participation reduce tic frequency, including a short-lasting after-effect. Fine motor control, focused attention, and goal-directed behavior are believed to be relevant factors for this observation.
Affiliation(s)
- Sabine Bodeck
- Münster University Hospital, Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, Münster 48149, Germany.
- Claudia Lappe
- Münster University Hospital, Institute for Biomagnetism and Biosignalanalysis, University of Münster, Malmedyweg 15, Münster 48149, Germany.
- Stefan Evers
- Department of Neurology, Krankenhaus Lindenbrunn, Lindenbrunn 1, Coppenbrügge 31863, Germany.
9. Fujioka T, Fidali BC, Ross B. Neural correlates of intentional switching from ternary to binary meter in a musical hemiola pattern. Front Psychol 2014; 5:1257. [PMID: 25429274; PMCID: PMC4228837; DOI: 10.3389/fpsyg.2014.01257]
Abstract
Musical rhythms are often perceived and interpreted within a metrical framework that integrates timing information hierarchically based on interval ratios. Endogenous timing processes facilitate this metrical integration and allow us to use the sensory context to predict when an expected sensory event will happen (“predictive timing”). Previously, we showed that listening to metronomes while subjectively imagining the two different meters of march and waltz modulated the resulting auditory evoked responses in the temporal lobe and in motor-related brain areas such as the motor cortex, basal ganglia, and cerebellum. Here we further explored intentional transitions between the two metrical contexts, known as hemiola in Western classical music dating back to the sixteenth century. We examined MEG from 12 musicians while they repeatedly listened to a sequence of 12 unaccented clicks with an interval of 390 ms and tapped to them with the right hand according to a 3 + 3 + 2 + 2 + 2 hemiola accent pattern. When participants listened to the same metronome sequence and imagined the accents, their pattern of brain responses changed significantly just before the “pivot” point of the metric transition from ternary to binary meter. Until 100 ms before the pivot point, brain activities were more similar to those in the simple ternary meter than to those in the simple binary meter, but the pattern reversed afterwards. A similar transition was also observed at the downbeat after the pivot. Brain areas related to the metric transition were identified by source reconstruction of the MEG using a beamformer and included auditory cortices, sensorimotor and premotor cortices, cerebellum, inferior/middle frontal gyrus, parahippocampal gyrus, inferior parietal lobule, cingulate cortex, and precuneus. The results strongly support the view that predictive timing processes in auditory-motor, fronto-parietal, and medial limbic systems underlie metrical representation and its transitions.
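As a concrete reading of the 3 + 3 + 2 + 2 + 2 pattern: with a 390 ms inter-onset interval, the accented clicks fall at the cumulative group boundaries, so the ternary-to-binary pivot sits at click 6. A quick sketch (our own illustration, not the authors' stimulus code):

```python
# Accented click positions implied by the 3+3+2+2+2 hemiola grouping.
groups = [3, 3, 2, 2, 2]            # two ternary groups, then three binary
ioi = 0.390                         # inter-onset interval in seconds
accent_clicks = [sum(groups[:i]) for i in range(len(groups))]
accent_times = [round(c * ioi, 2) for c in accent_clicks]
print(accent_clicks)                # [0, 3, 6, 8, 10]
print(accent_times)                 # [0.0, 1.17, 2.34, 3.12, 3.9]
```

The 12-click cycle thus spans 4.68 s, with the metric transition occurring at 2.34 s into each cycle.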
Affiliation(s)
- Takako Fujioka
- Rotman Research Institute, Baycrest Centre, Toronto, ON, Canada; Center for Computer Research in Music and Acoustics, Department of Music, Stanford University, Stanford, CA, USA
- Brian C Fidali
- Rotman Research Institute, Baycrest Centre, Toronto, ON, Canada; Brain and Mind Research Institute, Weill Cornell Medical College, New York, NY, USA
- Bernhard Ross
- Rotman Research Institute, Baycrest Centre, Toronto, ON, Canada; Department of Medical Biophysics, University of Toronto, Toronto, ON, Canada
10. Cohn N, Jackendoff R, Holcomb PJ, Kuperberg GR. The grammar of visual narrative: Neural evidence for constituent structure in sequential image comprehension. Neuropsychologia 2014; 64:63-70. [PMID: 25241329; DOI: 10.1016/j.neuropsychologia.2014.09.018]
Abstract
Constituent structure has long been established as a central feature of human language. Analogous to how syntax organizes words in sentences, a narrative grammar organizes sequential images into hierarchic constituents. Here we show that the brain draws upon this constituent structure to comprehend wordless visual narratives. We recorded neural responses as participants viewed sequences of visual images (comic strips) in which blank images either disrupted individual narrative constituents or fell at natural constituent boundaries. A disruption of either the first or the second narrative constituent produced a left-lateralized anterior negativity effect between 500 and 700 ms. Disruption of the second constituent also elicited a posteriorly-distributed positivity (P600) effect. These neural responses are similar to those associated with structural violations in language and music. These findings provide evidence that comprehenders use a narrative structure to comprehend visual sequences and that the brain engages similar neurocognitive mechanisms to build structure across multiple domains.
Affiliation(s)
- Neil Cohn
- Department of Psychology, Tufts University, 490 Boston Ave, Medford, MA 02155, USA; Department of Cognitive Science, University of California, San Diego, 9500 Gilman Dr. Dept 0526, La Jolla, CA 92093-0526, USA.
- Ray Jackendoff
- Department of Philosophy and Center for Cognitive Studies, Tufts University, 115 Miner Hall, Medford, MA 02155, USA; Department of Psychology, Tufts University, 490 Boston Ave, Medford, MA 02155, USA
- Phillip J Holcomb
- Department of Psychology, Tufts University, 490 Boston Ave, Medford, MA 02155, USA
- Gina R Kuperberg
- Department of Psychology, Tufts University, 490 Boston Ave, Medford, MA 02155, USA; Department of Psychiatry and Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Bldg 149, 13th Street, Charlestown, MA 02129, USA
11. Kung SJ, Chen JL, Zatorre RJ, Penhune VB. Interacting Cortical and Basal Ganglia Networks Underlying Finding and Tapping to the Musical Beat. J Cogn Neurosci 2013; 25:401-20. [DOI: 10.1162/jocn_a_00325]
Abstract
Humans are able to find and tap to the beat of musical rhythms varying in complexity from children's songs to modern jazz. Musical beat has no one-to-one relationship with auditory features—it is an abstract perceptual representation that emerges from the interaction between sensory cues and higher-level cognitive organization. Previous investigations have examined the neural basis of beat processing but have not tested the core phenomenon of finding and tapping to the musical beat. To test this, we used fMRI and had musicians find and tap to the beat of rhythms that varied from metrically simple to metrically complex—thus from a strong to a weak beat. Unlike most previous studies, we measured beat tapping performance during scanning and controlled for possible effects of scanner noise on beat perception. Results showed that beat finding and tapping recruited largely overlapping brain regions, including the superior temporal gyrus (STG), premotor cortex, and ventrolateral PFC (VLPFC). Beat tapping activity in STG and VLPFC was correlated with both perception and performance, suggesting that they are important for retrieving, selecting, and maintaining the musical beat. In contrast, basal ganglia (BG) activity was similar in all conditions and was not correlated with either perception or production, suggesting that it may be involved in detecting auditory temporal regularity or in associating auditory stimuli with a motor response. Importantly, functional connectivity analyses showed that these systems interact, indicating that more basic sensorimotor mechanisms instantiated in the BG work in tandem with higher-order cognitive mechanisms in PFC.
Affiliation(s)
- Shu-Jen Kung
- National Yang-Ming University, Taipei City, Taiwan
- Robert J. Zatorre
- Montreal Neurological Institute
- International Laboratory for Brain, Music and Sound
- Virginia B. Penhune
- International Laboratory for Brain, Music and Sound
- Concordia University, Montréal, Canada
12.
Abstract
Perception of temporal patterns is fundamental to normal hearing, speech, motor control, and music. Certain types of pattern understanding are unique to humans, such as musical rhythm. Although human responses to musical rhythm are universal, there is much we do not understand about how rhythm is processed in the brain. Here, I consider findings from research into basic timing mechanisms and models through to the neuroscience of rhythm and meter. A network of neural areas, including motor regions, is regularly implicated in basic timing as well as processing of musical rhythm. However, fractionating the specific roles of individual areas in this network has remained a challenge. Distinctions in activity patterns appear between "automatic" and "cognitively controlled" timing processes, but the perception of musical rhythm requires features of both automatic and controlled processes. In addition, many experimental manipulations rely on participants directing their attention toward or away from certain stimulus features, and measuring corresponding differences in neural activity. Many temporal features, however, are implicitly processed whether attended to or not, making it difficult to create controlled baseline conditions for experimental comparisons. The variety of stimuli, paradigms, and definitions can further complicate comparisons across domains or methodologies. Despite these challenges, the high level of interest and multitude of methodological approaches from different cognitive domains (including music, language, and motor learning) have yielded new insights and hold promise for future progress.
Affiliation(s)
- Jessica A Grahn
- Brain and Mind Institute & Department of Psychology, University of Western Ontario, London, Ontario N6A 5B7, Canada.