1. Chabin T, Pazart L, Gabriel D. Vocal melody and musical background are simultaneously processed by the brain for musical predictions. Ann N Y Acad Sci 2022;1512:126-140. PMID: 35229293. DOI: 10.1111/nyas.14755.
Abstract
Musical pleasure is related to the capacity to predict and anticipate the music. By recording early cerebral responses of 16 participants with electroencephalography during periods of silence inserted in known and unknown songs, we aimed to measure the contribution of different musical attributes to musical predictions. We investigated the mismatch between past encoded musical features and the current sensory inputs when listening to lyrics associated with vocal melody, only background instrumental material, or both attributes grouped together. When participants were listening to chords and lyrics for known songs, the brain responses related to musical violation produced event-related potential responses around 150-200 ms with a larger amplitude than for chords or lyrics alone. Microstate analysis also revealed that for chords and lyrics combined, the global field power showed increased stability and a longer duration. Source localization identified that the right superior temporal and frontal gyri and the inferior and medial frontal gyri were activated for a longer time for chords and lyrics, likely because of the increased complexity of the stimuli. We conclude that, when several musical attributes are integrated and retrieved together, larger neuronal networks are recruited, leading to more accurate predictions.
Affiliation(s)
- Thibault Chabin: Centre Hospitalier Universitaire de Besançon, Centre d'Investigation Clinique INSERM CIC 1431, Besançon, France
- Lionel Pazart: Plateforme de Neuroimagerie Fonctionnelle et Neurostimulation Neuraxess, Centre Hospitalier Universitaire de Besançon, Université de Bourgogne Franche-Comté, Bourgogne Franche-Comté, France
- Damien Gabriel: Laboratoire de Recherches Intégratives en Neurosciences et Psychologie Cognitive, Université Bourgogne Franche-Comté, Besançon, France
2. What's what in auditory cortices? Neuroimage 2018;176:29-40. DOI: 10.1016/j.neuroimage.2018.04.028.
3. Caclin A, Tillmann B. Musical and verbal short-term memory: insights from neurodevelopmental and neurological disorders. Ann N Y Acad Sci 2018;1423:155-165. PMID: 29744897. DOI: 10.1111/nyas.13733.
Abstract
Auditory short-term memory (STM) is a fundamental ability to make sense of auditory information as it unfolds over time. Whether separate STM systems exist for different types of auditory information (music and speech, in particular) is a matter of debate. The present paper reviews studies that have investigated both musical and verbal STM in healthy individuals and in participants with neurodevelopmental and neurological disorders. Overall, the results are in favor of only partly shared networks for musical and verbal STM. Evidence for a distinction in STM for the two materials stems from (1) behavioral studies in healthy participants, in particular from the comparison between nonmusicians and musicians; (2) behavioral studies in congenital amusia, where a selective pitch STM deficit is observed; and (3) studies in brain-damaged patients with cases of double dissociation. In this review we highlight the need for future studies comparing STM for the same perceptual dimension (e.g., pitch) in different materials (e.g., music and speech), as well as for studies aiming at a more insightful characterization of shared and distinct mechanisms for speech and music in the different components of STM, namely encoding, retention, and retrieval.
Affiliation(s)
- Anne Caclin: Lyon Neuroscience Research Center (CRNL), Brain Dynamics and Cognition Team (DYCOG) and Auditory Cognition and Psychoacoustics Team, INSERM, U1028, CNRS, UMR5292, Lyon, France; Université Lyon 1, Lyon, France
- Barbara Tillmann: Lyon Neuroscience Research Center (CRNL), Brain Dynamics and Cognition Team (DYCOG) and Auditory Cognition and Psychoacoustics Team, INSERM, U1028, CNRS, UMR5292, Lyon, France; Université Lyon 1, Lyon, France
4. Peters B, Bledowski C, Rieder M, Kaiser J. Recurrence of task set-related MEG signal patterns during auditory working memory. Brain Res 2015;1640:232-242. PMID: 26683086. DOI: 10.1016/j.brainres.2015.12.006.
Abstract
Processing of auditory spatial and non-spatial information in working memory has been shown to rely on separate cortical systems. While previous studies have demonstrated differences in spatial versus non-spatial processing from the encoding of to-be-remembered stimuli onwards, here we investigated whether such differences would be detectable already prior to presentation of the sample stimulus. We analyzed broad-band magnetoencephalography data from 15 healthy adults during an auditory working memory paradigm starting with a visual cue indicating the task-relevant stimulus feature for a given trial (lateralization or pitch) and a subsequent 1.5-s pre-encoding phase. This was followed by a sample sound (0.2 s), the delay phase (0.8 s), and a test stimulus (0.2 s), after which participants made a match/non-match decision. Linear discriminant functions were trained to decode task-specific signal patterns throughout the task, and temporal generalization was used to assess whether the neural codes discriminating between the tasks during the pre-encoding phase would recur during later task periods. The spatial versus non-spatial tasks could indeed be discriminated from the onset of the cue onwards, and decoders trained during the pre-encoding phase successfully discriminated the tasks both during sample stimulus encoding and during the delay phase. This demonstrates that task-specific neural codes are established already before the memorandum is presented and that the same patterns are reestablished during stimulus encoding and maintenance. This article is part of a Special Issue entitled SI: Auditory working memory.
Affiliation(s)
- Benjamin Peters, Christoph Bledowski, Maria Rieder, Jochen Kaiser: Institute of Medical Psychology, Goethe University, Heinrich-Hoffmann-Str. 10, 60528 Frankfurt am Main, Germany
5.
Abstract
Working memory denotes the ability to retain stimuli in mind that are no longer physically present and to perform mental operations on them. Electro- and magnetoencephalography allow investigating the short-term maintenance of acoustic stimuli at a high temporal resolution. Studies investigating working memory for non-spatial and spatial auditory information have suggested differential roles of regions along the putative auditory ventral and dorsal streams, respectively, in the processing of the different sound properties. Analyses of event-related potentials have shown sustained, memory load-dependent deflections over the retention periods. The topography of these waves suggested an involvement of modality-specific sensory storage regions. Spectral analysis has yielded information about the temporal dynamics of auditory working memory processing of individual stimuli, showing activation peaks during the delay phase whose timing was related to task performance. Coherence at different frequencies was enhanced between frontal and sensory cortex. In summary, auditory working memory seems to rely on the dynamic interplay between frontal executive systems and sensory representation regions.
Affiliation(s)
- Jochen Kaiser: Institute of Medical Psychology, Goethe University, Frankfurt am Main, Germany
6. Plakke B, Romanski LM. Auditory connections and functions of prefrontal cortex. Front Neurosci 2014;8:199. PMID: 25100931. PMCID: PMC4107948. DOI: 10.3389/fnins.2014.00199.
Abstract
The functional auditory system extends from the ears to the frontal lobes with successively more complex functions occurring as one ascends the hierarchy of the nervous system. Several areas of the frontal lobe receive afferents from both early and late auditory processing regions within the temporal lobe. Afferents from the early part of the cortical auditory system, the auditory belt cortex, which are presumed to carry information regarding auditory features of sounds, project to only a few prefrontal regions and are most dense in the ventrolateral prefrontal cortex (VLPFC). In contrast, projections from the parabelt and the rostral superior temporal gyrus (STG) most likely convey more complex information and target a larger, widespread region of the prefrontal cortex. Neuronal responses reflect these anatomical projections as some prefrontal neurons exhibit responses to features in acoustic stimuli, while other neurons display task-related responses. For example, recording studies in non-human primates indicate that VLPFC is responsive to complex sounds including vocalizations and that VLPFC neurons in area 12/47 respond to sounds with similar acoustic morphology. In contrast, neuronal responses during auditory working memory involve a wider region of the prefrontal cortex. In humans, the frontal lobe is involved in auditory detection, discrimination, and working memory. Past research suggests that dorsal and ventral subregions of the prefrontal cortex process different types of information with dorsal cortex processing spatial/visual information and ventral cortex processing non-spatial/auditory information. While this is apparent in the non-human primate and in some neuroimaging studies, most research in humans indicates that specific task conditions, stimuli or previous experience may bias the recruitment of specific prefrontal regions, suggesting a more flexible role for the frontal lobe during auditory cognition.
Affiliation(s)
- Bethany Plakke: Department of Neurobiology and Anatomy, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Lizabeth M Romanski: Department of Neurobiology and Anatomy, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
7. Cloutman LL. Interaction between dorsal and ventral processing streams: where, when and how? Brain Lang 2013;127:251-263. PMID: 22968092. DOI: 10.1016/j.bandl.2012.08.003.
Abstract
The execution of complex visual, auditory, and linguistic behaviors requires a dynamic interplay between spatial ('where/how') and non-spatial ('what') information processed along the dorsal and ventral processing streams. However, while it is acknowledged that there must be some degree of interaction between the two processing networks, how they interact, both anatomically and functionally, is a question which remains little explored. The current review examines the anatomical, temporal, and behavioral evidence regarding three potential models of dual stream interaction: (1) computations along the two pathways proceed independently and in parallel, reintegrating within shared target brain regions; (2) processing along the separate pathways is modulated by the existence of recurrent feedback loops; and (3) information is transferred directly between the two pathways at multiple stages and locations along their trajectories.
Affiliation(s)
- Lauren L Cloutman: Neuroscience and Aphasia Research Unit (NARU), Zochonis Building, School of Psychological Sciences, University of Manchester, Oxford Road, Manchester M13 9PL, UK
8. Listening to an audio drama activates two processing networks, one for all sounds, another exclusively for speech. PLoS One 2013;8:e64489. PMID: 23734202. PMCID: PMC3667190. DOI: 10.1371/journal.pone.0064489.
Abstract
Earlier studies have shown considerable intersubject synchronization of brain activity when subjects watch the same movie or listen to the same story. Here we investigated the across-subjects similarity of brain responses to speech and non-speech sounds in a continuous audio drama designed for blind people. Thirteen healthy adults listened for ∼19 min to the audio drama while their brain activity was measured with 3 T functional magnetic resonance imaging (fMRI). An intersubject-correlation (ISC) map, computed across the whole experiment to assess the stimulus-driven extrinsic brain network, indicated statistically significant ISC in temporal, frontal and parietal cortices, cingulate cortex, and amygdala. Group-level independent component (IC) analysis was used to parcel out the brain signals into functionally coupled networks, and the dependence of the ICs on external stimuli was tested by comparing them with the ISC map. This procedure revealed four extrinsic ICs, of which two, covering non-overlapping areas of the auditory cortex, were modulated by both speech and non-speech sounds. The two other extrinsic ICs, one left-hemisphere-lateralized and the other right-hemisphere-lateralized, were speech-related and comprised the superior and middle temporal gyri, temporal poles, and the left angular and inferior orbital gyri. In areas of low ISC, four ICs defined as intrinsic fluctuated similarly to the time-courses of either the speech-sound-related or all-sounds-related extrinsic ICs. These ICs included the superior temporal gyrus, the anterior insula, and the frontal, parietal and midline occipital cortices. Taken together, substantial intersubject synchronization of cortical activity was observed in subjects listening to an audio drama, with results suggesting that speech is processed in two separate networks, one dedicated to the processing of speech sounds and the other to both speech and non-speech sounds.
9. Leavitt VM, Molholm S, Gomez-Ramirez M, Foxe JJ. "What" and "where" in auditory sensory processing: a high-density electrical mapping study of distinct neural processes underlying sound object recognition and sound localization. Front Integr Neurosci 2011;5:23. PMID: 21734870. PMCID: PMC3124831. DOI: 10.3389/fnint.2011.00023.
Abstract
Functionally distinct dorsal and ventral auditory pathways for sound localization (WHERE) and sound object recognition (WHAT) have been described in non-human primates. A handful of studies have explored differential processing within these streams in humans, with highly inconsistent findings. Stimuli employed have included simple tones, noise bursts, and speech sounds, with simulated left–right spatial manipulations, and in some cases participants were not required to actively discriminate the stimuli. Our contention is that these paradigms were not well suited to dissociating processing within the two streams. Our aim here was to determine how early in processing we could find evidence for dissociable pathways using better titrated WHAT and WHERE task conditions. The use of more compelling tasks should allow us to amplify differential processing within the dorsal and ventral pathways. We employed high-density electrical mapping using a relatively large and environmentally realistic stimulus set (seven animal calls) delivered from seven free-field spatial locations; with stimulus configuration identical across the “WHERE” and “WHAT” tasks. Topographic analysis revealed distinct dorsal and ventral auditory processing networks during the WHERE and WHAT tasks with the earliest point of divergence seen during the N1 component of the auditory evoked response, beginning at approximately 100 ms. While this difference occurred during the N1 timeframe, it was not a simple modulation of N1 amplitude as it displayed a wholly different topographic distribution to that of the N1. Global dissimilarity measures using topographic modulation analysis confirmed that this difference between tasks was driven by a shift in the underlying generator configuration. Minimum-norm source reconstruction revealed distinct activations that corresponded well with activity within putative dorsal and ventral auditory structures.
Affiliation(s)
- Victoria M Leavitt: The Cognitive Neurophysiology Laboratory, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, New York, USA
10. Leung AWS, Alain C. Working memory load modulates the auditory "What" and "Where" neural networks. Neuroimage 2010;55:1260-1269. PMID: 21195187. DOI: 10.1016/j.neuroimage.2010.12.055.
Abstract
Working memory for sound identity (What) and sound location (Where) has been associated with increased neural activity in ventral and dorsal brain regions, respectively. To further ascertain this domain specificity, we measured fMRI signals during an n-back (n=1, 2) working memory task for sound identity or location, where stimuli selected randomly from three semantic categories (human, animal, and music) were presented at three possible virtual locations. Accuracy and reaction times were comparable in both "What" and "Where" tasks, albeit worse for the 2-back than for the 1-back condition. The analysis of fMRI data revealed greater activity in ventral and dorsal brain regions during sound identity and sound location, respectively. More importantly, there was an interaction between task and working memory load in the inferior parietal lobule (IPL). Within the right IPL, there were two sub-regions modulated differentially by working memory load: an anterior ventromedial region modulated by location load and a posterior dorsolateral region modulated by category load. These specific changes in neural activity as a function of working memory load reveal domain-specificity within the parietal cortex.
Affiliation(s)
- Ada W S Leung: Rotman Research Institute, Baycrest Centre for Geriatric Care, Toronto, Ontario, Canada M6A 2E1
11. Faraco CC, Unsworth N, Langley J, Terry D, Li K, Zhang D, Liu T, Miller LS. Complex span tasks and hippocampal recruitment during working memory. Neuroimage 2010;55:773-787. PMID: 21182968. DOI: 10.1016/j.neuroimage.2010.12.033.
Abstract
The working memory (WM) system is vital to performing everyday functions that require attentive, non-automatic processing of information. However, its interaction with long-term memory (LTM) is highly debated. Here, we used fMRI to examine whether a popular complex WM span task, thought to force the displacement of to-be-remembered items in the focus of attention to LTM, recruited medial temporal regions typically associated with LTM functioning to a greater extent and in a different manner than traditional neuroimaging WM tasks during WM encoding and maintenance. fMRI scans were acquired while participants performed the operation span (OSPAN) task and an arithmetic task. Results indicated that performance of both tasks produced significant activation in regions typically associated with WM function. More importantly, significant bilateral activation was observed in the hippocampus, suggesting it is recruited during WM encoding and maintenance. Right posterior hippocampus activation was greater during OSPAN than arithmetic. Peristimulus graphs indicate a possible specialization of function for bilateral posterior hippocampus and greater involvement of the left for WM performance. Recall time-course activity within this region hints at LTM involvement during complex span.
Affiliation(s)
- Carlos Cesar Faraco: Biomedical Health Sciences Institute, Division of Neuroscience, University of Georgia, Athens, GA 30602, USA
12. Koiwa N, Masaoka Y, Kusumi T, Homma I. Sound localization difficulty affects early and late processing of auditory spatial information: investigation using the dipole tracing method. Clin Neurophysiol 2010;121:1526-1539. DOI: 10.1016/j.clinph.2010.03.016.
13. Multisensory integration of sounds and vibrotactile stimuli in processing streams for "what" and "where". J Neurosci 2009;29:10950-10960. PMID: 19726653. DOI: 10.1523/jneurosci.0910-09.2009.
Abstract
The segregation between cortical pathways for the identification and localization of objects is thought of as a general organizational principle in the brain. Yet, little is known about the unimodal versus multimodal nature of these processing streams. The main purpose of the present study was to test whether the auditory and tactile dual pathways converged into specialized multisensory brain areas. We used functional magnetic resonance imaging (fMRI) to compare directly in the same subjects the brain activation related to localization and identification of comparable auditory and vibrotactile stimuli. Results indicate that the right inferior frontal gyrus (IFG) and both left and right insula were more activated during identification conditions than during localization in both touch and audition. The reverse dissociation was found for the left and right inferior parietal lobules (IPL), the left superior parietal lobule (SPL) and the right precuneus-SPL, which were all more activated during localization conditions in the two modalities. We propose that specialized areas in the right IFG and the left and right insula are multisensory operators for the processing of stimulus identity whereas parts of the left and right IPL and SPL are specialized for the processing of spatial attributes independently of sensory modality.
14. Common coding of auditory and visual spatial information in working memory. Brain Res 2008;1230:158-167. PMID: 18652807. DOI: 10.1016/j.brainres.2008.07.005.
Abstract
We compared spatial short-term memory for visual and auditory stimuli in an event-related slow potentials study. Subjects encoded object locations of either four or six sequentially presented auditory or visual stimuli and maintained them during a retention period of 6 s. Slow potentials recorded during encoding were modulated by the modality of the stimuli. Stimulus related activity was stronger for auditory items at frontal and for visual items at posterior sites. At frontal electrodes, negative potentials incrementally increased with the sequential presentation of visual items, whereas a strong transient component occurred during encoding of each auditory item without the cumulative increment. During maintenance, frontal slow potentials were affected by modality and memory load according to task difficulty. In contrast, at posterior recording sites, slow potential activity was only modulated by memory load independent of modality. We interpret the frontal effects as correlates of different encoding strategies and the posterior effects as a correlate of common coding of visual and auditory object locations.
15. Auditory event-related potentials during a spatial working memory task. Clin Neurophysiol 2008;119:1176-1189. PMID: 18313978. DOI: 10.1016/j.clinph.2008.01.014.
Abstract
Objective: Sensory cortical activity can be jointly governed by bottom-up (e.g., stimulus features) and top-down (e.g., memory, attention) factors. We tested the hypothesis that auditory sensory cortical activity is affected by encoding and retrieval of spatial information.
Methods: Auditory event-related potentials (ERPs) were recorded during working memory and passive listening conditions. Trials contained three noise bursts (two "items" at different locations, followed by a "probe"). In the working memory task, subjects determined whether the probe matched an item location. The influence of long-term memory was evaluated by training to one location that was always a non-match. Auditory ERPs were analyzed for items and probes (N100, P200, late positive wave, LPW).
Results: Reaction times varied significantly among probes (trained non-match < matches < non-match). Only in the passive condition were N100 and P200 amplitudes significantly larger for the first item than for the second. Probe ERP amplitudes (N100, LPW) were comparable for match and trained non-match probes relative to non-matches.
Conclusions: Findings suggest that top-down factors during encoding modify sensory responses to successive items. Probe ERPs reflect sequence factors, such as recency and stimulus probability, and retrieval mechanisms not evident in passive listening.
Significance: Results support a contribution of auditory cortex to working memory.
16. Lehnert G, Zimmer HD. Modality and domain specific components in auditory and visual working memory tasks. Cogn Process 2007;9:53-61. PMID: 17891428. DOI: 10.1007/s10339-007-0187-6.
Abstract
In the tripartite model of working memory (WM), it is postulated that a distinct subsystem, the visuo-spatial sketchpad (VSSP), processes non-verbal content. Due to behavioral and neurophysiological findings, the VSSP was later subdivided into visual object and visual spatial processing, the former representing objects' appearance and the latter spatial information. This distinction is well supported. However, a challenge to this model is the question of how spatial information from non-visual sensory modalities, for example the auditory one, is processed. Only a few studies so far have directly compared visual and auditory spatial WM. They suggest that the distinction of two processing domains, one for object and one for spatial information, also holds true for auditory WM, but that only part of the processes is modality specific. We propose that processing in the object domain (the item's appearance) is modality specific, while spatial WM, as well as object-location binding, relies on modality-general processes.
Affiliation(s)
- Günther Lehnert: Brain and Cognition Unit, Department of Psychology, Saarland University, Saarbrücken, Germany
17. Rämä P. Domain-dependent activation during spatial and nonspatial auditory working memory. Cogn Process 2007;9:29-34. PMID: 17885775. DOI: 10.1007/s10339-007-0182-y.
Abstract
The visual system has been proposed to be divided into two processing streams, the ventral and the dorsal. The ventral pathway is thought to be involved in object identification, whereas the dorsal pathway processes information regarding the spatial locations of objects and the spatial relationships among objects. Several studies on working memory (WM) processing have further suggested that there is a dissociable, domain-dependent functional organization within the prefrontal cortex for processing of spatial and nonspatial visual information. The auditory system has also been proposed to be organized into two domain-specific processing streams, similar to those in the visual system. Recent studies on auditory WM have further suggested that maintenance of nonspatial and spatial auditory information activates a distributed neural network including temporal, parietal, and frontal regions, but the magnitude of activation within these areas shows a different functional topography depending on the type of information being maintained. The dorsal prefrontal cortex, specifically an area of the superior frontal sulcus (SFS), has been shown to exhibit greater activity for spatial than for nonspatial auditory tasks. Conversely, ventral frontal regions have been shown to be more recruited by nonspatial than by spatial auditory tasks. It has also been shown that the magnitude of this dissociation depends on the cognitive operations required during WM processing. Moreover, there is evidence that within the nonspatial domain in the ventral prefrontal cortex, there is an across-modality dissociation during maintenance of visual and auditory information. Taken together, human neuroimaging results on both visual and auditory sensory systems support the idea that the prefrontal cortex is organized according to the type of information being maintained in WM.
Affiliation(s)
- Pia Rämä
- Cognitive Brain Research Unit, Department of Psychology, University of Helsinki, Helsinki, Finland.
18
Anurova I, Artchakov D, Korvenoja A, Ilmoniemi RJ, Aronen HJ, Carlson S. Cortical generators of slow evoked responses elicited by spatial and nonspatial auditory working memory tasks. Clin Neurophysiol 2005; 116:1644-54. [PMID: 15897006] [DOI: 10.1016/j.clinph.2005.02.029]
Abstract
OBJECTIVE: Slow evoked responses have been studied extensively with electrophysiological and neuroimaging methods, but there is no consensus regarding their generators. We investigated the generators of the P3 and the positive slow wave (PSW) in the responses evoked by probes during auditory working memory tasks, to determine whether functional networks dissociate between generation of the P3 and the PSW, and between spatial and nonspatial auditory processing, within this time window.
METHODS: Whole-head magnetoencephalography (MEG) and electroencephalography (EEG); MEG data were analyzed with minimum-norm current estimates.
RESULTS: Associative temporal, occipito-temporal, and parietal areas contributed to the generation of the slow evoked responses. Activity in the temporal source increased, while activity in the occipito-temporal source diminished, during the transition from the P3 to the PSW. The occipito-temporal generator of the P3 was activated more during the spatial than the nonspatial task, and the left temporal generator of the PSW tended to be more strongly activated during the nonspatial task.
CONCLUSIONS: These findings indicate that partially distinct functional networks generate the P3 and the PSW, and they provide evidence for segregation of spatial and nonspatial auditory information processing in associative areas beyond the supratemporal auditory cortex.
SIGNIFICANCE: The present results support the dual-stream model of auditory information processing.
Affiliation(s)
- Irina Anurova
- Neuroscience Unit, Institute of Biomedicine/Physiology, University of Helsinki, P.O. Box 63 (Haartmaninkatu 8), 00014 Helsinki, Finland
19
Novitski N, Anourova I, Martinkauppi S, Aronen HJ, Näätänen R, Carlson S. Effects of noise from functional magnetic resonance imaging on auditory event-related potentials in working memory task. Neuroimage 2003; 20:1320-8. [PMID: 14568500] [DOI: 10.1016/s1053-8119(03)00390-2]
Abstract
We investigated the effects of functional magnetic resonance imaging (fMRI) acoustic noise on the parameters of event-related potentials (ERPs) elicited during auditory matching-to-sample location and pitch working memory tasks. Stimuli were tones varying in location (left or right) and frequency (high or low). Subjects were instructed to memorize and compare either the locations or the frequencies of the stimuli. Tape-recorded fMRI acoustic noise was presented in half of the experimental blocks. The fMRI noise considerably enhanced the P1 component, reduced the amplitude and increased the latency of the N1, shortened the latency of the N2, and enhanced the amplitude of the P3 in both tasks. The N1 amplitude was higher in the location task than in the pitch task in both noise and no-noise blocks, whereas the task-related N1 latency difference was present in the no-noise blocks only. Although the task-related differences between spatial and nonspatial auditory responses were partially preserved in noise, the finding that the acoustic gradient noise accompanying functional MR imaging modulated the auditory ERPs implies that this noise may confound the results of auditory fMRI experiments, especially in studies of higher cognitive processing.
Affiliation(s)
- Nikolai Novitski
- Neuroscience Unit, Institute of Biomedicine/Physiology, University of Helsinki, Helsinki, Finland.