1
Axelrod V, Rozier C, Lehongre K, Adam C, Lambrecq V, Navarro V, Naccache L. Neural modulations in the auditory cortex during internal and external attention tasks: A single-patient intracranial recording study. Cortex 2022; 157:211-230. [PMID: 36335821 DOI: 10.1016/j.cortex.2022.09.011]
Abstract
Brain sensory processing is not passive but is modulated by our internal state. Different research methods, such as non-invasive imaging and intracranial recording of the local field potential (LFP), have been used to study the extent to which sensory processing, and the auditory cortex in particular, is modulated by selective attention. However, selective attention in humans has not been tested at the level of single units or multi-units. In addition, most previous research on selective attention has explored externally oriented attention, but attention can also be directed inward (i.e., internal attention), as in spontaneous self-generated thoughts and mind-wandering. In the present study we had a rare opportunity to record multi-unit activity (MUA) in the auditory cortex of a patient. To complement this, we also analyzed the LFP signal of the macro-contact in the auditory cortex. Our experiment consisted of two conditions with periodic beeping sounds. The participants were asked either to count the beeps (an "external attention" condition) or to recall the events of the previous day (an "internal attention" condition). We found that four of the seven recorded units in the auditory cortex showed increased firing rates in the "external attention" condition compared with the "internal attention" condition. The onset of this attentional modulation varied across multi-units between 30-50 msec and 130-150 msec from stimulus onset, a result that is compatible with an early-selection view. The LFP evoked potential and induced high-gamma activity both showed attentional modulation starting at about 70-80 msec. As a control, we recorded MUA in the amygdala and hippocampus of two additional patients performing the same experiment. No major attentional modulation was found in these control regions. Overall, we believe that our results provide new empirical information and support for existing theoretical views on selective attention and spontaneous self-generated cognition.
Affiliation(s)
- Vadim Axelrod: The Gonda Multidisciplinary Brain Research Center, Bar Ilan University, Ramat Gan, Israel
- Camille Rozier: Sorbonne Université, Institut du Cerveau - Paris Brain Institute, ICM, INSERM U1127, CNRS UMR 7225, Paris, France
- Katia Lehongre: Sorbonne Université, Institut du Cerveau - Paris Brain Institute, ICM, INSERM U1127, CNRS UMR 7225, Paris, France; Centre de NeuroImagerie de Recherche-CENIR, Paris Brain Institute, UMRS 1127, CNRS UMR 7225, Pitié-Salpêtrière Hospital, Paris, France
- Claude Adam: AP-HP, GH Pitié-Salpêtrière-Charles Foix, Epilepsy Unit, Neurology Department, Paris, France
- Virginie Lambrecq: Sorbonne Université, Institut du Cerveau - Paris Brain Institute, ICM, INSERM U1127, CNRS UMR 7225, Paris, France; AP-HP, Groupe hospitalier Pitié-Salpêtrière, Department of Neurophysiology, Paris, France; Sorbonne Université, UMR S1127, Paris, France
- Vincent Navarro: Sorbonne Université, Institut du Cerveau - Paris Brain Institute, ICM, INSERM U1127, CNRS UMR 7225, Paris, France; AP-HP, GH Pitié-Salpêtrière-Charles Foix, Epilepsy Unit, Neurology Department, Paris, France; Sorbonne Université, UMR S1127, Paris, France
- Lionel Naccache: Sorbonne Université, Institut du Cerveau - Paris Brain Institute, ICM, INSERM U1127, CNRS UMR 7225, Paris, France; AP-HP, Groupe hospitalier Pitié-Salpêtrière, Department of Neurophysiology, Paris, France
2
Kiremitçi I, Yilmaz Ö, Çelik E, Shahdloo M, Huth AG, Çukur T. Attentional Modulation of Hierarchical Speech Representations in a Multitalker Environment. Cereb Cortex 2021; 31:4986-5005. [PMID: 34115102 PMCID: PMC8491717 DOI: 10.1093/cercor/bhab136]
Abstract
Humans are remarkably adept at listening to a desired speaker in a crowded environment while filtering out nontarget speakers in the background. Attention is key to solving this difficult cocktail-party task, yet a detailed characterization of attentional effects on speech representations is lacking. It remains unclear across what levels of speech features, and by how much, attentional modulation occurs in each brain area during the cocktail-party task. To address these questions, we recorded whole-brain blood-oxygen-level-dependent (BOLD) responses while subjects either passively listened to single-speaker stories or, in separate experiments, selectively attended to a male or a female speaker in temporally overlaid stories. Spectral, articulatory, and semantic models of the natural stories were constructed. Intrinsic selectivity profiles were identified via voxelwise models fit to passive-listening responses. Attentional modulations were then quantified based on model predictions for attended and unattended stories in the cocktail-party task. We find that attention causes broad modulations at multiple levels of speech representations, growing stronger toward later stages of processing, and that unattended speech is represented up to the semantic level in parabelt auditory cortex. These results provide insights into the attentional mechanisms that underlie the ability to selectively listen to a desired speaker in noisy multispeaker environments.
Affiliation(s)
- Ibrahim Kiremitçi: Neuroscience Program, Sabuncu Brain Research Center, Bilkent University, Ankara TR-06800, Turkey; National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara TR-06800, Turkey
- Özgür Yilmaz: National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara TR-06800, Turkey; Department of Electrical and Electronics Engineering, Bilkent University, Ankara TR-06800, Turkey
- Emin Çelik: Neuroscience Program, Sabuncu Brain Research Center, Bilkent University, Ankara TR-06800, Turkey; National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara TR-06800, Turkey
- Mo Shahdloo: National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara TR-06800, Turkey; Department of Experimental Psychology, Wellcome Centre for Integrative Neuroimaging, University of Oxford, Oxford OX3 9DU, UK
- Alexander G Huth: Department of Neuroscience, The University of Texas at Austin, Austin, TX 78712, USA; Department of Computer Science, The University of Texas at Austin, Austin, TX 78712, USA; Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94702, USA
- Tolga Çukur: Neuroscience Program, Sabuncu Brain Research Center, Bilkent University, Ankara TR-06800, Turkey; National Magnetic Resonance Research Center (UMRAM), Bilkent University, Ankara TR-06800, Turkey; Department of Electrical and Electronics Engineering, Bilkent University, Ankara TR-06800, Turkey; Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94702, USA
3
Wikman P, Rinne T, Petkov CI. Reward cues readily direct monkeys' auditory performance resulting in broad auditory cortex modulation and interaction with sites along cholinergic and dopaminergic pathways. Sci Rep 2019; 9:3055. [PMID: 30816142 PMCID: PMC6395775 DOI: 10.1038/s41598-019-38833-y]
Abstract
In natural settings, the prospect of reward often influences the focus of our attention, but how cognitive and motivational systems influence sensory cortex is not well understood. In addition, challenges in training nonhuman animals on cognitive tasks complicate cross-species comparisons and the interpretation of results on the neurobiological bases of cognition. Incentivized attention tasks could expedite training and help evaluate the impact of attention on sensory cortex. Here we develop an Incentivized Attention Paradigm (IAP) and use it to show that macaque monkeys readily learn to use auditory or visual reward cues, which drastically influence their performance in a simple auditory task. Next, this paradigm was used with functional neuroimaging to measure activation modulation in the monkey auditory cortex. The results show modulation of extensive auditory cortical regions throughout primary and non-primary fields, which, although a hallmark of attentional modulation in the human auditory cortex, has not been studied or observed as broadly in prior data from nonhuman animals. Psycho-physiological interactions were identified between the observed auditory cortex effects and regions including basal forebrain sites along cholinergic and dopaminergic pathways. The findings reveal the impact of an incentivized, attention-engaging auditory task on the primate brain and the regional interactions it recruits.
Affiliation(s)
- Patrik Wikman: Department of Psychology and Logopedics, University of Helsinki, 00014, Helsinki, Finland
- Teemu Rinne: Turku Brain and Mind Center, Department of Clinical Medicine, University of Turku, 20014, Turku, Finland
- Christopher I Petkov: Institute of Neuroscience, Newcastle University, NE1 7RU, Newcastle upon Tyne, United Kingdom; Centre for Behaviour and Evolution, Newcastle University, NE1 7RU, Newcastle upon Tyne, United Kingdom
4
Interaction of the effects associated with auditory-motor integration and attention-engaging listening tasks. Neuropsychologia 2019; 124:322-336. [PMID: 30444980 DOI: 10.1016/j.neuropsychologia.2018.11.006]
Abstract
A number of previous studies have implicated regions in posterior auditory cortex (AC) in auditory-motor integration during speech production. Other studies, in turn, have shown that activation in AC and adjacent regions in the inferior parietal lobule (IPL) is strongly modulated during active listening and depends on task requirements. The present fMRI study investigated whether auditory-motor effects interact with those related to active listening tasks in AC and IPL. In separate task blocks, our subjects performed either auditory discrimination or 2-back memory tasks on phonemic or nonphonemic vowels. They responded to targets by either overtly repeating the last vowel of a target pair, overtly producing a given response vowel, or by pressing a response button. We hypothesized that the requirements for auditory-motor integration, and the associated activation, would be stronger during repetition than production responses and during repetition of nonphonemic than phonemic vowels. We also hypothesized that if auditory-motor effects are independent of task-dependent modulations, then the auditory-motor effects should not differ during discrimination and 2-back tasks. We found that activation in AC and IPL was significantly modulated by task (discrimination vs. 2-back), vocal-response type (repetition vs. production), and motor-response type (vocal vs. button). Motor-response and task effects interacted in IPL but not in AC. Overall, the results support the view that regions in posterior AC are important in auditory-motor integration. However, the present study shows that activation in wide AC and IPL regions is modulated by the motor requirements of active listening tasks in a more general manner. Further, the results suggest that activation modulations in AC associated with attention-engaging listening tasks and those associated with auditory-motor performance are mediated by independent mechanisms.
5
Rinne T, Muers RS, Salo E, Slater H, Petkov CI. Functional Imaging of Audio-Visual Selective Attention in Monkeys and Humans: How do Lapses in Monkey Performance Affect Cross-Species Correspondences? Cereb Cortex 2018; 27:3471-3484. [PMID: 28419201 PMCID: PMC5654311 DOI: 10.1093/cercor/bhx092]
Abstract
The cross-species correspondences and differences in how attention modulates brain responses in humans and animal models are poorly understood. We trained two monkeys to perform an audio–visual selective attention task during functional magnetic resonance imaging (fMRI), rewarding them for attending to stimuli in one modality while ignoring those in the other. Monkey fMRI identified regions strongly modulated by auditory or visual attention. Surprisingly, auditory attention-related modulations were much more restricted in monkeys than in humans performing the same tasks during fMRI. Further analyses ruled out trivial explanations, suggesting that labile selective-attention performance was associated with inhomogeneous modulations across wide cortical regions in the monkeys. The findings provide initial insights into how audio–visual selective attention modulates the primate brain, identify sources for “lost” attention effects in monkeys, and carry implications for modeling the neurobiology of human cognition with nonhuman animals.
Affiliation(s)
- Teemu Rinne: Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland; Advanced Magnetic Imaging Centre, Aalto University School of Science, Espoo, Finland
- Ross S Muers: Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, UK; Centre for Behaviour and Evolution, Newcastle University, Newcastle upon Tyne, UK
- Emma Salo: Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
- Heather Slater: Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, UK; Centre for Behaviour and Evolution, Newcastle University, Newcastle upon Tyne, UK
- Christopher I Petkov: Institute of Neuroscience, Newcastle University, Newcastle upon Tyne, UK; Centre for Behaviour and Evolution, Newcastle University, Newcastle upon Tyne, UK
6
Wiens S, Szychowska M, Nilsson ME. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis. PLoS One 2016; 11:e0146567. [PMID: 26741815 PMCID: PMC4704804 DOI: 10.1371/journal.pone.0146567]
Abstract
Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant auditory distractors are processed even when subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity for simultaneous tones, we used a similar task (n = 28) to determine whether high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distractors. However, because the meta-analysis was based on small studies and because of the risk of publication bias, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the present meta-analytic results. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and load manipulations that are defined independently of their effects on the MMN.
Affiliation(s)
- Stefan Wiens: Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden
- Malina Szychowska: Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden; Institute of Acoustics, Department of Physics, Adam Mickiewicz University, Poznan, Poland
- Mats E. Nilsson: Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden
7
Wikman PA, Vainio L, Rinne T. The effect of precision and power grips on activations in human auditory cortex. Front Neurosci 2015; 9:378. [PMID: 26528121 PMCID: PMC4606019 DOI: 10.3389/fnins.2015.00378]
Abstract
The neuroanatomical pathways interconnecting auditory and motor cortices play a key role in current models of human auditory cortex (AC). Auditory-motor interaction is evidently important in speech and music production, but the significance of these cortical pathways in other auditory processing is not well known. We investigated the general effects of motor responding on AC activations to sounds during auditory and visual tasks (motor regions were not imaged). During all task blocks, subjects detected targets in the designated modality, reported the relative number of targets at the end of the block, and ignored the stimuli presented in the opposite modality. In each block, they were also instructed to respond to targets using a precision grip, a power grip, or no overt motor response. We found that motor responding strongly modulated AC activations. First, during both visual and auditory tasks, activations in widespread regions of AC decreased when subjects made precision- and power-grip responses to targets. Second, activations in AC were modulated by grip type during the auditory but not during the visual task. Further, these motor effects were distinct from the strong attention-related modulations in AC observed in the present study. These results are consistent with the idea that operations in AC are shaped by its connections with motor cortical regions.
Affiliation(s)
- Patrik A Wikman: Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Lari Vainio: Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Teemu Rinne: Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland; Advanced Magnetic Imaging Centre, Aalto University School of Science, Espoo, Finland
8
Monaural and binaural contributions to interaural-level-difference sensitivity in human auditory cortex. Neuroimage 2015; 120:456-66. [PMID: 26163805 PMCID: PMC4589528 DOI: 10.1016/j.neuroimage.2015.07.007]
Abstract
Whole-brain functional magnetic resonance imaging was used to measure blood-oxygenation-level-dependent (BOLD) responses in human auditory cortex (AC) to sounds with intensity varying independently in the left and right ears. Echoplanar images were acquired at 3 Tesla with sparse image acquisition once per 12-second block of sound stimulation. Combinations of binaural intensity and stimulus presentation rate were varied between blocks, and selected to allow measurement of response-intensity functions in three configurations: monaural 55–85 dB SPL, binaural 55–85 dB SPL with intensity equal in both ears, and binaural with average binaural level of 70 dB SPL and interaural level differences (ILD) ranging ±30 dB (i.e., favoring the left or right ear). Comparison of response functions equated for contralateral intensity revealed that BOLD-response magnitudes (1) generally increased with contralateral intensity, consistent with positive drive of the BOLD response by the contralateral ear, (2) were larger for contralateral monaural stimulation than for binaural stimulation, consistent with negative effects (e.g., inhibition) of ipsilateral input, which were strongest in the left hemisphere, and (3) also increased with ipsilateral intensity when contralateral input was weak, consistent with additional, positive, effects of ipsilateral stimulation. Hemispheric asymmetries in the spatial extent and overall magnitude of BOLD responses were generally consistent with previous studies demonstrating greater bilaterality of responses in the right hemisphere and stricter contralaterality in the left hemisphere. Finally, comparison of responses to fast (40/s) and slow (5/s) stimulus presentation rates revealed significant rate-dependent adaptation of the BOLD response that varied across ILD values.
9
Attention modulates cortical processing of pitch feedback errors in voice control. Sci Rep 2015; 5:7812. [PMID: 25589447 PMCID: PMC4295089 DOI: 10.1038/srep07812]
Abstract
Considerable evidence has shown that unexpected alterations in auditory feedback elicit fast compensatory adjustments in vocal production. Although these adjustments are generally thought to be involuntary, whether they can be influenced by attention remains unknown. The present event-related potential (ERP) study examined whether the neurobehavioral processing underlying auditory-vocal integration can be affected by attention. While sustaining a vowel phonation and hearing pitch-shifted feedback, participants were required either to ignore the pitch perturbations or to attend to them under low (counting the number of perturbations) or high (counting the type of perturbations) attentional load. Behavioral results revealed no systematic change in the vocal response to pitch perturbations, irrespective of whether they were attended or not. At the cortical level, the P2 response to attended pitch perturbations was enhanced in the low-load condition compared with when they were ignored. In the high-load condition, however, the P2 response did not differ from that in the ignored condition. These findings provide the first neurophysiological evidence that auditory-motor integration in voice control can be modulated by attention at the cortical level. Furthermore, this modulatory effect does not produce a general enhancement but depends on attentional load.
10
Takeda Y, Yamanaka K, Yamagishi N, Sato MA. Revealing time-unlocked brain activity from MEG measurements by common waveform estimation. PLoS One 2014; 9:e98014. [PMID: 24879410 PMCID: PMC4039443 DOI: 10.1371/journal.pone.0098014]
Abstract
Brain activities related to cognitive functions, such as attention, occur with unknown and variable delays after stimulus onset. Recently, we proposed a method, Common Waveform Estimation (CWE), that can extract such brain activities from magnetoencephalography (MEG) or electroencephalography (EEG) measurements. CWE estimates spatiotemporal MEG/EEG patterns occurring with unknown and variable delays, referred to here as unlocked waveforms, without hypotheses about their shapes. The purpose of this study is to demonstrate the usefulness of CWE for cognitive neuroscience. To this end, we show procedures for estimating unlocked waveforms using CWE and for examining their role. We applied CWE to MEG epochs from Go trials of a visual Go/NoGo task. This revealed unlocked waveforms with interesting properties, specifically large alpha oscillations around the temporal areas. To examine the role of the unlocked waveform, we estimated the strength of its underlying brain activity under various conditions. We constructed a spatial filter to extract the component reflecting the brain activity of the unlocked waveform, applied this filter to MEG data from different conditions (passive viewing, a simple reaction-time task, and Go/NoGo tasks), and calculated the power of the extracted components. Comparing the power across these conditions suggests that the unlocked waveforms may reflect inhibition of task-irrelevant activities in the temporal regions while the subject attends to the visual stimulus. Our results demonstrate that CWE is a potential tool for revealing new findings about cognitive brain function without requiring an a priori hypothesis.
Affiliation(s)
- Yusuke Takeda: Department of Computational Brain Imaging, ATR Neural Information Analysis Laboratories, Kyoto, Japan
- Kentaro Yamanaka: Graduate School of Human Life Sciences, Showa Women’s University, Tokyo, Japan
- Noriko Yamagishi: Department of Cognitive Neuroscience, ATR Cognitive Mechanisms Laboratories, Kyoto, Japan; Brain Networks and Communication Laboratory, Center for Information and Neural Networks, National Institute of Information and Communications Technology, Osaka, Japan; Japan Science and Technology Agency, PRESTO, Saitama, Japan
- Masa-aki Sato: Department of Computational Brain Imaging, ATR Neural Information Analysis Laboratories, Kyoto, Japan
11
Mittag M, Inauri K, Huovilainen T, Leminen M, Salo E, Rinne T, Kujala T, Alho K. Attention effects on the processing of task-relevant and task-irrelevant speech sounds and letters. Front Neurosci 2013; 7:231. [PMID: 24348324 PMCID: PMC3847663 DOI: 10.3389/fnins.2013.00231]
Abstract
We used event-related brain potentials (ERPs) to study the effects of selective attention on the processing of attended and unattended spoken syllables and letters. Participants were presented with syllables occurring randomly in the left or right ear and spoken by different voices, together with a concurrent foveal stream of consonant letters written in darker or lighter fonts. During auditory phonological (AP) and non-phonological tasks, they responded to syllables in a designated ear starting with a vowel or spoken by female voices, respectively. These syllables occurred infrequently among standard syllables starting with a consonant and spoken by male voices. During visual phonological and non-phonological tasks, they responded to consonant letters whose names started with a vowel and to letters written in dark fonts, respectively. These letters occurred infrequently among standard letters with names starting with a consonant and written in light fonts. To examine genuine effects of attention and task on ERPs, not overlapped by ERPs associated with target processing or deviance detection, these effects were studied only in ERPs to auditory and visual standards. During selective listening to syllables in a designated ear, ERPs to the attended syllables were negatively displaced during both the phonological and non-phonological auditory tasks. Selective attention to letters elicited an early negative displacement and a subsequent positive displacement (Pd) of ERPs to attended letters; the Pd was larger during the visual phonological than the non-phonological task, suggesting a higher demand for attention during the visual phonological task. Active suppression of unattended speech during the AP and non-phonological auditory tasks, and during the visual phonological task, was suggested by a rejection positivity (RP) to unattended syllables. We also found evidence for suppression of the processing of task-irrelevant visual stimuli in visual ERPs during auditory tasks involving left-ear syllables.
Affiliation(s)
- Maria Mittag: Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Karina Inauri: Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Tatu Huovilainen: Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Miika Leminen: Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland; Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä, Jyväskylä, Finland
- Emma Salo: Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Teemu Rinne: Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Teija Kujala: Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland; Cicero Learning Network, University of Helsinki, Helsinki, Finland
- Kimmo Alho: Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland; Helsinki Collegium for Advanced Studies, University of Helsinki, Helsinki, Finland
12
Jiwani S, Papsin BC, Gordon KA. Central auditory development after long-term cochlear implant use. Clin Neurophysiol 2013; 124:1868-80. [DOI: 10.1016/j.clinph.2013.03.023]
13
Alho K, Rinne T, Herron TJ, Woods DL. Stimulus-dependent activations and attention-related modulations in the auditory cortex: a meta-analysis of fMRI studies. Hear Res 2013; 307:29-41. [PMID: 23938208 DOI: 10.1016/j.heares.2013.08.001]
Abstract
We meta-analyzed 115 functional magnetic resonance imaging (fMRI) studies reporting auditory-cortex (AC) coordinates for activations related to active and passive processing of pitch and spatial location of non-speech sounds, as well as to active and passive speech and voice processing. We aimed at revealing any systematic differences between AC surface locations of these activations by statistically analyzing the activation loci using the open-source Matlab toolbox VAMCA (Visualization and Meta-analysis on Cortical Anatomy). AC activations associated with pitch processing (e.g., active or passive listening to tones with a varying vs. fixed pitch) had median loci in the middle superior temporal gyrus (STG), lateral to Heschl's gyrus. However, median loci of activations due to the processing of infrequent pitch changes in a tone stream were centered in the STG or planum temporale (PT), significantly posterior to the median loci for other types of pitch processing. Median loci of attention-related modulations due to focused attention to pitch (e.g., attending selectively to low or high tones delivered in concurrent sequences) were, in turn, centered in the STG or superior temporal sulcus (STS), posterior to median loci for passive pitch processing. Activations due to spatial processing were centered in the posterior STG or PT, significantly posterior to pitch processing loci (processing of infrequent pitch changes excluded). In the right-hemisphere AC, the median locus of spatial attention-related modulations was in the STS, significantly inferior to the median locus for passive spatial processing. Activations associated with speech processing and those associated with voice processing had indistinguishable median loci at the border of mid-STG and mid-STS. Median loci of attention-related modulations due to attention to speech were in the same mid-STG/STS region.
Thus, while attention to the pitch or location of non-speech sounds seems to recruit AC areas less involved in passive pitch or location processing, focused attention to speech predominantly enhances activations in regions that already respond to human vocalizations during passive listening. This suggests that distinct attention mechanisms might be engaged by attention to speech and attention to more elemental auditory features such as tone pitch or location. This article is part of a Special Issue entitled Human Auditory Neuroimaging.
Affiliation(s)
- Kimmo Alho
- Helsinki Collegium for Advanced Studies, University of Helsinki, PO Box 4, FI 00014 Helsinki, Finland; Institute of Behavioural Sciences, University of Helsinki, PO Box 9, FI 00014 Helsinki, Finland.
14
Lee AKC, Larson E, Maddox RK, Shinn-Cunningham BG. Using neuroimaging to understand the cortical mechanisms of auditory selective attention. Hear Res 2013; 307:111-20. [PMID: 23850664] [DOI: 10.1016/j.heares.2013.06.010]
Abstract
Over the last four decades, a range of different neuroimaging tools have been used to study human auditory attention, spanning from classic event-related potential studies using electroencephalography to modern multimodal imaging approaches (e.g., combining anatomical information based on magnetic resonance imaging with magneto- and electroencephalography). This review begins by exploring the different strengths and limitations inherent to different neuroimaging methods, and then outlines some common behavioral paradigms that have been adopted to study auditory attention. We argue that in order to design a neuroimaging experiment that produces interpretable, unambiguous results, the experimenter must not only have a deep appreciation of the imaging technique employed, but also a sophisticated understanding of perception and behavior. Only with the proper caveats in mind can one begin to infer how the cortex supports a human in solving the "cocktail party" problem. This article is part of a Special Issue entitled Human Auditory Neuroimaging.
Affiliation(s)
- Adrian K C Lee
- Institute for Learning and Brain Sciences, University of Washington, WA 98195, USA; Department of Speech & Hearing Sciences, University of Washington, Seattle, WA 98195, USA.
15
Sabri M, Humphries C, Verber M, Mangalathu J, Desai A, Binder JR, Liebenthal E. Perceptual demand modulates activation of human auditory cortex in response to task-irrelevant sounds. J Cogn Neurosci 2013; 25:1553-62. [PMID: 23647558] [DOI: 10.1162/jocn_a_00416]
Abstract
In the visual modality, perceptual demand on a goal-directed task has been shown to modulate the extent to which irrelevant information can be disregarded at a sensory-perceptual stage of processing. In the auditory modality, the effect of perceptual demand on neural representations of task-irrelevant sounds is unclear. We compared simultaneous ERPs and fMRI responses associated with task-irrelevant sounds across parametrically modulated perceptual task demands in a dichotic-listening paradigm. Participants performed a signal detection task in one ear (Attend ear) while ignoring task-irrelevant syllable sounds in the other ear (Ignore ear). Results revealed modulation of syllable processing by auditory perceptual demand in an ROI in middle left superior temporal gyrus and in negative ERP activity 130-230 msec post stimulus onset. Increasing the perceptual demand in the Attend ear was associated with a reduced neural response in both fMRI and ERP to task-irrelevant sounds. These findings are in support of a selection model whereby ongoing perceptual demands modulate task-irrelevant sound processing in auditory cortex.
Affiliation(s)
- Merav Sabri
- Department of Neurology, Medical College of Wisconsin, 8701 Watertown Plank Road, Milwaukee, WI 53226, USA.
16
Yang Z, Mayer AR. An event-related FMRI study of exogenous orienting across vision and audition. Hum Brain Mapp 2013; 35:964-74. [PMID: 23288620] [DOI: 10.1002/hbm.22227]
Abstract
The orienting of attention to the spatial location of sensory stimuli in one modality based on sensory stimuli presented in another modality (i.e., cross-modal orienting) is a common mechanism for controlling attentional shifts. The neuronal mechanisms of top-down cross-modal orienting have been studied extensively. However, the neuronal substrates of bottom-up audio-visual cross-modal spatial orienting remain to be elucidated. Therefore, behavioral and event-related functional magnetic resonance imaging (FMRI) data were collected while healthy volunteers (N = 26) performed a spatial cross-modal localization task modeled after the Posner cuing paradigm. Behavioral results indicated that although both visual and auditory cues were effective in producing bottom-up shifts of cross-modal spatial attention, reorienting effects were greater for the visual cues condition. Statistically significant evidence of inhibition of return was not observed for either condition. Functional results also indicated that visual cues with auditory targets resulted in greater activation within ventral and dorsal frontoparietal attention networks, visual and auditory "where" streams, primary auditory cortex, and thalamus during reorienting across both short and long stimulus onset asynchronies. In contrast, no areas of unique activation were associated with reorienting following auditory cues with visual targets. In summary, current results question whether audio-visual cross-modal orienting is supramodal in nature, suggesting rather that the initial modality of cue presentation heavily influences both behavioral and functional results. In the context of localization tasks, reorienting effects accompanied by the activation of the frontoparietal reorienting network are more robust for visual cues with auditory targets than for auditory cues with visual targets.
Affiliation(s)
- Zhen Yang
- The Mind Research Network/Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico 87106
17
Harinen K, Aaltonen O, Salo E, Salonen O, Rinne T. Task-dependent activations of human auditory cortex to prototypical and nonprototypical vowels. Hum Brain Mapp 2012; 34:1272-81. [PMID: 22287197] [DOI: 10.1002/hbm.21506]
Abstract
Research in auditory neuroscience has largely neglected the possible effects of different listening tasks on activations of auditory cortex (AC). In the present study, we used high-resolution fMRI to compare human AC activations with sounds presented during three auditory and one visual task. In all tasks, subjects were presented with pairs of Finnish vowels, noise bursts with pitch and Gabor patches. In the vowel pairs, one vowel was always either a prototypical /i/ or /ae/ (separately defined for each subject) or a nonprototype. In different task blocks, subjects were either required to discriminate (same/different) vowel pairs, to rate vowel "goodness" (first/second sound was a better exemplar of the vowel class), to discriminate pitch changes in the noise bursts, or to discriminate Gabor orientation changes. We obtained distinctly different AC activation patterns to identical sounds presented during the four task conditions. In particular, direct comparisons between the vowel tasks revealed stronger activations during vowel discrimination in the anterior and posterior superior temporal gyrus (STG), while the vowel rating task was associated with increased activations in the inferior parietal lobule (IPL). We also found that AC areas in or near Heschl's gyrus (HG) were sensitive to the speech-specific difference between a vowel prototype and nonprototype during active listening tasks. These results show that AC activations to speech sounds are strongly dependent on the listening tasks.
Affiliation(s)
- Kirsi Harinen
- Institute of Behavioural Sciences, University of Helsinki, Finland.
18
Alho K, Salonen J, Rinne T, Medvedev SV, Hugdahl K, Hämäläinen H. Attention-related modulation of auditory-cortex responses to speech sounds during dichotic listening. Brain Res 2012; 1442:47-54. [PMID: 22300726] [DOI: 10.1016/j.brainres.2012.01.007]
Abstract
Event-related magnetic fields (ERFs) were measured with magnetoencephalography (MEG) in fifteen healthy right-handed participants listening to sequences of consonant-vowel syllable pairs delivered dichotically (one syllable presented to the left ear and another syllable simultaneously to the right ear). The participants were instructed to press a response button to occurrences of a particular target syllable. In a condition with no other instruction (the non-forced condition, NF), they showed the well-known right-ear advantage (REA), that is, the participants responded more often to target syllables delivered to the right ear than to targets delivered to the left ear. The same was true in the forced-right (FR) condition, where the participants were instructed to attend selectively to the right-ear syllables and respond only to targets among them. In the forced-left (FL) condition, where they were instructed to respond only to left-ear targets, they responded more often to targets in this ear than to targets in the right ear. At 300-500 ms from syllable pair onset, a sustained field (SF) in ERFs to the syllable pairs was stronger in the left auditory cortex than in the right auditory cortex in the NF and FR conditions, while the opposite was true in the FL condition. Thus selective attention during dichotic listening leads to stronger processing of speech sounds in the auditory cortex contralateral to the attended direction. Our results also suggest that the REA observed for dichotic speech may involve a bias of attention to the right side even when there is no instruction to do so. This supports Kinsbourne's (1970) model of attention bias as a general principle of laterality.
Affiliation(s)
- Kimmo Alho
- Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland.
19
Rinne T, Koistinen S, Talja S, Wikman P, Salonen O. Task-dependent activations of human auditory cortex during spatial discrimination and spatial memory tasks. Neuroimage 2011; 59:4126-31. [PMID: 22062190] [DOI: 10.1016/j.neuroimage.2011.10.069]
Abstract
In the present study, we applied high-resolution functional magnetic resonance imaging (fMRI) of the human auditory cortex (AC) and adjacent areas to compare activations during spatial discrimination and spatial n-back memory tasks that were varied parametrically in difficulty. We found that activations in the anterior superior temporal gyrus (STG) were stronger during spatial discrimination than during spatial memory, while spatial memory was associated with stronger activations in the inferior parietal lobule (IPL). We also found that wide AC areas were strongly deactivated during the spatial memory tasks. The present AC activation patterns associated with spatial discrimination and spatial memory tasks were highly similar to those obtained in our previous study comparing AC activations during pitch discrimination and pitch memory (Rinne et al., 2009). Together our previous and present results indicate that discrimination and memory tasks activate anterior and posterior AC areas differently and that this anterior-posterior division is present both when these tasks are performed on spatially invariant (pitch discrimination vs. memory) or spatially varying (spatial discrimination vs. memory) sounds. These results also further strengthen the view that activations of human AC cannot be explained only by stimulus-level parameters (e.g., spatial vs. nonspatial stimuli) but that the activations observed with fMRI are strongly dependent on the characteristics of the behavioral task. Thus, our results suggest that in order to understand the functional structure of AC a more systematic investigation of task-related factors affecting AC activations is needed.
Affiliation(s)
- Teemu Rinne
- Institute of Behavioural Sciences, University of Helsinki, Finland.