1
Li AY, Ladyka-Wojcik N, Qazilbash H, Golestani A, Walther DB, Martin CB, Barense MD. Experience transforms crossmodal object representations in the anterior temporal lobes. eLife 2024; 13:e83382. [PMID: 38647143; PMCID: PMC11081630; DOI: 10.7554/eLife.83382]
Abstract
Combining information from multiple senses is essential to object recognition, core to the ability to learn concepts, make new inferences, and generalize across distinct entities. Yet how the mind combines sensory input into coherent crossmodal representations - the crossmodal binding problem - remains poorly understood. Here, we applied multi-echo fMRI across a 4-day paradigm, in which participants learned three-dimensional crossmodal representations created from well-characterized unimodal visual shape and sound features. Our novel paradigm decoupled the learned crossmodal object representations from their baseline unimodal shapes and sounds, thus allowing us to track the emergence of crossmodal object representations as they were learned by healthy adults. Critically, we found that two anterior temporal lobe structures - temporal pole and perirhinal cortex - differentiated learned from non-learned crossmodal objects, even when controlling for the unimodal features that composed those objects. These results provide evidence for integrated crossmodal object representations in the anterior temporal lobes that were different from the representations for the unimodal features. Furthermore, we found that perirhinal cortex representations were by default biased toward visual shape, but this initial visual bias was attenuated by crossmodal learning. Thus, crossmodal learning transformed perirhinal representations such that they were no longer predominantly grounded in the visual modality, which may be a mechanism by which object concepts gain their abstraction.
Affiliation(s)
- Aedan Yue Li
- Department of Psychology, University of Toronto, Toronto, Canada
- Heba Qazilbash
- Department of Psychology, University of Toronto, Toronto, Canada
- Ali Golestani
- Department of Physics and Astronomy, University of Calgary, Calgary, Canada
- Dirk B Walther
- Department of Psychology, University of Toronto, Toronto, Canada
- Rotman Research Institute, Baycrest Health Sciences, North York, Canada
- Chris B Martin
- Department of Psychology, Florida State University, Tallahassee, United States
- Morgan D Barense
- Department of Psychology, University of Toronto, Toronto, Canada
- Rotman Research Institute, Baycrest Health Sciences, North York, Canada
2
Jaap C, Maack MC, Taesler P, Steinicke F, Rose M. Enriched environments enhance the development of explicit memory in an incidental learning task. Sci Rep 2022; 12:18717. [PMID: 36333393; PMCID: PMC9636381; DOI: 10.1038/s41598-022-23226-5]
Abstract
Learning, whether implicit (unconscious) or explicit (conscious), is a crucial part of our daily life. Different factors, such as attention or motivation, influence the transformation from implicit to explicit memory. Virtual reality (VR) can create a lively and engaging environment, and motivational processes are assumed to be a vital part of the transition from implicit to explicit memory. In the present study, we tested the impact of an enriched virtual reality, compared to two conventional, non-enriched 2D computer-screen-based tasks, on implicit-to-explicit memory transformation, using an audio-visual sequential association task. We hypothesized that the immersive nature of the VR environment would enhance the transfer from implicit to explicit memory. Notably, the overall number of learned sequence pairs did not differ significantly between experimental groups, but the degree of awareness was affected by the different settings: we observed an increased level of explicitly remembered pairs in the VR group compared to the two screen-based groups. This finding demonstrates that a near-natural experimental setting affects the transformation process from implicit to explicit memory.
Affiliation(s)
- Carina Jaap
- NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Martinistrasse 52, 20246 Hamburg, Germany
- Marike C. Maack
- NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Martinistrasse 52, 20246 Hamburg, Germany
- Philipp Taesler
- NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Martinistrasse 52, 20246 Hamburg, Germany
- Frank Steinicke
- Human-Computer Interaction, Department of Informatics, University of Hamburg, Vogt-Kölln-Str. 30, 22527 Hamburg, Germany
- Michael Rose
- NeuroImage Nord, Department for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Martinistrasse 52, 20246 Hamburg, Germany
3
Billig AJ, Lad M, Sedley W, Griffiths TD. The hearing hippocampus. Prog Neurobiol 2022; 218:102326. [PMID: 35870677; PMCID: PMC10510040; DOI: 10.1016/j.pneurobio.2022.102326]
Abstract
The hippocampus has a well-established role in spatial and episodic memory but a broader function has been proposed including aspects of perception and relational processing. Neural bases of sound analysis have been described in the pathway to auditory cortex, but wider networks supporting auditory cognition are still being established. We review what is known about the role of the hippocampus in processing auditory information, and how the hippocampus itself is shaped by sound. In examining imaging, recording, and lesion studies in species from rodents to humans, we uncover a hierarchy of hippocampal responses to sound including during passive exposure, active listening, and the learning of associations between sounds and other stimuli. We describe how the hippocampus' connectivity and computational architecture allow it to track and manipulate auditory information - whether in the form of speech, music, or environmental, emotional, or phantom sounds. Functional and structural correlates of auditory experience are also identified. The extent of auditory-hippocampal interactions is consistent with the view that the hippocampus makes broad contributions to perception and cognition, beyond spatial and episodic memory. More deeply understanding these interactions may unlock applications including entraining hippocampal rhythms to support cognition, and intervening in links between hearing loss and dementia.
Affiliation(s)
- Meher Lad
- Translational and Clinical Research Institute, Newcastle University Medical School, Newcastle upon Tyne, UK
- William Sedley
- Translational and Clinical Research Institute, Newcastle University Medical School, Newcastle upon Tyne, UK
- Timothy D Griffiths
- Biosciences Institute, Newcastle University Medical School, Newcastle upon Tyne, UK; Wellcome Centre for Human Neuroimaging, UCL Queen Square Institute of Neurology, University College London, London, UK; Human Brain Research Laboratory, Department of Neurosurgery, University of Iowa Hospitals and Clinics, Iowa City, USA
4
Li J, Deng SW. Facilitation and interference effects of the multisensory context on learning: a systematic review and meta-analysis. Psychol Res 2022; 87:1334-1352. [DOI: 10.1007/s00426-022-01733-4]
5
Mathias B, Sureth L, Hartwigsen G, Macedonia M, Mayer KM, von Kriegstein K. Visual Sensory Cortices Causally Contribute to Auditory Word Recognition Following Sensorimotor-Enriched Vocabulary Training. Cereb Cortex 2021; 31:513-528. [PMID: 32959878; PMCID: PMC7727387; DOI: 10.1093/cercor/bhaa240]
Abstract
Despite a rise in the use of "learning by doing" pedagogical methods in praxis, little is known as to how the brain benefits from these methods. Learning by doing strategies that utilize complementary information ("enrichment") such as gestures have been shown to optimize learning outcomes in several domains including foreign language (L2) training. Here we tested the hypothesis that behavioral benefits of gesture-based enrichment are critically supported by integrity of the biological motion visual cortices (bmSTS). Prior functional neuroimaging work has implicated the visual motion cortices in L2 translation following sensorimotor-enriched training; the current study is the first to investigate the causal relevance of these structures in learning by doing contexts. Using neuronavigated transcranial magnetic stimulation and a gesture-enriched L2 vocabulary learning paradigm, we found that the bmSTS causally contributed to behavioral benefits of gesture-enriched learning. Visual motion cortex integrity benefitted both short- and long-term learning outcomes, as well as the learning of concrete and abstract words. These results adjudicate between opposing predictions of two neuroscientific learning theories: While reactivation-based theories predict no functional role of specialized sensory cortices in vocabulary learning outcomes, the current study supports the predictive coding theory view that these cortices precipitate sensorimotor-based learning benefits.
Affiliation(s)
- Brian Mathias
- Chair of Cognitive and Clinical Neuroscience, Faculty of Psychology, Technical University Dresden, Dresden 01187, Germany
- Research Group Neural Mechanisms of Human Communication, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Leona Sureth
- Research Group Neural Mechanisms of Human Communication, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Gesa Hartwigsen
- Lise Meitner Research Group Cognition and Plasticity, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Manuela Macedonia
- Research Group Neural Mechanisms of Human Communication, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Institute for Information Engineering, Johannes Kepler University Linz, Linz, Austria
- Katja M Mayer
- Institute of Psychology, University of Münster, Münster, Germany
- Katharina von Kriegstein
- Chair of Cognitive and Clinical Neuroscience, Faculty of Psychology, Technical University Dresden, Dresden 01187, Germany
- Research Group Neural Mechanisms of Human Communication, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
6
Borders AA, Aly M, Parks CM, Yonelinas AP. The hippocampus is particularly important for building associations across stimulus domains. Neuropsychologia 2017; 99:335-342. [PMID: 28377162; PMCID: PMC5493148; DOI: 10.1016/j.neuropsychologia.2017.03.032]
Abstract
The medial temporal lobe (MTL) is critical for binding together different attributes that together form memory for prior episodes, but whether it is preferentially involved in supporting specific types of associations is a topic of much debate. Some have argued that the MTL, specifically the hippocampus, may be specialized for binding information from different stimulus domains (e.g., linking visual and auditory stimuli). In the current study, we examined the role of the MTL in memory for associations within- vs. across-domains. Patients with either selective hippocampal lesions or more extensive MTL lesions studied pairs of items within the same stimulus domain (i.e., image-image or sound-sound pairs) or across different domains (i.e., image-sound pairs). Associative memory was subsequently tested by having participants discriminate between previously studied and rearranged pairs. Compared to healthy controls, the patients were significantly more impaired in the across-domain condition than the within-domain conditions. Similar deficits were observed for patients with hippocampal lesions and those with more extensive MTL lesions, suggesting that the hippocampus itself is particularly important for binding associations across stimulus domains.
Affiliation(s)
- Alyssa A Borders
- Department of Psychology, University of California, Davis, CA 95616, USA
- Mariam Aly
- Department of Psychology, Columbia University, New York, NY 10027, USA
- Colleen M Parks
- Department of Psychology, University of Nevada, Las Vegas, NV 89154, USA
- Andrew P Yonelinas
- Department of Psychology, University of California, Davis, CA 95616, USA; Center for Mind and Brain, University of California, Davis, CA 95616, USA
7
Functional overlap of top-down emotion regulation and generation: An fMRI study identifying common neural substrates between cognitive reappraisal and cognitively generated emotions. Cogn Affect Behav Neurosci 2014; 14:923-938. [DOI: 10.3758/s13415-013-0240-0]
8
Pillai AS, Gilbert JR, Horwitz B. Early sensory cortex is activated in the absence of explicit input during crossmodal item retrieval: evidence from MEG. Behav Brain Res 2013; 238:265-272. [PMID: 23084971; PMCID: PMC3513489; DOI: 10.1016/j.bbr.2012.10.011]
Abstract
Crossmodal associations form a fundamental aspect of our daily lives. In this study we investigated the neural correlates of crossmodal association in early sensory cortices using magnetoencephalography (MEG). We used a paired associate recognition paradigm in which subjects were tested after multiple training sessions over a span of four weeks. Subjects had to learn 12 abstract, nonlinguistic, pairs of auditory and visual objects that consisted of crossmodal (visual-auditory, VA; auditory-visual, AV) and unimodal (visual-visual, VV; auditory-auditory, AA) paired items. Visual objects included abstract, non-nameable, fractal-like images, and auditory objects included abstract tone sequences. During scanning, subjects were shown the first item of a pair (S1), followed by a delay, then the simultaneous presentation of a visual and auditory stimulus (S2). Subjects were instructed to indicate whether either of the S2 stimuli contained the correct paired associate of S1. Synthetic aperture magnetometry (SAMspm), a minimum variance beamformer, was then used to assess source power differences between the crossmodal conditions and their corresponding unimodal conditions (i.e., AV-AA and VA-VV) in the beta (15-30 Hz) and low gamma frequencies (31-54 Hz) during the S1 period. We found greater power during S1 in the corresponding modality-specific association areas for crossmodal compared with unimodal stimuli. Thus, even in the absence of explicit sensory input, the retrieval of well-learned, crossmodal pairs activate sensory areas associated with the corresponding modality. These findings support theories which posit that modality-specific regions of cortex are involved in the storage and retrieval of sensory-specific items from long-term memory.
Affiliation(s)
- Barry Horwitz
- Brain Imaging and Modeling Section, National Institute on Deafness and Other Communication Disorders, National Institutes of Health, Bethesda, MD 20892, USA