1. Melchior J, Altamimi A, Bayati M, Cheng S, Wiskott L. A neural network model for online one-shot storage of pattern sequences. PLoS One 2024; 19:e0304076. PMID: 38900733. PMCID: PMC11189254. DOI: 10.1371/journal.pone.0304076.
Abstract
Based on the CRISP theory (Content Representation, Intrinsic Sequences, and Pattern completion), we present a computational model of the hippocampus that allows for online one-shot storage of pattern sequences without the need for a consolidation process. Rather than storing the input sequence in CA3, our model hetero-associates it with a sequence that is pre-trained in CA3. That is, plasticity on a short timescale occurs only in the incoming and outgoing connections of CA3, not in its recurrent connections. We use a single learning rule, named Hebbian descent, to train all plastic synapses in the network. A forgetting mechanism in the learning rule allows the network to continuously store new patterns while forgetting those stored earlier. We find that a single cue pattern can reliably trigger the retrieval of sequences, even when cues are noisy or incomplete. Furthermore, pattern separation in subregion DG is necessary when sequences contain correlated patterns. Besides artificially generated input sequences, the model works with sequences of handwritten digits and natural images. Notably, the model can improve itself without external input, in a process that can be referred to as 'replay' or 'offline learning', which strengthens the associations and consolidates the learned patterns.
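As a concrete illustration of an error-driven Hebbian update combined with a forgetting term, here is a minimal rate-based sketch in Python. It follows the general delta-rule-like form associated with Hebbian descent, but the exact update, the activation function, and all parameter values are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def hebbian_descent_step(W, x, target, lr=0.1, forget=0.01):
    """One hetero-associative update (illustrative sketch).

    W      : (n_out, n_in) weight matrix, modified in place
    x      : (n_in,) presynaptic (cue) pattern
    target : (n_out,) postsynaptic pattern to associate with x
    The error-driven Hebbian term has the general delta-rule-like form
    associated with Hebbian descent; the forgetting term is modeled here
    as a simple weight decay.
    """
    y = np.tanh(W @ x)                     # current retrieval from the cue
    W += lr * np.outer(target - y, x)      # error-driven Hebbian update
    W *= (1.0 - forget)                    # gradual forgetting of older patterns
    return W

# Hypothetical usage: one-shot association of a cue with the next pattern.
rng = np.random.default_rng(0)
W = np.zeros((20, 50))
cue = rng.standard_normal(50)
next_pattern = np.tanh(rng.standard_normal(20))
W = hebbian_descent_step(W, cue, next_pattern)
```

Repeated calls with new pattern pairs would overwrite the oldest associations first, which is the behaviour the abstract attributes to the forgetting mechanism.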
Affiliation(s)
- Jan Melchior
- Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
- Aya Altamimi
- Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
- Mehdi Bayati
- Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
- Sen Cheng
- Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
- Laurenz Wiskott
- Institute for Neural Computation, Faculty of Computer Science, Ruhr University Bochum, Bochum, Germany
2. Milstein AD, Tran S, Ng G, Soltesz I. Offline memory replay in recurrent neuronal networks emerges from constraints on online dynamics. J Physiol 2023; 601:3241-3264. PMID: 35907087. PMCID: PMC9885000. DOI: 10.1113/jp283216.
Abstract
During spatial exploration, neural circuits in the hippocampus store memories of sequences of sensory events encountered in the environment. When sensory information is absent during 'offline' resting periods, brief neuronal population bursts can 'replay' sequences of activity that resemble bouts of sensory experience. These sequences can occur in either forward or reverse order, and can even include spatial trajectories that have not been experienced, but are consistent with the topology of the environment. The neural circuit mechanisms underlying this variable and flexible sequence generation are unknown. Here we demonstrate in a recurrent spiking network model of hippocampal area CA3 that experimental constraints on network dynamics such as population sparsity, stimulus selectivity, rhythmicity and spike rate adaptation, as well as associative synaptic connectivity, enable additional emergent properties, including variable offline memory replay. In an online stimulus-driven state, we observed the emergence of neuronal sequences that swept from representations of past to future stimuli on the timescale of the theta rhythm. In an offline state driven only by noise, the network generated both forward and reverse neuronal sequences, and recapitulated the experimental observation that offline memory replay events tend to include salient locations like the site of a reward. These results demonstrate that biological constraints on the dynamics of recurrent neural circuits are sufficient to enable memories of sensory events stored in the strengths of synaptic connections to be flexibly read out during rest and sleep, which is thought to be important for memory consolidation and planning of future behaviour.
Key points:
- A recurrent spiking network model of hippocampal area CA3 was optimized to recapitulate experimentally observed network dynamics during simulated spatial exploration.
- During simulated offline rest, the network exhibited the emergent property of generating flexible forward, reverse and mixed-direction memory replay events.
- Network perturbations and analysis of model diversity and degeneracy identified associative synaptic connectivity and key features of network dynamics as important for offline sequence generation.
- Network simulations demonstrate that population over-representation of salient positions like the site of reward results in biased memory replay.
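The central claim, that associative connectivity plus constraints such as adaptation and inhibition let noise alone generate sequential reactivation, can be illustrated with a minimal rate-based sketch. This is not the paper's optimized spiking CA3 model; the network size, time constants, and gains below are illustrative assumptions that may need tuning.

```python
import numpy as np

def simulate_offline_replay(n=100, steps=2000, dt=1.0, seed=0):
    """Minimal rate-model sketch of the ingredients named in the abstract:
    distance-dependent (associative) recurrent weights, global inhibition,
    and slow adaptation, driven only by noise. With suitable parameters a
    local activity bump drifts along the 'track', producing sequence-like
    offline events; all values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    pos = np.arange(n)
    dist = np.abs(pos[:, None] - pos[None, :])
    d = np.minimum(dist, n - dist)                 # circular track distance
    W = np.exp(-(d ** 2) / (2 * 5.0 ** 2))         # associative, place-field-like weights
    W /= W.sum(axis=1, keepdims=True)              # normalize recurrent input
    r = np.zeros(n)                                # firing rates
    a = np.zeros(n)                                # spike-rate adaptation
    tau_r, tau_a = 10.0, 100.0                     # fast rates, slow adaptation
    history = np.zeros((steps, n))
    for t in range(steps):
        drive = 2.0 * W @ r - 0.5 * r.mean() - a   # recurrence - global inhibition - adaptation
        drive += 0.3 * rng.standard_normal(n)      # offline state: noise is the only input
        r += dt / tau_r * (-r + np.maximum(drive, 0.0))
        a += dt / tau_a * (-a + 0.5 * r)
        history[t] = r
    return history                                 # (time, units ordered by track position)

replay = simulate_offline_replay()                 # inspect e.g. with plt.imshow(replay.T)
```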
Affiliation(s)
- Aaron D. Milstein
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
- Department of Neuroscience and Cell Biology, Robert Wood Johnson Medical School and Center for Advanced Biotechnology and Medicine, Rutgers University, Piscataway, NJ
- Sarah Tran
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
- Grace Ng
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
- Ivan Soltesz
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
3. Fukai T. Computational models of idling brain activity for memory processing. Neurosci Res 2022; 189:75-82. PMID: 36592825. DOI: 10.1016/j.neures.2022.12.024.
Abstract
Understanding the neural mechanisms underlying the brain's cognitive functions is one of the central questions in modern biology, and it has significantly influenced the development of novel technologies in artificial intelligence. Spontaneous activity is a unique feature of the brain and is currently lacking in many artificially constructed intelligent machines. Spontaneous activity may represent the brain's idling states, which are internally driven by neuronal networks and may support offline processing during wakefulness, sleep, and rest. Evidence is accumulating that the brain's spontaneous activity is not mere noise but is part of the mechanisms that process information about previous experiences. A substantial body of literature, using various methods in various animal species, has shown how previous sensory and behavioral experiences influence subsequent patterns of brain activity. However, the patterns of neural activity and their computational roles appear to differ significantly from area to area and from function to function. In this article, I review the various forms of the brain's spontaneous activity, especially those observed during memory processing, and some attempts to model the generation mechanisms and computational roles of such activity.
Affiliation(s)
- Tomoki Fukai
- Okinawa Institute of Science and Technology, Tancha 1919-1, Onna-son, Okinawa 904-0495, Japan.
4. Sutton NM, Ascoli GA. Spiking neural networks and hippocampal function: a web-accessible survey of simulations, modeling methods, and underlying theories. Cogn Syst Res 2021; 70:80-92. PMID: 34504394. DOI: 10.1016/j.cogsys.2021.07.008.
Abstract
Computational modeling has contributed to hippocampal research in a wide variety of ways and through a large diversity of approaches, reflecting the many advanced cognitive roles of this brain region. The intensively studied neuron-type circuitry of the hippocampus is a particularly conducive substrate for spiking neural models. Here we present an online knowledge base of spiking neural network simulations of hippocampal functions. First, we review theories involving the hippocampal formation in subjects such as spatial representation, learning, and memory. Then we describe an original literature-mining process to organize published reports along several key dimensions, including: (i) subject area (e.g., navigation, pattern completion, epilepsy); (ii) level of modeling detail (Hodgkin-Huxley, integrate-and-fire, etc.); and (iii) theoretical framework (attractor dynamics, oscillatory interference, self-organizing maps, and others). Moreover, every peer-reviewed publication is annotated to indicate the specific neuron types represented in the network simulation, establishing a direct link with the Hippocampome.org portal. The web interface of the knowledge base enables dynamic content browsing and advanced searches, and consistently presents the evidence supporting every annotation. Users are also given access to several types of statistical reports about the collection, a selection of which is summarized in this paper. This open-access resource thus provides an interactive platform to survey spiking neural network models of hippocampal functions, compare available computational methods, and foster ideas for suitable new directions of research.
Affiliation(s)
- Nate M Sutton
- Department of Bioengineering, 4400 University Drive, George Mason University, Fairfax, Virginia, 22030 (USA)
- Giorgio A Ascoli
- Department of Bioengineering, 4400 University Drive, George Mason University, Fairfax, Virginia, 22030 (USA)
- Interdepartmental Neuroscience Program, 4400 University Drive, George Mason University, Fairfax, Virginia, 22030 (USA)
5. Reifenstein ET, Bin Khalid I, Kempter R. Synaptic learning rules for sequence learning. eLife 2021; 10:e67171. PMID: 33860763. PMCID: PMC8175084. DOI: 10.7554/elife.67171.
Abstract
Remembering the temporal order of a sequence of events is a task easily performed by humans in everyday life, but the underlying neuronal mechanisms are unclear. This problem is particularly intriguing as human behavior often proceeds on a timescale of seconds, which is in stark contrast to the much faster millisecond timescale of neuronal processing in our brains. One long-held hypothesis in sequence learning suggests that a particular temporal fine-structure of neuronal activity - termed 'phase precession' - enables the compression of slow behavioral sequences down to the fast timescale of the induction of synaptic plasticity. Using mathematical analysis and computer simulations, we find that - for short enough synaptic learning windows - phase precession can improve temporal-order learning tremendously and that the asymmetric part of the synaptic learning window is essential for temporal-order learning. To test these predictions, we suggest experiments that selectively alter phase precession or the learning window and evaluate memory of temporal order.
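The compression argument can be made concrete with a standard asymmetric exponential learning window (a common STDP form; the parameters below are illustrative, not the paper's). Without phase precession, the spike-time lag between two sequentially active cells equals the behavioral lag and falls far outside the window; with theta-scale compression it falls inside the asymmetric part, so the resulting weight change encodes temporal order.

```python
import numpy as np

def stdp_window(dt_ms, a_plus=1.0, a_minus=0.5, tau_plus=10.0, tau_minus=20.0):
    """Asymmetric exponential learning window: potentiation when the presynaptic
    spike precedes the postsynaptic spike (dt_ms > 0), depression otherwise.
    Parameter values are illustrative assumptions."""
    return np.where(dt_ms > 0,
                    a_plus * np.exp(-dt_ms / tau_plus),
                    -a_minus * np.exp(dt_ms / tau_minus))

# Two place cells activated ~200 ms apart during behaviour.
behavioral_lag_ms = 200.0
# Without phase precession the spike-time lag equals the behavioral lag and
# falls far outside the learning window:
dw_uncompressed = stdp_window(behavioral_lag_ms)        # ~exp(-20), essentially zero
# With phase precession the sequence is compressed within a theta cycle
# (an assumed ~10x compression factor), bringing the lag into the asymmetric
# part of the window, so the weight change encodes temporal order:
dw_compressed = stdp_window(behavioral_lag_ms / 10.0)   # ~exp(-2), clearly positive
print(dw_uncompressed, dw_compressed)
```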
Affiliation(s)
- Eric Torsten Reifenstein
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Ikhwan Bin Khalid
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Richard Kempter
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Einstein Center for Neurosciences Berlin, Berlin, Germany
6. Robertson EM, Genzel L. Memories replayed: reactivating past successes and new dilemmas. Philos Trans R Soc Lond B Biol Sci 2020; 375:20190226. PMID: 32248775. DOI: 10.1098/rstb.2019.0226.
Abstract
Our experiences continue to be processed 'offline' in the ensuing hours of both wakefulness and sleep. During these different brain states, the memory formed during our experience is replayed or reactivated. Here, we discuss the unique challenges of studying offline reactivation; the growth in the experimental and analytical techniques available, across animals from rodents to humans, for capturing these offline events; the important challenges this innovation has brought; our still modest understanding of how reactivation drives diverse synaptic changes across circuits; and how these changes differ from (if at all), and perhaps complement, those at memory formation. Together, these discussions highlight critical emerging issues vital for identifying how reactivation affects circuits and, in turn, behaviour, and they provide a broader context for the contributions in this special issue. This article is part of the Theo Murphy meeting issue 'Memory reactivation: replaying events past, present and future'.
Affiliation(s)
- Edwin M Robertson
- Institute of Neuroscience & Psychology, University of Glasgow, Glasgow, UK
- Lisa Genzel
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
7. Leibold C. A model for navigation in unknown environments based on a reservoir of hippocampal sequences. Neural Netw 2020; 124:328-342. DOI: 10.1016/j.neunet.2020.01.014.
8. Matheus Gauy M, Lengler J, Einarsson H, Meier F, Weissenberger F, Yanik MF, Steger A. A hippocampal model for behavioral time acquisition and fast bidirectional replay of spatio-temporal memory sequences. Front Neurosci 2018; 12:961. PMID: 30618583. PMCID: PMC6306028. DOI: 10.3389/fnins.2018.00961.
Abstract
The hippocampus is known to play a crucial role in the formation of long-term memory. Fast replays of previously experienced activity, during sleep or after rewarding experiences, are believed to be crucial for this, but how such replays are generated is still unclear. In this paper we propose a possible mechanism: we present a model that can store experienced trajectories on a behavioral timescale after a single run, and can subsequently replay such trajectories bidirectionally, omitting specifics of the previous behavior such as speed, while allowing repetitions of events, even with different subsequent events. Our solution builds on well-known concepts, one-shot learning and synfire chains, enhancing them with additional mechanisms based on global inhibition and disinhibition. For replays, our approach relies on dendritic spikes and cholinergic modulation, as supported by experimental data. We also hypothesize a functional role of disinhibition as a pacemaker during behavioral time.
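The chain-plus-replay idea can be sketched at the group level with a minimal synfire-style toy model. It uses binary group-level units only; the dendritic-spike, cholinergic-modulation, and disinhibition mechanisms of the paper are omitted, and every name and parameter below is an illustrative assumption.

```python
import numpy as np

def synfire_replay(n_groups=8, steps=10, direction="forward"):
    """Minimal synfire-chain sketch: a cue activates one end of the chain and
    activity then propagates group by group, forward or in reverse. This only
    illustrates the chain / bidirectional-replay idea and is not the paper's
    spiking implementation."""
    active = np.zeros((steps, n_groups), dtype=int)
    start = 0 if direction == "forward" else n_groups - 1
    step = 1 if direction == "forward" else -1
    active[0, start] = 1                    # cue one end of the chain
    for t in range(1, steps):
        for g in range(n_groups):
            prev = g - step                 # predecessor group in the replay direction
            if 0 <= prev < n_groups and active[t - 1, prev]:
                active[t, g] = 1            # feedforward propagation along the chain
    return active                           # rows: time steps, columns: groups

print(synfire_replay(direction="forward"))
print(synfire_replay(direction="reverse"))
```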
Affiliation(s)
- Marcelo Matheus Gauy
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Johannes Lengler
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Hafsteinn Einarsson
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Florian Meier
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Felix Weissenberger
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Mehmet Fatih Yanik
- Department of Information Technology and Electrical Engineering, Institute for Neuroinformatics, ETH Zurich, Zurich, Switzerland
- Angelika Steger
- Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland