1. Kaster M, Czappa F, Butz-Ostendorf M, Wolf F. Building a realistic, scalable memory model with independent engrams using a homeostatic mechanism. Front Neuroinform 2024; 18:1323203. PMID: 38706939; PMCID: PMC11066267; DOI: 10.3389/fninf.2024.1323203. Received October 17, 2023; accepted March 27, 2024.
Abstract
Memory formation is usually associated with Hebbian learning and synaptic plasticity, which change synaptic strengths but omit structural changes. A recent study suggests that structural plasticity can also lead to silent memory engrams, reproducing a conditioned learning paradigm with neuron ensembles. However, that study is limited by its synapse-formation scheme, which permits only one memory engram. Overcoming this, our model allows many engrams to form simultaneously while retaining high neurophysiological accuracy, e.g., as found in cortical columns. We achieve this by substituting the random synapse formation with the Model of Structural Plasticity. As a homeostatic model, neurons regulate their activity by growing and pruning synaptic elements based on their current activity. Basing synapse formation on the Euclidean distance between neurons, with a scalable algorithm, allows us to easily simulate 4 million neurons with 343 memory engrams. These engrams do not interfere with one another by default, yet we can change the simulation parameters to form long-reaching associations. Our model's analysis shows that homeostatic engram formation requires a certain spatiotemporal order of events. It predicts that synaptic pruning precedes and enables synaptic engram formation, and that pruning does not occur as a mere compensatory response to enduring synapse potentiation, as in Hebbian plasticity with synaptic scaling. Our model paves the way for simulations addressing further inquiries, ranging from memory chains and hierarchies to complex memory systems comprising areas with different learning mechanisms.
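The homeostatic mechanism described in this abstract lends itself to a compact sketch: neurons below their activity set point grow vacant synaptic elements, neurons above it retract them, and vacant axonal and dendritic elements are paired with a distance-dependent preference. The following is a minimal illustration of that idea only, not the authors' implementation; the set point `target_rate`, growth rate `nu`, Gaussian kernel width, and all sizes are assumed placeholder values.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
target_rate = 5.0        # homeostatic set point (Hz), assumed
nu = 0.1                 # element growth/retraction rate, assumed
positions = rng.uniform(0.0, 1.0, size=(n, 2))  # neuron positions in a 2D sheet
rates = np.linspace(0.0, 10.0, n)               # current firing rates (Hz)
free_axons = np.zeros(n)                        # vacant presynaptic elements
free_dendrites = np.zeros(n)                    # vacant postsynaptic elements

def homeostatic_step(rates):
    """Grow synaptic elements when activity is below the set point,
    retract vacant elements when it is above (pruning)."""
    delta = nu * (target_rate - rates)          # signed element change
    np.add(free_axons, delta, out=free_axons)
    np.add(free_dendrites, delta, out=free_dendrites)
    np.clip(free_axons, 0.0, None, out=free_axons)
    np.clip(free_dendrites, 0.0, None, out=free_dendrites)

def form_synapses():
    """Pair vacant axonal and dendritic elements, preferring nearby
    partners via a Euclidean-distance kernel."""
    pairs = []
    for i in np.flatnonzero(free_axons >= 1.0):
        candidates = np.flatnonzero(free_dendrites >= 1.0)
        candidates = candidates[candidates != i]   # no self-synapses here
        if candidates.size == 0:
            continue
        d = np.linalg.norm(positions[candidates] - positions[i], axis=1)
        p = np.exp(-(d / 0.2) ** 2)                # Gaussian kernel, width assumed
        j = rng.choice(candidates, p=p / p.sum())
        free_axons[i] -= 1.0
        free_dendrites[j] -= 1.0
        pairs.append((int(i), int(j)))
    return pairs

for _ in range(30):          # let low-activity neurons accumulate elements
    homeostatic_step(rates)
synapses = form_synapses()
```

In this toy version, only neurons firing below the set point accumulate enough vacant elements to form new synapses, which mirrors the homeostatic intuition that under-stimulated neurons seek additional input.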
Affiliation(s)
- Marvin Kaster: Laboratory for Parallel Programming, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany
- Fabian Czappa: Laboratory for Parallel Programming, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany
- Markus Butz-Ostendorf: Laboratory for Parallel Programming, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany; Data Science, Translational Medicine and Clinical Pharmacology, Boehringer Ingelheim Pharma GmbH & Co. KG, Biberach, Germany
- Felix Wolf: Laboratory for Parallel Programming, Department of Computer Science, Technical University of Darmstadt, Darmstadt, Germany
2. Hussaini S, Milford M, Fischer T. Spiking Neural Networks for Visual Place Recognition Via Weighted Neuronal Assignments. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3149030.
Affiliation(s)
- Somayeh Hussaini: QUT Centre for Robotics, Queensland University of Technology, Brisbane, QLD, Australia
- Michael Milford: QUT Centre for Robotics, Queensland University of Technology, Brisbane, QLD, Australia
- Tobias Fischer: QUT Centre for Robotics, Queensland University of Technology, Brisbane, QLD, Australia
3. Sankar R, Rougier NP, Leblois A. Computational benefits of structural plasticity, illustrated in songbirds. Neurosci Biobehav Rev 2021; 132:1183-1196. PMID: 34801257; DOI: 10.1016/j.neubiorev.2021.10.033. Received June 16, 2021; revised October 13, 2021; accepted October 25, 2021.
Abstract
The plasticity of nervous systems allows animals to quickly adapt to a changing environment. In particular, the structural plasticity of brain networks is often critical to the development of the central nervous system and the acquisition of complex behaviors. As an example, structural plasticity is central to the development of song-related brain circuits and may be critical for song acquisition in juvenile songbirds. Here, we review current evidence for structural plasticity and its significance from a computational point of view. We start by reviewing evidence for structural plasticity across species, categorizing it along spatial axes as well as along the developmental time course. We introduce the vocal learning circuitry of zebra finches as a useful example of structural plasticity and use this specific case to explore the possible contributions of structural plasticity to computational models. Finally, we discuss current modeling studies incorporating structural plasticity and the unexplored questions such models raise.
Affiliation(s)
- Remya Sankar: Inria Bordeaux Sud-Ouest, Talence, France; Institut des Maladies Neurodégénératives, Université de Bordeaux, Bordeaux, France; Institut des Maladies Neurodégénératives, CNRS, UMR 5293, France; LaBRI, Université de Bordeaux, INP, CNRS, UMR 5800, Talence, France
- Nicolas P Rougier: Inria Bordeaux Sud-Ouest, Talence, France; Institut des Maladies Neurodégénératives, Université de Bordeaux, Bordeaux, France; Institut des Maladies Neurodégénératives, CNRS, UMR 5293, France; LaBRI, Université de Bordeaux, INP, CNRS, UMR 5800, Talence, France
- Arthur Leblois: Institut des Maladies Neurodégénératives, Université de Bordeaux, Bordeaux, France; Institut des Maladies Neurodégénératives, CNRS, UMR 5293, France
4. Structural plasticity on an accelerated analog neuromorphic hardware system. Neural Netw 2020; 133:11-20. PMID: 33091719; DOI: 10.1016/j.neunet.2020.09.024. Received March 4, 2020; revised July 17, 2020; accepted September 28, 2020.
Abstract
In computational neuroscience, as well as in machine learning, neuromorphic devices promise an accelerated and scalable alternative to neural network simulations. Their neural connectivity and synaptic capacity depend on specific design choices but are always intrinsically limited. Here, we present a strategy to achieve structural plasticity that optimizes resource allocation under these constraints by constantly rewiring the pre- and postsynaptic partners while keeping the neuronal fan-in constant and the connectome sparse. In particular, we implemented this algorithm on the analog neuromorphic system BrainScaleS-2. It was executed on a custom embedded digital processor located on chip, accompanying the mixed-signal substrate of spiking neurons and synapse circuits. We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology with respect to the nature of its training data, as well as its overall computational efficiency.
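The constant-fan-in rewiring strategy this abstract describes can be sketched in software: each postsynaptic neuron holds a fixed number of presynaptic partner slots, and weak synapses are replaced by freshly drawn partners so the synapse count never changes. This is a toy illustration of the principle only, not the BrainScaleS-2 on-chip implementation; the pruning threshold, the re-initialized weight, and the network sizes are assumed placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

n_pre, n_post, fan_in = 40, 10, 4   # sizes assumed for illustration

# Each postsynaptic neuron holds exactly `fan_in` presynaptic partners,
# so the total synapse count (hardware resource use) stays fixed.
partners = np.stack([rng.choice(n_pre, size=fan_in, replace=False)
                     for _ in range(n_post)])
weights = rng.uniform(0.0, 1.0, size=(n_post, fan_in))

def rewire(threshold=0.1):
    """Replace weak synapses with new, randomly drawn presynaptic
    partners while keeping every neuron's fan-in constant."""
    for post in range(n_post):
        for slot in range(fan_in):
            if weights[post, slot] < threshold:
                # draw a partner not already connected to this neuron
                new = rng.integers(n_pre)
                while new in partners[post]:
                    new = rng.integers(n_pre)
                partners[post, slot] = new
                weights[post, slot] = 0.5   # re-initialized weight, assumed

rewire()
```

Because a pruned slot is immediately reused, the connectome stays sparse and the fan-in constant, which is exactly the resource constraint an intrinsically limited substrate imposes.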
5. A Self-Operating Time Crystal Model of the Human Brain: Can We Replace Entire Brain Hardware with a 3D Fractal Architecture of Clocks Alone? Information 2020. DOI: 10.3390/info11050238.
Abstract
The time crystal was conceived in the 1970s as an autonomous engine made only of clocks, to explain the life-like features of a virus. Later, the concept was extended to living cells such as neurons. The brain controls most biological clocks that continuously regenerate living cells, and most cognitive tasks and learning in the brain run on periodic, clock-like oscillations. Can we integrate all cognitive tasks in terms of running clocks of the hardware? Since the existing concept of a time crystal has only one clock with a singularity point, we generalize the basic idea so that many clocks can be bonded in a 3D architecture. Harvesting inside the phase singularity is the key. Since clocks reset continuously in the brain-body system, other clocks take over during a reset. So, we insert clock architectures inside singularities, resembling brain components bottom-up and top-down. Instead of one clock, the time crystal becomes a composite, a poly-time crystal. We used a century of research on brain rhythms to compile the first hardware-free, pure-clock reconstruction of the human brain. Similar to the global effort on the connectome, a spatial reconstruction of the brain, we advocate a global effort for a more intricate mapping of all brain clocks, to fill missing links in the brain's temporal map. Once made, reverse engineering the brain would remain a mere engineering challenge.
6
|
Yan Y, Kappel D, Neumarker F, Partzsch J, Vogginger B, Hoppner S, Furber S, Maass W, Legenstein R, Mayr C. Efficient Reward-Based Structural Plasticity on a SpiNNaker 2 Prototype. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2019; 13:579-591. [PMID: 30932847 DOI: 10.1109/tbcas.2019.2906401] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Advances in neuroscience uncover the mechanisms employed by the brain to efficiently solve complex learning tasks with very limited resources. However, the efficiency is often lost when one tries to port these findings to a silicon substrate, since brain-inspired algorithms often make extensive use of complex functions, such as random number generators, that are expensive to compute on standard general-purpose hardware. The prototype chip of the second-generation SpiNNaker system is designed to overcome this problem. Low-power advanced RISC machine (ARM) processors equipped with a random number generator and an exponential function accelerator enable the efficient execution of brain-inspired algorithms. We implement the recently introduced reward-based synaptic sampling model that employs structural plasticity to learn a function or task. The numerical simulation of the model requires updating the synapse variables in each time step, including an explorative random term. To the best of our knowledge, this is the most complex synapse model implemented so far on the SpiNNaker system. By making efficient use of the hardware accelerators and numerical optimizations, the computation time of one plasticity update is reduced by a factor of 2. This, combined with fitting the model into the local static random-access memory (SRAM), leads to a 62% energy reduction compared with the case without accelerators and with the use of external dynamic random-access memory (DRAM). The model implementation is integrated into the SpiNNaker software framework, allowing for scalability onto larger systems. The hardware-software system presented in this paper paves the way for power-efficient mobile and biomedical applications with biologically plausible brain-inspired algorithms.
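The per-synapse update described here has the flavor of a stochastic gradient step plus an explorative noise term, with a synapse counted as functional only while its parameter is positive. Below is a schematic version of that idea; the learning rate, noise scale, stand-in reward gradient, and exponential weight mapping are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

rng = np.random.default_rng(2)

n_syn = 1000
theta = rng.normal(0.0, 1.0, size=n_syn)   # per-synapse parameters
dt, lr, temperature = 1.0, 0.01, 0.1       # step size, learning rate, noise scale (assumed)

def sampling_step(reward_gradient):
    """One synaptic-sampling update: a deterministic drift toward
    higher reward plus an explorative Wiener-noise term. A synapse is
    functional only while theta > 0; crossing zero retracts it, and a
    later upward crossing regrows it (structural plasticity)."""
    global theta
    noise = rng.normal(0.0, 1.0, size=n_syn)
    theta = theta + lr * reward_gradient * dt \
                  + np.sqrt(2.0 * temperature * lr * dt) * noise

# stand-in gradient pulling parameters toward a prior mean of 0.5 (assumption)
for _ in range(200):
    sampling_step(reward_gradient=0.5 - theta)

active = theta > 0                                # functional synapses
weights = np.where(active, np.exp(theta), 0.0)    # weight mapping, assumed
```

The random term is exactly why the SpiNNaker 2 hardware accelerators matter: every synapse needs a fresh Gaussian sample and an exponential evaluation each update, which is costly on a plain general-purpose core.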
7. Steffen L, Reichard D, Weinland J, Kaiser J, Roennau A, Dillmann R. Neuromorphic Stereo Vision: A Survey of Bio-Inspired Sensors and Algorithms. Front Neurorobot 2019; 13:28. PMID: 31191287; PMCID: PMC6546825; DOI: 10.3389/fnbot.2019.00028. Received January 3, 2019; accepted May 7, 2019.
Abstract
Any visual sensor, whether artificial or biological, maps the 3D world onto a 2D representation. The missing dimension is depth, and most species use stereo vision to recover it. Stereo vision implies multiple perspectives and matching; hence it obtains depth from a pair of images. Algorithms for stereo vision are also widely used in robotics. Although biological systems seem to compute disparities effortlessly, artificial methods suffer from high energy demands and latency. The crucial part is the correspondence problem: finding the matching points of two images. The development of event-based cameras, inspired by the retina, enables the exploitation of an additional physical constraint: time. Due to their asynchronous mode of operation, considering the precise occurrence of spikes, Spiking Neural Networks take advantage of this constraint. In this work, we investigate sensors and algorithms for event-based stereo vision leading to more biologically plausible robots. Hereby, we focus mainly on binocular stereo vision.
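The temporal constraint mentioned in this abstract can be made concrete with a toy matcher: events from the left and right camera are candidate correspondences when they occur on the same row with the same polarity and (nearly) the same timestamp. This is a schematic of the principle only; real event-based stereo pipelines add epipolar geometry, cooperative networks, or SNN coincidence detectors, and the function name, event format, and 1 ms window here are illustrative assumptions.

```python
def match_events(left, right, max_dt=0.001):
    """left/right: lists of (t, x, y, polarity) events.
    Returns (xl, xr, y) triples; disparity is xl - xr."""
    matches = []
    for (tl, xl, yl, pl) in left:
        best, best_dt = None, max_dt
        for (tr, xr, yr, pr) in right:
            # same row, same polarity, closest timestamp within the window
            if yl == yr and pl == pr and abs(tl - tr) <= best_dt:
                best, best_dt = (xl, xr, yl), abs(tl - tr)
        if best is not None:
            matches.append(best)
    return matches

# synthetic edge on row 10 seen 0.1 ms apart, producing disparity 3 px;
# the row-22 events differ in polarity and time, so they stay unmatched
left_events  = [(0.0100, 15, 10, 1), (0.0200, 40, 22, -1)]
right_events = [(0.0101, 12, 10, 1), (0.0500, 33, 22, 1)]
print(match_events(left_events, right_events))   # → [(15, 12, 10)]
```

The point of the exercise: with conventional frames, every pixel on the row is a matching candidate; with events, the sub-millisecond timestamp alone eliminates almost all of them.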
Affiliation(s)
- Lea Steffen: FZI Research Center for Information Technology, Karlsruhe, Germany
- Daniel Reichard: FZI Research Center for Information Technology, Karlsruhe, Germany
- Jakob Weinland: FZI Research Center for Information Technology, Karlsruhe, Germany
- Jacques Kaiser: FZI Research Center for Information Technology, Karlsruhe, Germany
- Arne Roennau: FZI Research Center for Information Technology, Karlsruhe, Germany
- Rüdiger Dillmann: FZI Research Center for Information Technology, Karlsruhe, Germany; Humanoids and Intelligence Systems Lab, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany