1. Knoblauch A, Palm G. Iterative Retrieval and Block Coding in Autoassociative and Heteroassociative Memory. Neural Comput 2020; 32:205-260. DOI: 10.1162/neco_a_01247.
Abstract
Neural associative memories (NAM) are perceptron-like single-layer networks with fast synaptic learning, typically storing discrete associations between pairs of neural activity patterns. Gripon and Berrou (2011) investigated NAM employing block coding, a particular sparse coding method, and reported a significant increase in storage capacity. Here we verify and extend their results for both heteroassociative and recurrent autoassociative networks. To this end, we provide a new analysis of iterative retrieval in finite autoassociative and heteroassociative networks that allows storage capacity to be estimated for random and block patterns. Furthermore, we have implemented various retrieval algorithms for block coding and compared them in simulations to our theoretical results and to previous simulation data. In good agreement between theory and experiment, we find that finite networks employing block coding can store significantly more memory patterns. However, due to the reduced information per block pattern, it is not possible to significantly increase the stored information per synapse. Asymptotically, the information retrieval capacity converges to the known limits [Formula: see text] and [Formula: see text] also for block coding. We have also implemented very large recurrent networks of up to [Formula: see text] neurons, showing that maximal capacity [Formula: see text] bit per synapse occurs for finite networks having a size [Formula: see text] similar to cortical macrocolumns.
Affiliation(s)
- Andreas Knoblauch, KEIM Institute, Albstadt-Sigmaringen University, D-72458 Albstadt, Germany
- Günther Palm, Ulm University, Institute for Neural Information Processing, D-89081 Ulm, Germany
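The block-coding scheme analyzed above is concrete enough to sketch. The Python fragment below is a minimal illustration, not the authors' implementation: binary Willshaw (clipped Hebbian) learning of block patterns with exactly one active unit per block, followed by iterative retrieval with a winner-take-all over each block. All parameter values (B, L, M) and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
B, L = 8, 16              # blocks per pattern, units per block
n = B * L                 # total number of neurons
M = 40                    # number of stored autoassociations

def random_block_pattern():
    """One active unit per block (block coding)."""
    x = np.zeros(n, dtype=np.uint8)
    x[np.arange(B) * L + rng.integers(0, L, size=B)] = 1
    return x

patterns = [random_block_pattern() for _ in range(M)]

# Clipped Hebbian (Willshaw) learning with binary synapses
W = np.zeros((n, n), dtype=np.uint8)
for x in patterns:
    W |= np.outer(x, x)

def retrieve(cue, steps=3):
    """Iterative retrieval with a winner-take-all in every block."""
    x = cue.copy()
    for _ in range(steps):
        h = W.astype(int) @ x            # dendritic potentials
        y = np.zeros(n, dtype=np.uint8)
        for b in range(B):
            y[b * L + np.argmax(h[b * L:(b + 1) * L])] = 1
        x = y
    return x

cue = patterns[0].copy()
cue[: n // 2] = 0                        # erase the first half of the blocks
print("pattern recovered:", np.array_equal(retrieve(cue), patterns[0]))
```

The per-block argmax plays the role of the global threshold in standard Willshaw retrieval; because exactly one winner per block is enforced, the retrieved activity always has the valid block structure.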
2. Gauy MM, Meier F, Steger A. Multiassociative Memory: Recurrent Synapses Increase Storage Capacity. Neural Comput 2017; 29:1375-1405. DOI: 10.1162/neco_a_00954.
Abstract
The connection density of nearby neurons in the cortex has been observed to be around 0.1, whereas longer-range connections are present at much sparser densities (Kalisman, Silberberg, & Markram, 2005). We propose a memory association model that qualitatively explains these empirical observations. The model we consider is a multiassociative, sparse, Willshaw-like model consisting of binary threshold neurons and binary synapses. It uses recurrent synapses for iterative retrieval of stored memories. We quantify the usefulness of recurrent synapses by simulating the model for small network sizes and by a precise mathematical analysis for large network sizes. Given the network parameters, we can determine the precise values of recurrent and afferent synapse densities that optimize the storage capacity of the network. If the network size matches that of a cortical column, the predicted optimal recurrent density lies in a range that is compatible with biological measurements. Furthermore, we show that our model surpasses the standard Willshaw model in the multiassociative case if the information capacity is normalized per strong synapse or per bit required to store the model, as considered in Knoblauch, Palm, and Sommer (2010).
Affiliation(s)
- Marcelo Matheus Gauy, Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich 8092, Switzerland
- Florian Meier, Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich 8092, Switzerland
- Angelika Steger, Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich 8092, Switzerland, and Collegium Helveticum, Zurich 8090, Switzerland
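As a rough illustration of the mechanism studied above, a Willshaw-like network with separate afferent and recurrent synapse populations at chosen densities, here is a hedged Python sketch. The densities, network size, and k-winner-take-all readout are assumptions for demonstration, not the authors' model or analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, M = 400, 20, 60          # neurons, pattern activity, stored associations
d_aff, d_rec = 0.3, 0.3        # afferent / recurrent connection densities

cues    = (rng.random((M, n)) < k / n).astype(np.uint8)
targets = (rng.random((M, n)) < k / n).astype(np.uint8)

A_mask = rng.random((n, n)) < d_aff    # potential afferent synapses
R_mask = rng.random((n, n)) < d_rec    # potential recurrent synapses

A = np.zeros((n, n), dtype=np.uint8)
R = np.zeros((n, n), dtype=np.uint8)
for c, t in zip(cues, targets):
    A |= np.outer(t, c) & A_mask       # clipped Hebbian learning, diluted
    R |= np.outer(t, t) & R_mask

def retrieve(cue, iters=3):
    """Iterative retrieval mixing afferent and recurrent evidence."""
    x = np.zeros(n, dtype=np.uint8)
    for _ in range(iters):
        h = A.astype(int) @ cue + R.astype(int) @ x
        x = np.zeros(n, dtype=np.uint8)
        x[np.argsort(h)[-k:]] = 1      # k-winner-take-all readout
    return x

noisy = cues[0] * (rng.random(n) < 0.8).astype(np.uint8)  # degraded cue
print("overlap with stored target:", int(retrieve(noisy) @ targets[0]))
```

Trading d_aff against d_rec under a fixed synapse budget is the optimization the paper carries out analytically; the sketch only shows why recurrent clean-up iterations help at all.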
3.
Abstract
Neural associative networks are a promising computational paradigm, both for modeling neural circuits of the brain and for implementing associative memory and Hebbian cell assemblies in parallel VLSI or nanoscale hardware. Previous work has extensively investigated synaptic learning in linear models of the Hopfield type and in simple nonlinear models of the Steinbuch/Willshaw type. Optimized Hopfield networks of size n can store about [Formula: see text] memories of size k (or associations between them) but require real-valued synapses, which are expensive to implement and can store at most [Formula: see text] bits per synapse. Willshaw networks can store a much smaller number of about [Formula: see text] memories but require only much cheaper binary synapses. Here I present a learning model employing synapses with discrete synaptic weights. For optimal discretization parameters, this model can store, up to a factor [Formula: see text] close to one, the same number of memories as optimized Hopfield-type learning, for example, [Formula: see text] for binary synapses, [Formula: see text] for 2-bit (four-state) synapses, [Formula: see text] for 3-bit (8-state) synapses, and [Formula: see text] for 4-bit (16-state) synapses. The model also provides the theoretical framework for determining optimal discretization parameters for computer implementations or brainlike parallel hardware, including structural plasticity. In particular, as recently shown for the Willshaw network, it is possible to store [Formula: see text] bit per computer bit and up to [Formula: see text] bits per nonsilent synapse, whereas the absolute number of stored memories can be much larger than for the Willshaw model.
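As a toy illustration of the general idea of discrete synaptic weights (my own sketch under stated assumptions, not the paper's optimized discretization): real-valued covariance-rule weights are quantized to four states (2 bits) at quantile boundaries, and recall quality is compared before and after discretization.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, M = 300, 30, 50
patterns = (rng.random((M, n)) < k / n).astype(float)
p = k / n

# Covariance (Hopfield-type) learning with sparse binary patterns
W = sum(np.outer(x - p, x - p) for x in patterns)
np.fill_diagonal(W, 0.0)

# Discretize to 4 states at the quartiles; each state takes its bin mean
edges = np.quantile(W, [0.25, 0.5, 0.75])
bins = np.digitize(W, edges)
state_values = np.array([W[bins == s].mean() for s in range(4)])
W_discrete = state_values[bins]

def recall(Wmat, cue):
    h = Wmat @ cue
    x = np.zeros(n)
    x[np.argsort(h)[-k:]] = 1.0          # k-winner-take-all readout
    return x

noisy = patterns[0] * (rng.random(n) < 0.8)   # cue with 20% of bits deleted
for name, Wm in [("continuous", W), ("2-bit   ", W_discrete)]:
    print(name, "overlap:", int(recall(Wm, noisy) @ patterns[0]))
```

The quartile thresholds here are a naive choice; the point of the paper is precisely that the discretization parameters can be optimized so that very little capacity is lost.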
4. Knoblauch A, Körner E, Körner U, Sommer FT. Structural synaptic plasticity has high memory capacity and can explain graded amnesia, catastrophic forgetting, and the spacing effect. PLoS One 2014; 9:e96485. PMID: 24858841. PMCID: PMC4032253. DOI: 10.1371/journal.pone.0096485.
Abstract
Although William James and, more explicitly, Donald Hebb's theory of cell assemblies already suggested that activity-dependent rewiring of neuronal networks is the substrate of learning and memory, most theoretical work on memory over the last six decades has focused on the plasticity of existing synapses in prewired networks. Research in the last decade has emphasized that structural modification of synaptic connectivity is common in the adult brain and tightly correlated with learning and memory. Here we present a parsimonious computational model for learning by structural plasticity. The basic modeling units are "potential synapses", defined as locations in the network where synapses can potentially grow to connect two neurons. This model generalizes well-known previous models for associative learning based on weight plasticity; therefore, existing theory can be applied to analyze how many memories and how much information structural plasticity can store in a synapse. Surprisingly, we find that structural plasticity largely outperforms weight plasticity and can achieve a much higher storage capacity per synapse. The effect of structural plasticity on the structure of sparsely connected networks is quite intuitive: structural plasticity increases the "effectual network connectivity", that is, the network wiring that specifically supports storage and recall of the memories. Further, this model of structural plasticity produces gradients of effectual connectivity in the course of learning, thereby explaining various cognitive phenomena including graded amnesia, catastrophic forgetting, and the spacing effect.
Affiliation(s)
- Andreas Knoblauch, Engineering Faculty, Albstadt-Sigmaringen University, Albstadt, Germany; Honda Research Institute Europe, Offenbach am Main, Germany
- Edgar Körner, Honda Research Institute Europe, Offenbach am Main, Germany
- Ursula Körner, Honda Research Institute Europe, Offenbach am Main, Germany
- Friedrich T. Sommer, Redwood Center for Theoretical Neuroscience, University of California, Berkeley, California, United States of America
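The grow-and-prune dynamics described above can be caricatured in a few lines. The sketch below is an assumption-laden toy, not the paper's model: with a fixed synapse budget drawn from a larger pool of potential synapses, repeatedly pruning a fraction of the synapses that support no stored memory and regrowing at random potential locations increases the "effectual" fraction of the wiring.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, M = 200, 10, 30
budget = int(0.1 * n * n)                # realized synapses (connectivity 0.1)
patterns = (rng.random((M, n)) < k / n).astype(np.uint8)

# Which synapses the memories would potentiate (Willshaw-style demand)
demand = np.zeros((n, n), dtype=bool)
for x in patterns:
    demand |= np.outer(x, x).astype(bool)
demand = demand.ravel()

connected = np.zeros(n * n, dtype=bool)  # realized synapses, flat indexing
connected[rng.choice(n * n, budget, replace=False)] = True
print("effectual fraction before:", round((demand & connected).sum() / budget, 3))

for _ in range(50):                      # consolidation cycles
    silent = np.flatnonzero(connected & ~demand)   # synapses storing nothing
    n_swap = int(0.2 * len(silent))      # prune a fraction of silent synapses
    if n_swap == 0:
        break
    connected[rng.choice(silent, n_swap, replace=False)] = False   # prune
    free = np.flatnonzero(~connected)
    connected[rng.choice(free, n_swap, replace=False)] = True      # regrow

print("effectual fraction after: ", round((demand & connected).sum() / budget, 3))
```

Gradual convergence of this loop is what produces the gradients of effectual connectivity that the paper links to graded amnesia and the spacing effect.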
5. Oku M, Makino T, Aihara K. Pseudo-orthogonalization of memory patterns for associative memory. IEEE Trans Neural Netw Learn Syst 2013; 24:1877-1887. PMID: 24808619. DOI: 10.1109/TNNLS.2013.2268542.
Abstract
A new method for improving the storage capacity of associative memory models on a neural network is proposed. The storage capacity of the network increases in proportion to the network size in the case of random patterns but, in general, suffers from correlations among the memory patterns. Numerous solutions to this problem have been proposed, but their high computational cost limits their scalability. In this paper, we propose a novel and simple solution that is locally computable without any iteration. Our method applies XNOR masking to the original memory patterns using random patterns, and the masked patterns and masks are concatenated. The resulting decorrelated patterns allow higher storage capacity at the cost of increased pattern length. Furthermore, the increase in pattern length can be reduced through blockwise masking, at the price of a small capacity loss. Movie replay and image recognition are presented as examples demonstrating the scalability of the proposed method.
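The masking step itself is easy to make concrete. Below is a minimal Python sketch under my own illustrative assumptions (pattern statistics and sizes): for ±1 coding, elementwise XNOR is just multiplication, and concatenating the masked pattern with its mask doubles the length while driving pairwise overlaps toward zero.

```python
import numpy as np

rng = np.random.default_rng(4)
n, M = 256, 20

# Correlated +-1 patterns: noisy copies of a single prototype
prototype = rng.choice([-1, 1], size=n)
flip = rng.random((M, n)) < 0.15
patterns = np.where(flip, -prototype, prototype)

masks = rng.choice([-1, 1], size=(M, n))            # one random mask per pattern
decorrelated = np.hstack([patterns * masks, masks]) # XNOR part + mask part

def mean_abs_overlap(P):
    C = (P @ P.T) / P.shape[1]
    return np.abs(C[~np.eye(len(P), dtype=bool)]).mean()

print("original overlap:    ", round(mean_abs_overlap(patterns), 3))
print("decorrelated overlap:", round(mean_abs_overlap(decorrelated), 3))
```

Blockwise masking, as in the paper, would reuse one mask value per block to shorten the appended mask portion; that refinement is omitted here.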
6. Huyck CR, Passmore PJ. A review of cell assemblies. Biol Cybern 2013; 107:263-288. PMID: 23559034. DOI: 10.1007/s00422-013-0555-5.
Abstract
Since the cell assembly (CA) was hypothesised, it has gained substantial support and is believed to be the neural basis of psychological concepts. A CA is a relatively small set of connected neurons that, through neural firing, can sustain activation without stimulus from outside the CA, and that is formed by learning. Extensive evidence from multiple single-unit recording and other techniques supports the existence of CAs with these properties and shows that their neurons also spike with some degree of synchrony. Since the evidence is so broad and deep, the review concludes that CAs are all but certain. A model of CAs is introduced that is informal but broad enough to include, for example, synfire chains, without including, for example, holographic reduced representation. CAs are found in most cortical areas and in some sub-cortical areas; they are involved in psychological tasks including categorisation, short-term memory and long-term memory, and are central to other tasks including working memory. There is currently insufficient evidence to conclude that CAs are the neural basis of all concepts. A range of models have been used to simulate CA behaviour, including associative memory and more process-oriented tasks such as natural language parsing. Questions involving CAs, such as memory persistence and CAs' complex interactions with brain waves and learning, remain unanswered. CA research involves a wide range of disciplines, including biology and psychology, and this paper reviews literature directly related to the CA, providing a basis of discussion for this interdisciplinary community on this important topic. Hopefully, this discussion will lead to more formal and accurate models of CAs that are better linked to neuropsychological data.
7. Sterne P. Efficient and robust associative memory from a generalized Bloom filter. Biol Cybern 2012; 106:271-281. PMID: 22729656. DOI: 10.1007/s00422-012-0494-6.
Abstract
We develop a variant of a Bloom filter that is robust to hardware failure and show how it can be used as an efficient associative memory. We define a measure of information recall and show that our new associative memory is able to recall more than twice as much information as a Hopfield network. The extra efficiency of our associative memory is all the more remarkable as it uses only binary storage, whereas the Hopfield network uses integer-valued synapses.
Affiliation(s)
- P. Sterne, Cavendish Laboratory, Cambridge University, Cambridge, UK
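To make the Bloom filter connection tangible, here is a small self-contained sketch. It is my illustration of the general idea, not the paper's generalized, failure-robust construction: patterns are inserted into a standard Bloom filter, and a missing bit of a cue is recovered by membership-testing both candidate completions.

```python
import hashlib
import numpy as np

rng = np.random.default_rng(5)
m, k_hash = 1 << 14, 7          # filter size in bits, number of hash functions
bits = np.zeros(m, dtype=bool)

def hashes(pattern_bytes):
    """k_hash independent hash values via salted BLAKE2b."""
    for seed in range(k_hash):
        h = hashlib.blake2b(pattern_bytes, digest_size=8, salt=bytes([seed]))
        yield int.from_bytes(h.digest(), "big") % m

def insert(x):
    for h in hashes(x.tobytes()):
        bits[h] = True

def contains(x):
    return all(bits[h] for h in hashes(x.tobytes()))

n, M = 32, 50
patterns = rng.integers(0, 2, size=(M, n), dtype=np.uint8)
for x in patterns:
    insert(x)

# Associative recall: cue = pattern 0 with one unknown bit; test completions
cue = patterns[0].copy()
for val in (0, 1):
    cue[10] = val
    if contains(cue):
        print("recovered bit 10 =", val, "| true value:", patterns[0][10])
```

Testing both completions works because the false-positive rate at this fill level is tiny; the paper's information-recall measure quantifies how much such a structure can return per stored bit, even under hardware faults.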
8. Sacramento J, Wichert A. Binary Willshaw learning yields high synaptic capacity for long-term familiarity memory. Biol Cybern 2012; 106:123-133. PMID: 22481645. DOI: 10.1007/s00422-012-0488-4.
Abstract
In this study, we investigate from a computational perspective the efficiency of the Willshaw synaptic update rule in the context of familiarity discrimination, a binary-answer, memory-related task that has been linked through psychophysical experiments to modified neural activity patterns in the prefrontal and perirhinal cortex regions. Our motivation for revisiting this well-known learning prescription is twofold: first, the switch-like nature of the induced synaptic bonds, as there is evidence that biological synaptic transitions might occur in a discrete, stepwise fashion; second, the possibility that in the mammalian brain, unused, silent synapses might be pruned in the long term. Besides the usual pattern and network capacities, we calculate the synaptic capacity of the model, a recently proposed measure in which only the functional subset of synapses is taken into account. We find that in terms of network capacity, Willshaw learning is strongly affected by the pattern coding rates, which have to be kept fixed and very low at all times to achieve a nonzero capacity in the large-network limit. The information carried per functional synapse, however, diverges and is comparable to that of the pattern association case, even for more realistic, moderately low activity levels that are a function of network size.
Affiliation(s)
- João Sacramento, INESC-ID and Instituto Superior Técnico, Technical University of Lisbon, Av. Prof. Dr. Aníbal Cavaco Silva, 2744-016 Porto Salvo, Portugal
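A minimal sketch of Willshaw-based familiarity discrimination, under illustrative assumptions (sizes and coding rate) rather than the authors' exact formulation: after clipped Hebbian storage, a probe is judged familiar precisely when every synapse among its active units has been switched on.

```python
import numpy as np

rng = np.random.default_rng(6)
n, k, M = 500, 12, 100
patterns = np.zeros((M, n), dtype=np.uint8)
for x in patterns:                       # fixed low coding rate k/n
    x[rng.choice(n, k, replace=False)] = 1

# Binary Willshaw learning
W = np.zeros((n, n), dtype=np.uint8)
for x in patterns:
    W |= np.outer(x, x)

def familiar(x):
    """Familiar iff all pairwise synapses among active units are on."""
    idx = np.flatnonzero(x)
    return bool(W[np.ix_(idx, idx)].all())

novel = np.zeros(n, dtype=np.uint8)
novel[rng.choice(n, k, replace=False)] = 1
print("stored pattern familiar:", familiar(patterns[0]))   # True
print("novel pattern familiar: ", familiar(novel))         # almost surely False
```

Because a novel pattern would need all of its roughly k*(k-1) active-pair synapses to be set by chance, false alarms are rare as long as the matrix load stays moderate, which is why the coding rate matters so much in the analysis above.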
9.
Abstract
Neural associative memories are perceptron-like single-layer networks with fast synaptic learning, typically storing discrete associations between pairs of neural activity patterns. Previous work optimized the memory capacity for various models of synaptic learning, for example, linear Hopfield-type rules, the Willshaw model employing binary synapses, and the BCPNN rule of Lansner and Ekeberg. Here I show that all of these previous models are limit cases of a general optimal model in which synaptic learning is determined by probabilistic Bayesian considerations. Asymptotically, for large networks and very sparse neuron activity, the Bayesian model becomes identical to an inhibitory implementation of the Willshaw and BCPNN-type models. For less sparse patterns, the Bayesian model becomes identical to Hopfield-type networks employing the covariance rule. For intermediate sparseness or finite networks, the optimal Bayesian learning rule differs from the previous models and can significantly improve memory performance. I also provide a unified analytical framework for determining memory capacity at a given output noise level that links approaches based on mutual information, Hamming distance, and signal-to-noise ratio.
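As a rough, assumption-laden illustration of the probabilistic weighting the abstract refers to (this follows the BCPNN-style log-odds form it mentions, not the paper's full Bayesian rule): weights are estimated from smoothed unit and pairwise activation frequencies, and retrieval sums the log evidence.

```python
import numpy as np

rng = np.random.default_rng(7)
n, k, M = 300, 15, 40
patterns = (rng.random((M, n)) < k / n).astype(float)

eps = 1.0                                 # Laplace smoothing pseudo-count
p_i  = (patterns.sum(0) + eps) / (M + 2 * eps)
p_ij = (patterns.T @ patterns + eps) / (M + 2 * eps)

W = np.log(p_ij / np.outer(p_i, p_i))     # BCPNN-style log-odds weights
b = np.log(p_i)                           # unit priors as biases
np.fill_diagonal(W, 0.0)

def recall(cue):
    h = W @ cue + b                       # summed log evidence
    x = np.zeros(n)
    x[np.argsort(h)[-k:]] = 1.0           # k-winner-take-all readout
    return x

noisy = patterns[5] * (rng.random(n) < 0.7)   # delete 30% of active units
print("overlap with stored pattern:", int(recall(noisy) @ patterns[5]))
```

In the sparse limit these log-odds weights behave like an inhibitory Willshaw rule, and for denser patterns they approach the covariance rule, which is the unification the abstract describes.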
10. Knoblauch A, Palm G, Sommer FT. Memory Capacities for Synaptic and Structural Plasticity. Neural Comput 2010; 22:289-341. PMID: 19925281. DOI: 10.1162/neco.2009.08-07-588.
Abstract
Neural associative networks with plastic synapses have been proposed as computational models of brain functions and also for applications such as pattern recognition and information retrieval. To guide biological models and optimize technical applications, several definitions of memory capacity have been used to measure the efficiency of associative memory. Here we explain why the currently used performance measures bias the comparison between models and cannot serve as a theoretical benchmark. We introduce fair measures for information-theoretic capacity in associative memory that also provide a theoretical benchmark. In neural networks, two types of manipulating synapses can be discerned: synaptic plasticity, the change in strength of existing synapses, and structural plasticity, the creation and pruning of synapses. One of the new types of memory capacity we introduce permits quantifying how structural plasticity can increase network efficiency by compressing the network structure, for example, by pruning unused synapses. Specifically, we analyze operating regimes in the Willshaw model in which structural plasticity can compress the network structure and push performance to the theoretical benchmark. The amount C of information stored in each synapse can scale with the logarithm of the network size rather than being constant, as in classical Willshaw and Hopfield nets (C ≤ ln 2 ≈ 0.69). Further, the review contains novel technical material: a capacity analysis of the Willshaw model that rigorously controls for the level of retrieval quality, an analysis for memories with a nonconstant number of active units (where C ≤ 1/(e ln 2) ≈ 0.53), and an analysis of the computational complexity of associative memories with and without network compression.
Affiliation(s)
- Günther Palm, Institut für Neuroinformatik, Fakultät für Ingenieurwissenschaften und Informatik, Universität Ulm, D-89069 Ulm, Germany
- Friedrich T. Sommer, University of California at Berkeley, Redwood Center for Theoretical Neuroscience, Berkeley, CA 94720-3220, U.S.A.
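For quick reference, the two capacity bounds quoted in the abstract evaluate numerically as follows (a worked restatement, with values rounded to two decimals):

```latex
\begin{align*}
  C &\le \ln 2 \approx 0.69 \ \text{bits per synapse (classical Willshaw/Hopfield regime)},\\
  C &\le \frac{1}{e\,\ln 2}
     = \frac{1}{2.71828\ldots \times 0.69315\ldots}
     \approx 0.53 \ \text{bits per synapse (nonconstant number of active units)}.
\end{align*}
```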
11. Makarov VA, Song Y, Velarde MG, Hübner D, Cruse H. Elements for a general memory structure: properties of recurrent neural networks used to form situation models. Biol Cybern 2008; 98:371-395. PMID: 18350312. DOI: 10.1007/s00422-008-0221-5.
Abstract
We study how individual memory items are stored, assuming that situations given in the environment can be represented in the form of synaptic-like couplings in recurrent neural networks. Previous numerical investigations have shown that specific architectures based on suppression or max units can successfully learn static or dynamic stimuli (situations). Here we provide a theoretical basis concerning the convergence of the learning process and the network response to a novel stimulus. We show that, besides learning "simple" static situations, an n-dimensional network can learn and replicate a sequence of up to n different vectors or frames. We derive limits on the learning rate and show how the coupling matrices develop during training in different cases, including an extension of the network to nonlinear interunit coupling. Furthermore, we show that a specific coupling matrix provides low-pass-filter properties to the units, thus connecting networks constructed from static summation units with continuous-time networks. We also show under which conditions such networks can be used to perform arithmetic calculations by means of pattern completion.
Affiliation(s)
- Valeri A. Makarov, Instituto Pluridisciplinar, Universidad Complutense, Paseo Juan XXIII, 1, 28040 Madrid, Spain
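The claim that an n-unit network can replay a sequence of up to n frames has a textbook-style illustration (my sketch with orthonormal frames, not the authors' suppression/max architecture): with asymmetric Hebbian couplings, each frame is mapped exactly onto its successor.

```python
import numpy as np

n = 8
frames = np.eye(n)                      # n orthonormal frames (one-hot here)
# Asymmetric Hebbian coupling: W = sum_t frame[t+1] frame[t]^T (cyclic)
W = sum(np.outer(frames[(t + 1) % n], frames[t]) for t in range(n))

x = frames[0]
replay = [x]
for _ in range(n - 1):
    x = W @ x                           # one recurrent update per frame
    replay.append(x)

print(np.allclose(np.array(replay), frames))   # True: full sequence replayed
```

With orthonormal frames the replay is exact; for general (correlated) frames, the learning-rate limits derived in the paper govern how close one can get to this ideal.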
12. Rehn M, Sommer FT. Storing and restoring visual input with collaborative rank coding and associative memory. Neurocomputing 2006. DOI: 10.1016/j.neucom.2005.12.080.
13. Butz M, Teuchert-Noodt G. A simulation model for compensatory plasticity in the prefrontal cortex inducing a cortico-cortical dysconnection in early brain development. J Neural Transm (Vienna) 2006; 113:695-710. PMID: 16463119. DOI: 10.1007/s00702-005-0403-4.
Abstract
In the present work, an abstract simulation model of the prefrontal cortex is used to predict compensatory structural alterations of the cortico-cortical connectivity pattern in normal and pathological forebrain maturation. The simulated network shows different representative courses of morphogenesis when developing undisturbed or when subjected to disturbed excitatory afferents. The simulation results were corroborated by an immunohistochemical study revealing a qualitatively comparable development of glutamatergic projection fibre density in gerbils (Meriones unguiculatus) after juvenile and adult methamphetamine intoxication. The simulation model further allows different rearing conditions to be considered (enriched-environment model) and predicts opposite effects of the same disturbance after enriched versus impoverished rearing, in accordance with the experimental findings.
Affiliation(s)
- M. Butz, Department of Neuroanatomy, Faculty of Biology, University of Bielefeld, Germany
14. Wennekers T, Sommer FT. Gamma-oscillations support optimal retrieval in associative memories of two-compartment neurons. Neurocomputing 1999. DOI: 10.1016/s0925-2312(99)00036-3.