1
Sinapayen L, Masumori A, Ikegami T. Learning by stimulation avoidance: A principle to control spiking neural networks dynamics. PLoS One 2017; 12:e0170388. PMID: 28158309; PMCID: PMC5291507; DOI: 10.1371/journal.pone.0170388.
Abstract
Learning based on networks of real neurons, and learning based on biologically inspired models of neural networks, have yet to find general learning rules leading to widespread applications. In this paper, we argue for the existence of a principle that allows the dynamics of a biologically inspired neural network to be steered. Using carefully timed external stimulation, the network can be driven towards a desired dynamical state. We term this principle "Learning by Stimulation Avoidance" (LSA). We demonstrate through simulation that the minimal conditions sufficient for LSA in artificial networks are also sufficient to reproduce learning results similar to those obtained in biological neurons by Shahaf and Marom, and in addition explain synaptic pruning. We examined the underlying mechanism by simulating a small network of 3 neurons, then scaled it up to a hundred neurons. We show that LSA has higher explanatory power than existing hypotheses about the response of biological neural networks to external stimulation, and can be used as a learning rule for an embodied application: learning of wall avoidance by a simulated robot. In other works, reinforcement learning with spiking networks has been obtained through global reward signals akin to simulating the dopamine system; we believe that this is the first project demonstrating sensory-motor learning with random spiking networks through Hebbian learning relying on environmental conditions, without a separate reward system.
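The LSA mechanism described above can be caricatured in a few lines. This is a minimal sketch, not the paper's model: the network size, firing probabilities, and learning constants are invented for illustration.

```python
import random

random.seed(0)

# Toy "Learning by Stimulation Avoidance" loop: an input neuron is driven
# by external stimulation, and the stimulation is removed whenever the
# output neuron fires. A plain Hebbian (pre-before-post) update then
# strengthens the input->output pathway, because its activity reliably
# precedes the removal of stimulation.

W_MAX = 5.0
LEARN_RATE = 0.5

def run_lsa(steps=200):
    w = 1.0                      # input -> output synaptic weight
    stimulated = True            # external stimulation currently on?
    for _ in range(steps):
        # Input neuron fires when stimulated (plus occasional noise).
        pre_fired = stimulated or random.random() < 0.05
        # Output neuron fires probabilistically, more often for larger w.
        post_fired = pre_fired and random.random() < min(1.0, w / W_MAX)
        if pre_fired and post_fired:
            # Causal pre-before-post pairing: potentiate (simplified STDP).
            w = min(W_MAX, w + LEARN_RATE)
        # LSA: output firing removes the external stimulation, so the
        # pathway whose activity silences the input is reinforced.
        stimulated = not post_fired
    return w

final_w = run_lsa()
print(f"final weight: {final_w:.2f}")
```

Under these toy assumptions the input-output weight grows toward its ceiling, which is the qualitative effect the abstract describes.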
Affiliation(s)
- Lana Sinapayen
- The University of Tokyo, Ikegami Laboratory, Tokyo, Japan
2
Kassab R, Alexandre F. Integration of exteroceptive and interoceptive information within the hippocampus: a computational study. Front Syst Neurosci 2015; 9:87. PMID: 26097448; PMCID: PMC4456570; DOI: 10.3389/fnsys.2015.00087.
Abstract
Many episodic memory studies have critically implicated the hippocampus in the rapid binding of sensory information from the perception of the external environment, reported by exteroception. Other structures in the medial temporal lobe, especially the amygdala, have been more specifically linked with the emotional dimension of episodic memories, reported by interoception. The hippocampal projection to the amygdala is proposed as a substrate important for the formation of extero-interoceptive associations, allowing adaptive behaviors based on past experiences. Recently, growing evidence suggests that hippocampal activity observed in a wide range of behavioral tasks could reflect associations between exteroceptive patterns and their emotional valences. Computational models of the hippocampus therefore need to be updated to better interpret hippocampus-dependent behaviors. In earlier models, interoceptive features, if not neglected, are bound together with other exteroceptive features through autoassociative learning mechanisms. This way of binding integrates both kinds of features at the same level, which is not always suitable, for example in the case of pattern completion. Based on the anatomical and functional heterogeneity along the septotemporal and transverse axes of the hippocampus, we suggest instead that distinct hippocampal subregions may be engaged in the representation of these different types of information, each stored apart in autoassociative memories but linked together in a heteroassociative way. The model is developed within the hard constraint of rapid, even single-trial, learning of episodic memories. The performance of the model is assessed quantitatively and its resistance to interference is demonstrated through a series of numerical experiments. A reversal-learning experiment in patients with amnesic cognitive impairment is also reproduced.
Affiliation(s)
- Randa Kassab
- INRIA Bordeaux Sud-Ouest, Talence, France
- LaBRI, UMR 5800, Centre National de la Recherche Scientifique, Bordeaux INP, Université de Bordeaux, Talence, France
- Institut des Maladies Neurodégénératives, UMR 5293, Centre National de la Recherche Scientifique, Université de Bordeaux, Bordeaux, France
- Frédéric Alexandre
- INRIA Bordeaux Sud-Ouest, Talence, France
- LaBRI, UMR 5800, Centre National de la Recherche Scientifique, Bordeaux INP, Université de Bordeaux, Talence, France
- Institut des Maladies Neurodégénératives, UMR 5293, Centre National de la Recherche Scientifique, Université de Bordeaux, Bordeaux, France
3
Bauer R, Zubler F, Pfister S, Hauri A, Pfeiffer M, Muir DR, Douglas RJ. Developmental self-construction and -configuration of functional neocortical neuronal networks. PLoS Comput Biol 2014; 10:e1003994. PMID: 25474693; PMCID: PMC4256067; DOI: 10.1371/journal.pcbi.1003994.
Abstract
The prenatal development of neural circuits must provide sufficient configuration to support at least a set of core postnatal behaviors. Although knowledge of various genetic and cellular aspects of development is accumulating rapidly, there is less systematic understanding of how these various processes play together in order to construct such functional networks. Here we make some steps toward such understanding by demonstrating through detailed simulations how a competitive co-operative (‘winner-take-all’, WTA) network architecture can arise by development from a single precursor cell. This precursor is granted a simplified gene regulatory network that directs cell mitosis, differentiation, migration, neurite outgrowth and synaptogenesis. Once initial axonal connection patterns are established, their synaptic weights undergo homeostatic unsupervised learning that is shaped by wave-like input patterns. We demonstrate how this autonomous genetically directed developmental sequence can give rise to self-calibrated WTA networks, and compare our simulation results with biological data. Models of learning in artificial neural networks generally assume that the neurons and approximate network are given, and then learning tunes the synaptic weights. By contrast, we address the question of how an entire functional neuronal network containing many differentiated neurons and connections can develop from only a single progenitor cell. We chose a winner-take-all network as the developmental target, because it is a computationally powerful circuit and a candidate motif of neocortical networks. The key aspect of this challenge is that the developmental mechanisms must be locally autonomous, as in biology: they cannot depend on global knowledge or supervision.
We have explored this developmental process by simulating in physical detail the fundamental biological behaviors, such as cell proliferation, neurite growth and synapse formation that give rise to the structural connectivity observed in the superficial layers of the neocortex. These differentiated, approximately connected neurons then adapt their synaptic weights homeostatically to obtain a uniform electrical signaling activity before going on to organize themselves according to the fundamental correlations embedded in a noisy wave-like input signal. In this way the precursor expands itself through development and unsupervised learning into winner-take-all functionality and orientation selectivity in a biologically plausible manner.
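The WTA motif that serves as the developmental target above can be sketched as a standard competitive-learning loop. This is an idealised caricature, not the paper's developmental model: inhibition is replaced by an instantaneous argmax, and all sizes and the learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Winner-take-all sketch: units receive feedforward drive and compete via
# (idealised) global inhibition, so only the most strongly driven unit
# stays active; only the winner's weights are updated (competitive Hebb).

n_inputs, n_units = 8, 3
W = rng.random((n_units, n_inputs))
W /= W.sum(axis=1, keepdims=True)      # normalise each unit's weights

def wta_step(x, lr=0.1):
    drive = W @ x                       # feedforward excitation
    winner = int(np.argmax(drive))      # inhibition silences all but one
    W[winner] += lr * (x - W[winner])   # move winner's weights toward input
    return winner

x = np.zeros(n_inputs)
x[:4] = 1.0                             # a fixed input pattern
for _ in range(20):
    w_idx = wta_step(x)
print("winner:", w_idx)
```

Repeated presentation of the same pattern makes one unit specialise for it, which is the self-calibration property the simulated development is meant to produce.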
Affiliation(s)
- Roman Bauer
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- School of Computing Science, Newcastle University, Newcastle upon Tyne, United Kingdom
- Frédéric Zubler
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Department of Neurology, Inselspital Bern, Bern University Hospital, University of Bern, Bern, Switzerland
- Sabina Pfister
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Andreas Hauri
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Michael Pfeiffer
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Dylan R. Muir
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
- Biozentrum, University of Basel, Basel, Switzerland
- Rodney J. Douglas
- Institute of Neuroinformatics, University/ETH Zürich, Zürich, Switzerland
4
Synaptic dynamics: Linear model and adaptation algorithm. Neural Netw 2014; 56:49-68. DOI: 10.1016/j.neunet.2014.04.001.
5
Huyck CR, Mitchell IG. Post and pre-compensatory Hebbian learning for categorisation. Cogn Neurodyn 2014; 8:299-311. PMID: 25009672; DOI: 10.1007/s11571-014-9282-4.
Abstract
A system with some degree of biological plausibility is developed to categorise items from a widely used machine learning benchmark. The system uses fatiguing leaky integrate-and-fire neurons, a relatively coarse point model that roughly duplicates biological spiking properties; this allows spontaneous firing based on hypo-fatigue, so that neurons not directly stimulated by the environment may be included in the circuit. A novel compensatory Hebbian learning algorithm is used that considers the total synaptic weight coming into a neuron. The network is unsupervised and entirely self-organising. This is relatively effective as a machine learning algorithm, categorising with just neurons, and its performance is comparable with a Kohonen map. However, the learning algorithm is not stable, and behaviour decays as the length of training increases. Variables including learning rate, inhibition and topology are explored, leading to stable systems driven by the environment. The model is thus a reasonable next step toward a full neural memory model.
Affiliation(s)
- Ian G Mitchell
- Department of Computer Science, Middlesex University, London, UK
6
Rumbell T, Denham SL, Wennekers T. A spiking self-organizing map combining STDP, oscillations, and continuous learning. IEEE Trans Neural Netw Learn Syst 2014; 25:894-907. PMID: 24808036; DOI: 10.1109/tnnls.2013.2283140.
Abstract
The self-organizing map (SOM) is a neural network algorithm to create topographically ordered spatial representations of an input data set using unsupervised learning. The SOM algorithm is inspired by the feature maps found in mammalian cortices but lacks some important functional properties of its biological equivalents. Neurons have no direct access to global information, transmit information through spikes and may be using phasic coding of spike times within synchronized oscillations, receive continuous input from the environment, do not necessarily alter network properties such as learning rate and lateral connectivity throughout training, and learn through relative timing of action potentials across a synaptic connection. In this paper, a network of integrate-and-fire neurons is presented that incorporates solutions to each of these issues through the neuron model and network structure. Results of the simulated experiments assessing map formation using artificial data as well as the Iris and Wisconsin Breast Cancer datasets show that this novel implementation maintains fundamental properties of the conventional SOM, thereby representing a significant step toward further understanding of the self-organizational properties of the brain while providing an additional method for implementing SOMs that can be utilized for future modeling in software or special purpose spiking neuron hardware.
7
He W, Huang K, Ning N, Ramanathan K, Li G, Jiang Y, Sze J, Shi L, Zhao R, Pei J. Enabling an integrated rate-temporal learning scheme on memristor. Sci Rep 2014; 4:4755. PMID: 24755608; PMCID: PMC3996481; DOI: 10.1038/srep04755.
Abstract
The learning scheme is key to the utilization of spike-based computation and the emulation of neural/synaptic behaviors toward the realization of cognition. Biological observations reveal an integrated spike-time- and spike-rate-dependent plasticity as a function of presynaptic firing frequency. However, this integrated rate-temporal learning scheme had not been realized on any nanodevice. In this paper, such a scheme is successfully demonstrated on a memristor. Great robustness against spiking-rate fluctuation is achieved by waveform engineering, with the aid of the good analog properties exhibited by the iron oxide-based memristor. Spike-timing-dependent plasticity (STDP) occurs at moderate presynaptic firing frequencies, and spike-rate-dependent plasticity (SRDP) dominates in the other frequency regions. This demonstration provides a novel approach to neural coding implementation, which facilitates the development of bio-inspired computing systems.
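A rule of the kind the abstract describes, timing-dependent at moderate presynaptic rates and rate-dependent elsewhere, can be written down schematically. The crossover frequencies and all constants below are invented for illustration; they are not measurements from the memristor device.

```python
import math

# Hypothetical integrated rate-temporal plasticity rule: STDP-like timing
# window inside a moderate presynaptic-rate band, BCM-like rate rule
# (SRDP) outside it. All parameters are illustrative assumptions.

A_PLUS, A_MINUS = 0.10, 0.12   # STDP amplitudes
TAU = 20.0                     # STDP time constant (ms)
THETA = 10.0                   # SRDP modification threshold (Hz)

def delta_w(pre_rate_hz, dt_ms):
    """Weight change for one pre/post pairing.

    dt_ms = t_post - t_pre (positive means pre fired first)."""
    if 5.0 <= pre_rate_hz <= 30.0:
        # Moderate rates: classic asymmetric STDP window.
        if dt_ms >= 0:
            return A_PLUS * math.exp(-dt_ms / TAU)
        return -A_MINUS * math.exp(dt_ms / TAU)
    # Low or high rates: BCM-like rate rule (depress below the
    # threshold rate, potentiate above it).
    return 0.01 * pre_rate_hz * (pre_rate_hz - THETA)

print(delta_w(20.0, 5.0))    # causal pairing at moderate rate: potentiation
print(delta_w(20.0, -5.0))   # anti-causal pairing: depression
print(delta_w(2.0, 5.0))     # low rate: rate-driven depression
```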
Affiliation(s)
- Wei He
- Data Storage Institute, Agency for Science, Technology and Research (A*STAR), 5 Engineering Drive 1, Singapore 117608
- These authors contributed equally to this work
- Kejie Huang
- Singapore University of Technology & Design, 20 Dover Drive, Singapore 138682
- These authors contributed equally to this work
- Ning Ning
- Data Storage Institute, Agency for Science, Technology and Research (A*STAR), 5 Engineering Drive 1, Singapore 117608
- Kiruthika Ramanathan
- Data Storage Institute, Agency for Science, Technology and Research (A*STAR), 5 Engineering Drive 1, Singapore 117608
- Guoqi Li
- Optical Memory National Engineering Research Center, Department of Precision Instrument, Tsinghua University, Beijing 100084, China
- Yu Jiang
- Data Storage Institute, Agency for Science, Technology and Research (A*STAR), 5 Engineering Drive 1, Singapore 117608
- JiaYin Sze
- Data Storage Institute, Agency for Science, Technology and Research (A*STAR), 5 Engineering Drive 1, Singapore 117608
- Luping Shi
- Optical Memory National Engineering Research Center, Department of Precision Instrument, Tsinghua University, Beijing 100084, China
- Rong Zhao
- Singapore University of Technology & Design, 20 Dover Drive, Singapore 138682
- Jing Pei
- Optical Memory National Engineering Research Center, Department of Precision Instrument, Tsinghua University, Beijing 100084, China
8
Abstract
The mammalian cerebral cortex is characterized in vivo by irregular spontaneous activity, but how this ongoing dynamics affects signal processing and learning remains unknown. The associative plasticity rules demonstrated in vitro, mostly in silent networks, are based on the detection of correlations between presynaptic and postsynaptic activity and hence are sensitive to spontaneous activity and spurious correlations. Therefore, they cannot operate in realistic network states. Here, we present a new class of spike-timing-dependent plasticity learning rules with local floating plasticity thresholds, the slow dynamics of which account for metaplasticity. This novel algorithm is shown to both correctly predict homeostasis in synaptic weights and solve the problem of asymptotic stable learning in noisy states. It is shown to naturally encompass many other known types of learning rule, unifying them into a single coherent framework. The mixed presynaptic and postsynaptic dependency of the floating plasticity threshold is justified by a cascade of known molecular pathways, which leads to experimentally testable predictions.
9
Ren Q, Zhang Z, Zhao J. Effect on information transfer of synaptic pruning driven by spike-timing-dependent plasticity. Phys Rev E Stat Nonlin Soft Matter Phys 2012; 85:022901. PMID: 22463266; DOI: 10.1103/physreve.85.022901.
Abstract
Spike-timing-dependent plasticity (STDP) is an important driving force of self-organization in neural systems. With properly chosen input signals, STDP can yield a synaptic pruning process, whose functional role needs to be further investigated. We explore this issue from an information theoretic standpoint. Temporally correlated stimuli are introduced to neurons of an input layer. Then synapses on the dendrite, and thus the receptive field, of an output neuron are refined by STDP. The mutual information between input and output spike trains is calculated with the context tree method. The results show that synapse removal can enhance information transfer, i.e., that "less can be more" under certain constraints that stress the balance between potentiation and depression dictated by the parameters of the STDP rule, as well as the temporal scale of the input correlation.
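The pruning dynamic sketched in the abstract, STDP pushing correlated synapses toward their ceiling while uncorrelated ones drift away and get removed, can be illustrated with a toy simulation. The window parameters, the depression bias, and the pruning threshold below are invented for illustration, not the paper's values.

```python
import math
import random

random.seed(2)

# Asymmetric exponential STDP window with a slight bias toward depression,
# applied to one synapse seeing mostly causal (pre-before-post) pairings
# and one seeing random timing differences. Weights pinned near zero are
# then pruned.

A_PLUS, A_MINUS, TAU = 0.05, 0.055, 20.0

def stdp(dt):
    """dt = t_post - t_pre in ms; positive (causal) potentiates."""
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU)
    return -A_MINUS * math.exp(dt / TAU)

w_corr, w_uncorr = 0.5, 0.5
for _ in range(500):
    # Correlated input: pre reliably leads post by ~5 ms.
    w_corr = min(1.0, max(0.0, w_corr + stdp(random.gauss(5.0, 2.0))))
    # Uncorrelated input: timing differences are random.
    w_uncorr = min(1.0, max(0.0, w_uncorr + stdp(random.uniform(-50.0, 50.0))))

PRUNE_AT = 0.05
kept = [w for w in (w_corr, w_uncorr) if w > PRUNE_AT]
print(f"correlated: {w_corr:.2f}, uncorrelated: {w_uncorr:.2f}, kept: {len(kept)}")
```

The correlated synapse saturates while the uncorrelated one wanders under the depression bias, which is the refinement of the receptive field that the mutual-information analysis in the paper evaluates.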
Affiliation(s)
- Quansheng Ren
- School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, People's Republic of China
10
A biophysically-based neuromorphic model of spike rate- and timing-dependent plasticity. Proc Natl Acad Sci U S A 2011; 108:E1266-74. PMID: 22089232; DOI: 10.1073/pnas.1106161108.
Abstract
Current advances in neuromorphic engineering have made it possible to emulate complex neuronal ion channel and intracellular ionic dynamics in real time using highly compact and power-efficient complementary metal-oxide-semiconductor (CMOS) analog very-large-scale-integrated circuit technology. Recently, there has been growing interest in the neuromorphic emulation of the spike-timing-dependent plasticity (STDP) Hebbian learning rule by phenomenological modeling using CMOS, memristor or other analog devices. Here, we propose a CMOS circuit implementation of a biophysically grounded neuromorphic (iono-neuromorphic) model of synaptic plasticity that is capable of capturing both the spike rate-dependent plasticity (SRDP, of the Bienenstock-Cooper-Munro or BCM type) and STDP rules. The iono-neuromorphic model reproduces bidirectional synaptic changes with NMDA receptor-dependent and intracellular calcium-mediated long-term potentiation or long-term depression assuming retrograde endocannabinoid signaling as a second coincidence detector. Changes in excitatory or inhibitory synaptic weights are registered and stored in a nonvolatile and compact digital format analogous to the discrete insertion and removal of AMPA or GABA receptor channels. The versatile Hebbian synapse device is applicable to a variety of neuroprosthesis, brain-machine interface, neurorobotics, neuromimetic computation, machine learning, and neural-inspired adaptive control problems.
11
Abstract
Neocortical neurons in vivo process each of their individual inputs in the context of ongoing synaptic background activity, produced by the thousands of presynaptic partners a typical neuron has. Previous work has shown that background activity affects multiple aspects of neuronal and network function. However, its effect on the induction of spike-timing dependent plasticity (STDP) is not clear. Here we report that injections of simulated background conductances (produced by a dynamic-clamp system) into pyramidal cells in rat brain slices selectively reduced the magnitude of timing-dependent synaptic potentiation while leaving the magnitude of timing-dependent synaptic depression unchanged. The conductance-dependent suppression also sharpened the STDP curve, with reliable synaptic potentiation induced only when EPSPs and action potentials (APs) were paired within 8 ms of each other. Dual somatic and dendritic patch recordings suggested that the deficit in synaptic potentiation arose from shunting of dendritic EPSPs and APs. Using a biophysically detailed computational model, we were not only able to replicate the conductance-dependent shunting of dendritic potentials, but show that synaptic background can truncate calcium dynamics within dendritic spines in a way that affects potentiation more strongly than depression. This conductance-dependent regulation of synaptic plasticity may constitute a novel homeostatic mechanism that can prevent the runaway synaptic potentiation to which Hebbian networks are vulnerable.
12
Bush D, Philippides A, Husbands P, O'Shea M. Spike-timing dependent plasticity and the cognitive map. Front Comput Neurosci 2010; 4:142. PMID: 21060719; PMCID: PMC2972746; DOI: 10.3389/fncom.2010.00142.
Abstract
Since the discovery of place cells – single pyramidal neurons that encode spatial location – it has been hypothesized that the hippocampus may act as a cognitive map of known environments. This putative function has been extensively modeled using auto-associative networks, which utilize rate-coded synaptic plasticity rules in order to generate strong bi-directional connections between concurrently active place cells that encode for neighboring place fields. However, empirical studies using hippocampal cultures have demonstrated that the magnitude and direction of changes in synaptic strength can also be dictated by the relative timing of pre- and post-synaptic firing according to a spike-timing dependent plasticity (STDP) rule. Furthermore, electrophysiology studies have identified persistent “theta-coded” temporal correlations in place cell activity in vivo, characterized by phase precession of firing as the corresponding place field is traversed. It is not yet clear if STDP and theta-coded neural dynamics are compatible with cognitive map theory and previous rate-coded models of spatial learning in the hippocampus. Here, we demonstrate that an STDP rule based on empirical data obtained from the hippocampus can mediate rate-coded Hebbian learning when pre- and post-synaptic activity is stochastic and has no persistent sequence bias. We subsequently demonstrate that a spiking recurrent neural network that utilizes this STDP rule, alongside theta-coded neural activity, allows the rapid development of a cognitive map during directed or random exploration of an environment of overlapping place fields. Hence, we establish that STDP and phase precession are compatible with rate-coded models of cognitive map development.
Affiliation(s)
- Daniel Bush
- Department of Physics and Astronomy, University of California, Los Angeles, Los Angeles, CA, USA
13
Dual coding with STDP in a spiking recurrent neural network model of the hippocampus. PLoS Comput Biol 2010; 6:e1000839. PMID: 20617201; PMCID: PMC2895637; DOI: 10.1371/journal.pcbi.1000839.
Abstract
The firing rate of single neurons in the mammalian hippocampus has been demonstrated to encode a range of spatial and non-spatial stimuli. It has also been demonstrated that the phase of firing, with respect to the theta oscillation that dominates the hippocampal EEG during stereotyped learning behaviour, correlates with an animal's spatial location. These findings have led to the hypothesis that the hippocampus operates using a dual (rate and temporal) coding system. To investigate the phenomenon of dual coding in the hippocampus, we examine a spiking recurrent network model with theta-coded neural dynamics and an STDP rule that mediates rate-coded Hebbian learning when pre- and post-synaptic firing is stochastic. We demonstrate that this plasticity rule can generate both symmetric and asymmetric connections between neurons that fire at concurrent or successive theta phase, respectively, and subsequently produce both pattern completion and sequence prediction from partial cues. This unifies previously disparate auto- and hetero-associative network models of hippocampal function and provides them with a firmer basis in modern neurobiology. Furthermore, the encoding and reactivation of activity in mutually exciting Hebbian cell assemblies demonstrated here is believed to represent a fundamental mechanism of cognitive processing in the brain. Changes in the strength of synaptic connections between neurons are believed to mediate processes of learning and memory in the brain. A computational theory of this synaptic plasticity was first provided by Donald Hebb within the context of a more general neural coding mechanism, whereby phase sequences of activity directed by ongoing external and internal dynamics propagate in mutually exciting ensembles of neurons.
Empirical evidence for this cell assembly model has been obtained in the hippocampus, where neuronal ensembles encoding for spatial location repeatedly fire in sequence at different phases of the ongoing theta oscillation. To investigate the encoding and reactivation of these dual coded activity patterns, we examine a biologically inspired spiking neural network model of the hippocampus with a novel synaptic plasticity rule. We demonstrate that this allows the rapid development of both symmetric and asymmetric connections between neurons that fire at concurrent or consecutive theta phase respectively. Recall activity, corresponding to both pattern completion and sequence prediction, can subsequently be produced by partial external cues. This allows the reconciliation of two previously disparate classes of hippocampal model and provides a framework for further examination of cell assembly dynamics in spiking neural networks.