251
Toutounji H, Pasemann F. Behavior control in the sensorimotor loop with short-term synaptic dynamics induced by self-regulating neurons. Front Neurorobot 2014; 8:19. PMID: 24904403; PMCID: PMC4033235; DOI: 10.3389/fnbot.2014.00019.
Abstract
The behavior and skills of living systems depend on the distributed control provided by specialized and highly recurrent neural networks. Learning and memory in these systems are mediated by a set of adaptation mechanisms, known collectively as neuronal plasticity. Translating principles of recurrent neural control and plasticity to artificial agents has seen major strides, but is usually hampered by the complex interactions between the agent's body and its environment. One important outstanding issue is for the agent to support multiple stable states of behavior, so that its behavioral repertoire matches the requirements imposed by these interactions. The agent also must have the capacity to switch between these states on time scales comparable to those by which sensory stimulation varies. Achieving this requires a mechanism of short-term memory that allows the neurocontroller to keep track of the recent history of its input, which finds its biological counterpart in short-term synaptic plasticity. This issue is approached here by deriving synaptic dynamics in recurrent neural networks. Neurons are introduced as self-regulating units with a rich repertoire of dynamics. They exhibit homeostatic properties for certain parameter domains, which result in a set of stable states and the required short-term memory. They can also operate as oscillators, which allow them to surpass the level of activity imposed by their homeostatic operation conditions. Neural systems endowed with the derived synaptic dynamics can be utilized for the neural behavior control of autonomous mobile agents. The resulting behavior also depends on the underlying network structure, which is either engineered or developed by evolutionary techniques. The effectiveness of these self-regulating units is demonstrated by controlling locomotion of a hexapod with 18 degrees of freedom, and obstacle-avoidance of a wheel-driven robot.
Affiliation(s)
- Hazem Toutounji
- Department of Neurocybernetics, Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
- Frank Pasemann
- Department of Neurocybernetics, Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
252
Activity-dependent synaptic plasticity of a chalcogenide electronic synapse for neuromorphic systems. Sci Rep 2014; 4:4906. PMID: 24809396; PMCID: PMC4014880; DOI: 10.1038/srep04906.
Abstract
Nanoscale inorganic electronic synapses or synaptic devices, which are capable of emulating the functions of biological synapses of brain neuronal systems, are regarded as the basic building blocks for beyond-Von Neumann computing architecture, combining information storage and processing. Here, we demonstrate a Ag/AgInSbTe/Ag structure for chalcogenide memristor-based electronic synapses. The memristive characteristics with reproducible gradual resistance tuning are utilised to mimic the activity-dependent synaptic plasticity that serves as the basis of memory and learning. Bidirectional long-term Hebbian plasticity modulation is implemented by the coactivity of pre- and postsynaptic spikes, and the sign and degree are affected by assorted factors including the temporal difference, spike rate and voltage. Moreover, synaptic saturation is observed to be an adjustment of Hebbian rules to stabilise the growth of synaptic weights. Our results may contribute to the development of highly functional plastic electronic synapses and the further construction of next-generation parallel neuromorphic computing architecture.
253
Kleberg FI, Fukai T, Gilson M. Excitatory and inhibitory STDP jointly tune feedforward neural circuits to selectively propagate correlated spiking activity. Front Comput Neurosci 2014; 8:53. PMID: 24847242; PMCID: PMC4019846; DOI: 10.3389/fncom.2014.00053.
Abstract
Spike-timing-dependent plasticity (STDP) has been well established between excitatory neurons, and several computational functions have been proposed in various neural systems. Despite some recent efforts, there is a significant lack of functional understanding of inhibitory STDP (iSTDP) and its interplay with excitatory STDP (eSTDP). Here, we demonstrate by analytical and numerical methods that iSTDP contributes crucially to the balance of excitatory and inhibitory weights for the selection of a specific signaling pathway among other pathways in a feedforward circuit. This pathway selection is based on the high sensitivity of STDP to correlations in spike times, which complements a recent proposal for the role of iSTDP in firing-rate-based selection. Our model predicts that asymmetric anti-Hebbian iSTDP outperforms asymmetric Hebbian iSTDP in supporting pathway-specific balance, which we show is useful for propagating transient neuronal responses. Furthermore, we demonstrate how STDPs at excitatory-excitatory, excitatory-inhibitory, and inhibitory-excitatory synapses cooperate to improve the pathway selection. We propose that iSTDP is crucial for shaping the network structure that achieves efficient processing of synchronous spikes.
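The asymmetric Hebbian and anti-Hebbian windows this abstract contrasts can be sketched with a standard pair-based update; the amplitudes and time constant below are illustrative assumptions, not the paper's fitted values.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0, hebbian=True):
    """Pair-based asymmetric STDP window.

    dt = t_post - t_pre in ms. With hebbian=True, pre-before-post (dt > 0)
    potentiates and the reverse order depresses; hebbian=False mirrors the
    window in time, giving the anti-Hebbian shape discussed for iSTDP.
    """
    if not hebbian:
        dt = -dt  # mirror the window
    if dt > 0:
        return a_plus * math.exp(-dt / tau)   # potentiation branch
    if dt < 0:
        return -a_minus * math.exp(dt / tau)  # depression branch
    return 0.0
```

For an inhibitory synapse under the anti-Hebbian variant, post-before-pre pairings strengthen inhibition, which is what lets iSTDP counterbalance correlated excitation along a selected pathway.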
Affiliation(s)
- Florence I Kleberg
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
- Tomoki Fukai
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
- Matthieu Gilson
- Lab for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan
254
Kruskal PB, Li L, MacLean JN. Circuit reactivation dynamically regulates synaptic plasticity in neocortex. Nat Commun 2013; 4:2574. PMID: 24108320; DOI: 10.1038/ncomms3574.
Abstract
Circuit reactivations involve a stereotyped sequence of neuronal firing and have been behaviourally linked to memory consolidation. Here we use multiphoton imaging and patch-clamp recording, and observe sparse and stereotyped circuit reactivations that correspond to UP states within active neurons. To evaluate the effect of the circuit on synaptic plasticity, we trigger a single spike-timing-dependent plasticity (STDP) pairing once per circuit reactivation. The pairings reliably fall within a particular epoch of the circuit sequence and result in long-term potentiation. During reactivation, the amplitude of plasticity significantly correlates with the preceding 20-25 ms of membrane depolarization rather than the depolarization at the time of pairing. This circuit-dependent plasticity provides a natural constraint on synaptic potentiation, regulating the inherent instability of STDP in an assembly phase-sequence model. Subthreshold voltage during endogenous circuit reactivations provides a critical informative context for plasticity and facilitates the stable consolidation of a spatiotemporal sequence.
Affiliation(s)
- Peter B Kruskal
- Committee on Computational Neuroscience, University of Chicago, Chicago, Illinois 60637, USA
255
Chrol-Cannon J, Jin Y. Computational modeling of neural plasticity for self-organization of neural networks. Biosystems 2014; 125:43-54. PMID: 24769242; DOI: 10.1016/j.biosystems.2014.04.003.
Abstract
Self-organization in biological nervous systems during the lifetime is known to largely occur through a process of plasticity that is dependent upon the spike-timing activity in connected neurons. In the field of computational neuroscience, much effort has been dedicated to building up computational models of neural plasticity to replicate experimental data. Most recently, increasing attention has been paid to understanding the role of neural plasticity in functional and structural neural self-organization, as well as its influence on the learning performance of neural networks for accomplishing machine learning tasks such as classification and regression. Although many ideas and hypotheses have been suggested, the relationship between the structure, dynamics and learning performance of neural networks remains elusive. The purpose of this article is to review the most important computational models for neural plasticity and discuss various ideas about neural plasticity's role. Finally, we suggest a few promising research directions, in particular those that combine findings from computational neuroscience and systems biology and exploit their synergistic roles in understanding learning, memory and cognition, thereby bridging the gap between computational neuroscience, systems biology and computational intelligence.
Affiliation(s)
- Joseph Chrol-Cannon
- Department of Computing, University of Surrey, Guildford GU2 7XH, United Kingdom
- Yaochu Jin
- Department of Computing, University of Surrey, Guildford GU2 7XH, United Kingdom
256
Jimenez Rezende D, Gerstner W. Stochastic variational learning in recurrent spiking networks. Front Comput Neurosci 2014; 8:38. PMID: 24772078; PMCID: PMC3983494; DOI: 10.3389/fncom.2014.00038.
Abstract
The ability to learn and perform statistical inference with biologically plausible recurrent networks of spiking neurons is an important step toward understanding perception and reasoning. Here we derive and investigate a new learning rule for recurrent spiking networks with hidden neurons, combining principles from variational learning and reinforcement learning. Our network defines a generative model over spike train histories and the derived learning rule has the form of a local Spike Timing Dependent Plasticity rule modulated by global factors (neuromodulators) conveying information about “novelty” on a statistically rigorous ground. Simulations show that our model is able to learn both stationary and non-stationary patterns of spike trains. We also propose one experiment that could potentially be performed with animals in order to test the dynamics of the predicted novelty signal.
Affiliation(s)
- Danilo Jimenez Rezende
- Laboratory of Cognitive Neuroscience, School of Life Sciences, Brain Mind Institute, Ecole Polytechnique Federale de Lausanne, Lausanne, Vaud, Switzerland; Laboratory of Computational Neuroscience, School of Computer and Communication Sciences, Ecole Polytechnique Federale de Lausanne, Lausanne, Vaud, Switzerland
- Wulfram Gerstner
- Laboratory of Cognitive Neuroscience, School of Life Sciences, Brain Mind Institute, Ecole Polytechnique Federale de Lausanne, Lausanne, Vaud, Switzerland; Laboratory of Computational Neuroscience, School of Computer and Communication Sciences, Ecole Polytechnique Federale de Lausanne, Lausanne, Vaud, Switzerland
257
Abstract
Recent modeling of spike-timing-dependent plasticity indicates that plasticity involves as a third factor a local dendritic potential, besides pre- and postsynaptic firing times. We present a simple compartmental neuron model together with a non-Hebbian, biologically plausible learning rule for dendritic synapses where plasticity is modulated by these three factors. In functional terms, the rule seeks to minimize discrepancies between somatic firings and a local dendritic potential. Such prediction errors can arise in our model from stochastic fluctuations as well as from synaptic input, which directly targets the soma. Depending on the nature of this direct input, our plasticity rule subserves supervised or unsupervised learning. When a reward signal modulates the learning rate, reinforcement learning results. Hence a single plasticity rule supports diverse learning paradigms.
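The three-factor rule this abstract describes can be sketched as a prediction-error update: the weight change is the mismatch between somatic spiking and a rate predicted from the local dendritic potential, gated by the synapse's own input. The sigmoid `phi`, its parameters, and the learning rate below are hypothetical placeholders, not the paper's model.

```python
import math

def phi(v, v0=-65.0, beta=0.25, r_max=100.0):
    """Hypothetical sigmoid mapping dendritic potential (mV) to a predicted
    somatic firing rate (Hz); all parameters are illustrative."""
    return r_max / (1.0 + math.exp(-beta * (v - v0)))

def dendritic_delta_w(post_spike, v_dend, psp, eta=1e-4, dt=1.0):
    """One update of a prediction-error rule in the spirit of the abstract:
    the synapse changes so that the dendritic prediction phi(v_dend) tracks
    the somatic spike train. post_spike is 0 or 1 in a bin of dt ms; the
    error is expressed in spikes/ms and gated by the synapse's own PSP."""
    error = post_spike / dt - phi(v_dend) / 1000.0  # Hz -> spikes/ms
    return eta * error * psp
```

Modulating `eta` by a reward signal turns the same update into a reinforcement rule, mirroring the abstract's point that one plasticity rule supports several learning paradigms.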
258
Bhalla US. Molecular computation in neurons: a modeling perspective. Curr Opin Neurobiol 2014; 25:31-7. DOI: 10.1016/j.conb.2013.11.006.
259
Toutounji H, Pipa G. Spatiotemporal computations of an excitable and plastic brain: neuronal plasticity leads to noise-robust and noise-constructive computations. PLoS Comput Biol 2014; 10:e1003512. PMID: 24651447; PMCID: PMC3961183; DOI: 10.1371/journal.pcbi.1003512.
Abstract
It is a long-established fact that neuronal plasticity plays a central role in generating neural function and computation. Nevertheless, no unifying account exists of how neurons in a recurrent cortical network learn to compute on temporally and spatially extended stimuli. Yet these stimuli constitute the norm, rather than the exception, of the brain's input. Here, we introduce a geometric theory of learning spatiotemporal computations through neuronal plasticity. To that end, we rigorously formulate the problem of neural representations as a relation in space between stimulus-induced neural activity and the asymptotic dynamics of excitable cortical networks. Backed up by computer simulations and numerical analysis, we show that two canonical and widespread forms of neuronal plasticity, that is, spike-timing-dependent synaptic plasticity and intrinsic plasticity, are both necessary for creating neural representations, such that these computations become realizable. Interestingly, the effects of these forms of plasticity on the emerging neural code relate to properties necessary for both combating and utilizing noise. The neural dynamics also exhibits features of the most likely stimulus in the network's spontaneous activity. These properties of the spatiotemporal neural code resulting from plasticity, having their grounding in nature, further consolidate the biological relevance of our findings.
Affiliation(s)
- Hazem Toutounji
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Lower Saxony, Germany
- Gordon Pipa
- Institute of Cognitive Science, University of Osnabrück, Osnabrück, Lower Saxony, Germany
260
Zheng Y, Schwabe L. Shaping synaptic learning by the duration of postsynaptic action potential in a new STDP model. PLoS One 2014; 9:e88592. PMID: 24551122; PMCID: PMC3925143; DOI: 10.1371/journal.pone.0088592.
Abstract
Single spikes and their timing matter in changing synaptic efficacy, which is known as spike-timing-dependent plasticity (STDP). Most previous studies treated spikes as all-or-none events, and considered their duration and magnitude as negligible. Here we explore the effects of action potential (AP) duration on synaptic plasticity in a simplified model neuron using computer simulations. We propose a novel STDP model that depresses synapses using an AP-duration-dependent LTD window and induces potentiation of synaptic strength when presynaptic spikes arrive before and during a postsynaptic AP (dSTDP). We demonstrate that AP duration is another key factor in desensitizing postsynaptic firing and in controlling the shape of the synaptic weight distribution. Extended AP durations produce a wide unimodal weight distribution that resembles those reported experimentally and render the postsynaptic neuron quiescent when disturbed by Poisson noise spike trains, while remaining equally sensitive to synchronized input. Our results suggest that the impact of AP duration, modeled here as an AP-dependent STDP window, on synaptic plasticity can be dramatic and should motivate future STDP studies.
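A toy version of the duration-dependent window described above can be written as follows; the branch structure (LTP before and during the AP, an LTD window whose time constant grows with AP duration) follows the abstract, but every constant is an illustrative assumption.

```python
import math

def dstdp_dw(dt, ap_dur=2.0, a_plus=0.01, a_minus=0.005, tau_ltp=20.0):
    """Toy duration-dependent STDP window (all constants illustrative).

    dt = AP onset time minus presynaptic spike time (ms). Presynaptic spikes
    arriving before onset (dt > 0) or during the AP (-ap_dur <= dt <= 0)
    potentiate; later arrivals fall into an LTD window whose time constant
    scales with the AP duration ap_dur.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau_ltp)  # pre before AP onset: LTP
    if dt >= -ap_dur:
        return a_plus                            # pre during the AP: LTP
    return -a_minus * math.exp((dt + ap_dur) / (10.0 * ap_dur))  # LTD
```

Widening `ap_dur` both lengthens the LTP region and stretches the LTD tail, which is the knob the study uses to reshape the resulting weight distribution.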
Affiliation(s)
- Youwei Zheng
- Faculty of Computer Science and Electrical Engineering, University of Rostock, Rostock, Germany
- Lars Schwabe
- Faculty of Computer Science and Electrical Engineering, University of Rostock, Rostock, Germany
261
Sinha DB, Ledbetter NM, Barbour DL. Spike-timing computation properties of a feed-forward neural network model. Front Comput Neurosci 2014; 8:5. PMID: 24478688; PMCID: PMC3904091; DOI: 10.3389/fncom.2014.00005.
Abstract
Brain function is characterized by dynamical interactions among networks of neurons. These interactions are mediated by network topology at many scales ranging from microcircuits to brain areas. Understanding how networks operate can be aided by understanding how the transformation of inputs depends upon network connectivity patterns, e.g., serial and parallel pathways. To tractably determine how single synapses or groups of synapses in such pathways shape these transformations, we modeled feed-forward networks of 7–22 neurons in which synaptic strength changed according to a spike-timing dependent plasticity (STDP) rule. We investigated how activity varied when dynamics were perturbed by an activity-dependent electrical stimulation protocol (spike-triggered stimulation; STS) in networks of different topologies and background input correlations. STS can successfully reorganize functional brain networks in vivo, but with a variability in effectiveness that may derive partially from the underlying network topology. In a simulated network with a single disynaptic pathway driven by uncorrelated background activity, structured spike-timing relationships between polysynaptically connected neurons were not observed. When background activity was correlated or parallel disynaptic pathways were added, however, robust polysynaptic spike timing relationships were observed, and application of STS yielded predictable changes in synaptic strengths and spike-timing relationships. These observations suggest that precise input-related or topologically induced temporal relationships in network activity are necessary for polysynaptic signal propagation. Such constraints for polysynaptic computation suggest potential roles for higher-order topological structure in network organization, such as maintaining polysynaptic correlation in the face of relatively weak synapses.
Affiliation(s)
- Drew B Sinha
- Laboratory of Sensory Neuroscience and Neuroengineering, Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, USA
- Noah M Ledbetter
- Laboratory of Sensory Neuroscience and Neuroengineering, Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, USA
- Dennis L Barbour
- Laboratory of Sensory Neuroscience and Neuroengineering, Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO, USA
262
Coexistence of reward and unsupervised learning during the operant conditioning of neural firing rates. PLoS One 2014; 9:e87123. PMID: 24475240; PMCID: PMC3903641; DOI: 10.1371/journal.pone.0087123.
Abstract
A fundamental goal of neuroscience is to understand how cognitive processes, such as operant conditioning, are performed by the brain. Typical and well studied examples of operant conditioning, in which the firing rates of individual cortical neurons in monkeys are increased using rewards, provide an opportunity for insight into this. Studies of reward-modulated spike-timing-dependent plasticity (RSTDP), and of other models such as R-max, have reproduced this learning behavior, but they have assumed that no unsupervised learning is present (i.e., no learning occurs without, or independent of, rewards). We show that these models cannot elicit firing rate reinforcement while exhibiting both reward learning and ongoing, stable unsupervised learning. To fix this issue, we propose a new RSTDP model of synaptic plasticity based upon the observed effects that dopamine has on long-term potentiation and depression (LTP and LTD). We show, both analytically and through simulations, that our new model can exhibit unsupervised learning and lead to firing rate reinforcement. This requires that the strengthening of LTP by the reward signal is greater than the strengthening of LTD and that the reinforced neuron exhibits irregular firing. We show the robustness of our findings to spike-timing correlations, to the synaptic weight dependence that is assumed, and to changes in the mean reward. We also consider our model in the differential reinforcement of two nearby neurons. Our model aligns more strongly with experimental studies than previous models and makes testable predictions for future experiments.
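The key condition identified above, that reward must strengthen LTP more than LTD, can be sketched as a multiplicatively modulated pair-based rule; the gains `k_plus > k_minus` encode that condition, and all numerical values are illustrative assumptions rather than the paper's parameters.

```python
import math

def rstdp_dw(dt, reward, a_plus=0.01, a_minus=0.012, tau=20.0,
             k_plus=2.0, k_minus=0.5):
    """Reward-modulated pair-based STDP. dt = t_post - t_pre (ms).

    The reward signal multiplicatively boosts LTP more strongly than LTD
    (k_plus > k_minus), the condition identified above for firing-rate
    reinforcement to coexist with ongoing unsupervised learning.
    """
    if dt > 0:
        return (1.0 + k_plus * reward) * a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -(1.0 + k_minus * reward) * a_minus * math.exp(dt / tau)
    return 0.0
```

With `reward = 0` the rule reduces to plain STDP, so unsupervised learning continues between rewards instead of being switched off.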
263
Vasilaki E, Giugliano M. Emergence of connectivity motifs in networks of model neurons with short- and long-term plastic synapses. PLoS One 2014; 9:e84626. PMID: 24454735; PMCID: PMC3893143; DOI: 10.1371/journal.pone.0084626.
Abstract
Recent experimental data from the rodent cerebral cortex and olfactory bulb indicate that specific connectivity motifs are correlated with short-term dynamics of excitatory synaptic transmission. It was observed that neurons with short-term facilitating synapses form predominantly reciprocal pairwise connections, while neurons with short-term depressing synapses form predominantly unidirectional pairwise connections. The cause of these structural differences in excitatory synaptic microcircuits is unknown. We show that these connectivity motifs emerge in networks of model neurons, from the interactions between short-term synaptic dynamics (SD) and long-term spike-timing dependent plasticity (STDP). While the impact of STDP on SD was shown in simultaneous neuronal pair recordings in vitro, the mutual interactions between STDP and SD in large networks are still the subject of intense research. Our approach combines an SD phenomenological model with an STDP model that faithfully captures long-term plasticity dependence on both spike times and frequency. As a proof of concept, we first simulate and analyze recurrent networks of spiking neurons with random initial connection efficacies and where synapses are either all short-term facilitating or all depressing. For identical external inputs to the network, and as a direct consequence of internally generated activity, we find that networks with depressing synapses evolve unidirectional connectivity motifs, while networks with facilitating synapses evolve reciprocal connectivity motifs. We then show that the same results hold for heterogeneous networks, including both facilitating and depressing synapses. This does not contradict a recent theory that proposes that motifs are shaped by external inputs, but rather complements it by examining the role of both the external inputs and the internally generated network activity. 
Our study highlights the conditions under which SD-STDP might explain the correlation between facilitation and reciprocal connectivity motifs, as well as between depression and unidirectional motifs.
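The short-term dynamics (SD) component discussed above is commonly captured by the Tsodyks-Markram phenomenological model. The sketch below implements one common variant of that model for a train of presynaptic spikes; all parameter values are illustrative, not the fitted values used in the study.

```python
import math

def tsodyks_markram(spike_times, U=0.5, tau_rec=800.0, tau_facil=0.0):
    """Relative PSC amplitudes for a spike train under one common variant of
    the Tsodyks-Markram short-term dynamics model. x is the available
    resource fraction and u the utilisation; tau_facil = 0 gives a purely
    depressing synapse, tau_facil > 0 a facilitating one. Times in ms."""
    x, u = 1.0, U
    last_t = None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)    # resources recover
            if tau_facil > 0.0:
                u = U + (u - U) * math.exp(-dt / tau_facil)  # facilitation decays
        if tau_facil > 0.0:
            u = u + U * (1.0 - u)                            # spike facilitates
        amps.append(u * x)                                   # released fraction
        x -= u * x                                           # resources consumed
        last_t = t
    return amps
```

In the study's setting, whether a synapse behaves like the depressing or the facilitating case determines which connectivity motif (unidirectional vs. reciprocal) STDP carves out.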
Affiliation(s)
- Eleni Vasilaki
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
- Department of Biomedical Sciences, University of Antwerp, Wilrijk, Belgium
- Michele Giugliano
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
- Department of Biomedical Sciences, University of Antwerp, Wilrijk, Belgium
- Brain Mind Institute, Swiss Federal Institute of Technology of Lausanne, Lausanne, Switzerland
264
Abstract
Achieving high-level skills is generally considered to require intense training, which is thought to optimally engage neuronal plasticity mechanisms. Recent work, however, suggests that intensive training may not be necessary for skill learning. Skills can be effectively acquired by a complementary approach in which the learning occurs in response to mere exposure to repetitive sensory stimulation. Such training-independent sensory learning induces lasting changes in perception and goal-directed behaviour in humans, without any explicit task training. We suggest that the effectiveness of this form of learning in different sensory domains stems from the fact that the stimulation protocols used are optimized to alter synaptic transmission and efficacy. While this approach directly links behavioural research in humans with studies on cellular plasticity, other approaches show that learning can occur even in the absence of an actual stimulus. These include learning through imagery or feedback-induced cortical activation, resulting in learning without task training. All these approaches challenge our understanding of the mechanisms that mediate learning. Apparently, humans can learn under conditions thought to be impossible a few years ago. Although the underlying mechanisms are far from being understood, training-independent sensory learning opens novel possibilities for applications aimed at augmenting human cognition.
Affiliation(s)
- Christian Beste
- Institute for Cognitive Neuroscience, Department of Biopsychology, Ruhr-Universität Bochum, Universitätsstrasse 150, D-44780 Bochum, Germany
265
Abstract
The sensory cortex contains a wide array of neuronal types, which are connected together into complex but partially stereotyped circuits. Sensory stimuli trigger cascades of electrical activity through these circuits, causing specific features of sensory scenes to be encoded in the firing patterns of cortical populations. Recent research is beginning to reveal how the connectivity of individual neurons relates to the sensory features they encode, how differences in the connectivity patterns of different cortical cell classes enable them to encode information using different strategies, and how feedback connections from higher-order cortex allow sensory information to be integrated with behavioural context.
266
Xue F, Hou Z, Li X. Computational capability of liquid state machines with spike-timing-dependent plasticity. Neurocomputing 2013. DOI: 10.1016/j.neucom.2013.06.019.
267
Miller A, Jin DZ. Potentiation decay of synapses and length distributions of synfire chains self-organized in recurrent neural networks. Phys Rev E Stat Nonlin Soft Matter Phys 2013; 88:062716. PMID: 24483495; DOI: 10.1103/PhysRevE.88.062716.
Abstract
Synfire chains are thought to underlie precisely timed sequences of spikes observed in various brain regions and across species. How they are formed is not understood. Here we analyze self-organization of synfire chains through the spike-timing dependent plasticity (STDP) of the synapses, axon remodeling, and potentiation decay of synaptic weights in networks of neurons driven by noisy external inputs and subject to dominant feedback inhibition. Potentiation decay is the gradual, activity-independent reduction of synaptic weights over time. We show that potentiation decay enables a dynamic and statistically stable network connectivity when neurons spike spontaneously. Periodic stimulation of a subset of neurons leads to formation of synfire chains through a random recruitment process, which terminates when the chain connects to itself and forms a loop. We demonstrate that chain length distributions depend on the potentiation decay. Fast potentiation decay leads to long chains with wide distributions, while slow potentiation decay leads to short chains with narrow distributions. We suggest that the potentiation decay, which corresponds to the decay of early long-term potentiation of synapses, is an important synaptic plasticity rule in regulating formation of neural circuitry through STDP.
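Potentiation decay as defined above, a gradual, activity-independent relaxation of weights, can be sketched in a few lines; the baseline and decay rate are illustrative assumptions.

```python
def decay_step(weights, w_base=0.2, rate=0.05):
    """One step of activity-independent potentiation decay: every weight
    relaxes a fixed fraction toward the baseline w_base, regardless of
    spiking (both constants are illustrative)."""
    return [w - rate * (w - w_base) for w in weights]

def decay_to(weights, n_steps, **kw):
    """Apply decay_step n_steps times."""
    for _ in range(n_steps):
        weights = decay_step(weights, **kw)
    return weights
```

In the study, the `rate` analogue is the control knob: fast decay prunes unreinforced connections quickly and yields long chains, slow decay yields short ones.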
Affiliation(s)
- Aaron Miller
- Department of Physics, Bridgewater College, Bridgewater, Virginia 22812, USA
- Dezhe Z Jin
- Department of Physics, The Pennsylvania State University, University Park, Pennsylvania 16802, USA
268
Beyeler M, Dutt ND, Krichmar JL. Categorization and decision-making in a neurobiologically plausible spiking network using a STDP-like learning rule. Neural Netw 2013; 48:109-24. DOI: 10.1016/j.neunet.2013.07.012.
269
Kilpatrick ZP, Ermentrout B, Doiron B. Optimizing working memory with heterogeneity of recurrent cortical excitation. J Neurosci 2013; 33:18999-9011. PMID: 24285904; PMCID: PMC6618706; DOI: 10.1523/JNEUROSCI.1641-13.2013.
Abstract
A neural correlate of parametric working memory is a stimulus-specific rise in neuron firing rate that persists long after the stimulus is removed. Network models with local excitation and broad inhibition support persistent neural activity, linking network architecture and parametric working memory. Cortical neurons receive noisy input fluctuations that cause persistent activity to diffusively wander about the network, degrading memory over time. We explore how cortical architecture that supports parametric working memory affects the diffusion of persistent neural activity. Studying both a spiking network and a simplified potential well model, we show that spatially heterogeneous excitatory coupling stabilizes a discrete number of persistent states, reducing the diffusion of persistent activity over the network. However, heterogeneous coupling also coarse-grains the stimulus representation space, limiting the storage capacity of parametric working memory. The storage errors due to coarse-graining and diffusion trade off so that information transfer between the initial and recalled stimulus is optimized at a fixed network heterogeneity. For sufficiently long delay times, the optimal number of attractors is less than the number of possible stimuli, suggesting that memory networks can under-represent stimulus space to optimize performance. Our results clearly demonstrate the combined effects of network architecture and stochastic fluctuations on parametric memory storage.
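The potential-well picture above can be illustrated with a one-dimensional caricature: heterogeneous coupling adds a periodic potential U(x) = -h cos(x) over the remembered position x, pinning it to discrete attractors, while a homogeneous network (h = 0) leaves x free to wander. Noise is omitted for clarity, and all constants are assumptions, not the paper's model.

```python
import math

def drift(x0, h, n_steps=2000, dt=0.1):
    """Deterministic drift of a remembered position x down the gradient of
    U(x) = -h*cos(x). With heterogeneity h > 0 the state is attracted to
    discrete wells at multiples of 2*pi; with h = 0 it is marginally stable
    and keeps whatever value the stimulus set (noise would then make it
    diffuse)."""
    x = x0
    for _ in range(n_steps):
        x += -h * math.sin(x) * dt  # gradient descent on U
    return x
```

The trade-off in the abstract appears here as the well spacing: more wells means finer stimulus resolution but weaker pinning against diffusion.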
Affiliation(s)
- Bard Ermentrout
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, and
- Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania 15213
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, and
- Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania 15213
|
270
|
Abstract
Spike timing-dependent plasticity (STDP) and other conventional Hebbian-type plasticity rules are prone to produce runaway dynamics of synaptic weights. Once potentiated, a synapse has a higher probability of leading to spikes and thus of being further potentiated; once depressed, a synapse tends to be further depressed. Runaway synaptic dynamics can be prevented by precisely balancing STDP rules for potentiation and depression; however, experimental evidence shows a great variety of potentiation and depression windows and magnitudes. Here we show that modifications of synapses to layer 2/3 pyramidal neurons from rat visual and auditory cortices in slices can be induced by intracellular tetanization: bursts of postsynaptic spikes without presynaptic stimulation. Induction of these heterosynaptic changes depended on the rise of intracellular calcium, and their direction and magnitude correlated with the initial state of release mechanisms. We suggest that this type of plasticity serves as a mechanism that stabilizes the distribution of synaptic weights and prevents their runaway dynamics. To test this hypothesis, we develop a cortical neuron model implementing both homosynaptic (STDP) and heterosynaptic plasticity with properties matching the experimental data. We find that heterosynaptic plasticity effectively prevents runaway dynamics for the tested range of STDP and input parameters. Synaptic weights, although shifted from the original, remained normally distributed and nonsaturated. Our study presents a biophysically constrained model of how the interaction of different forms of plasticity, Hebbian and heterosynaptic, may prevent runaway synaptic dynamics and keep synaptic weights unsaturated and thus capable of further plastic changes and formation of new memories.
|
271
|
Albers C, Schmiedt JT, Pawelzik KR. Theta-specific susceptibility in a model of adaptive synaptic plasticity. Front Comput Neurosci 2013; 7:170. [PMID: 24312047 PMCID: PMC3835974 DOI: 10.3389/fncom.2013.00170] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2013] [Accepted: 11/04/2013] [Indexed: 12/13/2022] Open
Abstract
Learning and memory formation are processes which are still not fully understood. It is widely believed that synaptic plasticity is the most important neural substrate for both. However, it has been observed that large-scale theta-band oscillations in the mammalian brain are beneficial for learning, and it is not clear if and how this is linked to synaptic plasticity. Also, the underlying dynamics of synaptic plasticity itself have not been completely uncovered yet, especially for non-linear interactions between multiple spikes. Here, we present a new and simple dynamical model of synaptic plasticity. It incorporates novel contributions to synaptic plasticity, including adaptation processes. We test its ability to reproduce non-linear effects on four different data sets of complex spike patterns, and show that the model can be tuned to reproduce the observed synaptic changes in great detail. When subjected to periodically varying firing rates, even linear pair-based spike-timing-dependent plasticity (STDP) predicts a specific susceptibility of synaptic plasticity to pre- and postsynaptic firing rate oscillations in the theta band. Our model retains this band-pass property, while for high firing rates in the non-linear regime it modifies the specific phase relation required for depression and potentiation. For realistic parameters, maximal synaptic potentiation occurs when postsynaptic activity slightly trails presynaptic activity; anti-phase oscillations tend to depress the synapse. Our results are well in line with experimental findings, providing a straightforward and mechanistic explanation for the importance of theta oscillations for learning.
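The theta-band susceptibility discussed above arises already from standard pair-based STDP. As a point of reference, here is a minimal generic pair-based STDP sketch; the window amplitudes and time constants are illustrative textbook values, not the parameters fitted by this model.

```python
import math

# Generic pair-based STDP window (illustrative parameters, not those
# fitted in the study above): potentiation when the presynaptic spike
# precedes the postsynaptic one, depression for the reverse order.
A_PLUS, A_MINUS = 0.010, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # window time constants in ms

def stdp_window(dt):
    """Weight change for one spike pair; dt = t_post - t_pre in ms."""
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    return -A_MINUS * math.exp(dt / TAU_MINUS)

def total_change(pre_spikes, post_spikes):
    """Sum the pairwise rule over all pre/post spike pairs (all-to-all)."""
    return sum(stdp_window(t_post - t_pre)
               for t_pre in pre_spikes for t_post in post_spikes)

# Pre leading post by 10 ms potentiates; the reverse order depresses.
dw_pot = total_change([0.0], [10.0])
dw_dep = total_change([10.0], [0.0])
```

Driving such a rule with sinusoidally modulated, phase-shifted pre- and postsynaptic rates is the setting in which the band-pass behavior described in the abstract appears.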
Affiliation(s)
- Christian Albers
- Department of Neurophysics, Institute for Theoretical Physics, University of Bremen Bremen, Germany
|
272
|
Zenke F, Hennequin G, Gerstner W. Synaptic plasticity in neural networks needs homeostasis with a fast rate detector. PLoS Comput Biol 2013; 9:e1003330. [PMID: 24244138 PMCID: PMC3828150 DOI: 10.1371/journal.pcbi.1003330] [Citation(s) in RCA: 91] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2013] [Accepted: 09/25/2013] [Indexed: 01/17/2023] Open
Abstract
Hebbian changes of excitatory synapses are driven by and further enhance correlations between pre- and postsynaptic activities. Hence, Hebbian plasticity forms a positive feedback loop that can lead to instability in simulated neural networks. To keep activity at healthy, low levels, plasticity must therefore incorporate homeostatic control mechanisms. We find in numerical simulations of recurrent networks with a realistic triplet-based spike-timing-dependent plasticity rule (triplet STDP) that homeostasis has to detect rate changes on a timescale of seconds to minutes to keep the activity stable. We confirm this result in a generic mean-field formulation of network activity and homeostatic plasticity. Our results strongly suggest the existence of a homeostatic regulatory mechanism that reacts to firing rate changes on the order of seconds to minutes. Learning and memory in the brain are thought to be mediated through Hebbian plasticity. When a group of neurons is repetitively active together, their connections get strengthened. This can cause co-activation even in the absence of the stimulus that triggered the change. To avoid runaway behavior, it is important to prevent neurons from forming excessively strong connections. This is achieved by regulatory homeostatic mechanisms that constrain the overall activity. Here we study the stability of background activity in a recurrent network model with a plausible Hebbian learning rule and homeostasis. We find that the activity in our model is unstable unless homeostasis reacts to rate changes on a timescale of minutes or faster. Since this timescale is incompatible with most known forms of homeostasis, this implies the existence of a previously unknown, rapid homeostatic regulatory mechanism capable of either gating the rate of plasticity or otherwise affecting synaptic efficacies on a short timescale.
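The mean-field argument can be made concrete with a toy model: a Hebbian term that grows with the square of the firing rate, and a homeostatic term driven by a running rate estimate with time constant tau_h. All parameter values below are invented for illustration and are unrelated to the paper's triplet-STDP network.

```python
def simulate(tau_h, t_end=200.0, dt=0.01):
    """Hebbian positive feedback vs. a homeostatic rate detector.

    w     : synaptic weight (dimensionless); output rate r = w * r_in
    r_bar : running rate estimate averaging over ~tau_h seconds
    Returns the final weight; a value of 50.0 means runaway (capped).
    """
    eta, gamma = 1e-4, 0.1          # illustrative learning rates
    r_in, r_target = 10.0, 10.0     # input rate and homeostatic set point (Hz)
    w, r_bar = 1.0, r_target
    for _ in range(int(t_end / dt)):
        r = w * r_in
        r_bar += dt * (r - r_bar) / tau_h
        # Hebbian growth (~r^2) is destabilizing; homeostasis pushes w
        # down whenever the *estimated* rate exceeds the target.
        w += dt * (eta * r * r - gamma * w * (r_bar - r_target))
        if w >= 50.0:
            return 50.0             # runaway: stop integrating
    return w

# A detector averaging over ~1 s keeps the weight bounded; one that
# averages over hours lets the positive feedback run away.
w_fast = simulate(tau_h=1.0)
w_slow = simulate(tau_h=1e4)
```

The qualitative outcome (stable for a fast detector, runaway for a slow one) is the point; the specific numbers carry no biological meaning.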
Affiliation(s)
- Friedemann Zenke
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
- Guillaume Hennequin
- Computational and Biological Learning Laboratory, Department of Engineering, University of Cambridge, Cambridge, United Kingdom
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland
|
273
|
Artola A. Diabetes mellitus- and ageing-induced changes in the capacity for long-term depression and long-term potentiation inductions: Toward a unified mechanism. Eur J Pharmacol 2013; 719:161-169. [DOI: 10.1016/j.ejphar.2013.04.061] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2013] [Revised: 03/21/2013] [Accepted: 04/03/2013] [Indexed: 12/01/2022]
|
274
|
Scaling of topologically similar functional modules defines mouse primary auditory and somatosensory microcircuitry. J Neurosci 2013; 33:14048-60, 14060a. [PMID: 23986241 DOI: 10.1523/jneurosci.1977-13.2013] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Mapping the flow of activity through neocortical microcircuits provides key insights into the underlying circuit architecture. Using a comparative analysis we determined the extent to which the dynamics of microcircuits in mouse primary somatosensory barrel field (S1BF) and auditory (A1) neocortex generalize. We imaged the simultaneous dynamics of up to 1126 neurons spanning multiple columns and layers using high-speed multiphoton imaging. The temporal progression and reliability of reactivation of circuit events in both regions suggested common underlying cortical design features. We used circuit activity flow to generate functional connectivity maps, or graphs, to test the microcircuit hypothesis within a functional framework. S1BF and A1 present a useful test of the postulate as both regions map sensory input anatomically, but each area appears organized according to different design principles. We projected the functional topologies into anatomical space and found benchmarks of organization that had been previously described using physiological and anatomical methods, consistent with a close mapping between anatomy and functional dynamics. By comparing graphs representing activity flow we found that each region is similarly organized, as highlighted by hallmarks of small-world, scale-free, and hierarchically modular topologies. Models of prototypical functional circuits from each area of cortex were sufficient to recapitulate experimentally observed circuit activity. Convergence to common behavior by these models was accomplished using preferential attachment to scale from an auditory up to a somatosensory circuit. These functional data imply that the microcircuit hypothesis should be framed as scalable principles of neocortical circuit design.
|
275
|
Yger P, Harris KD. The Convallis rule for unsupervised learning in cortical networks. PLoS Comput Biol 2013; 9:e1003272. [PMID: 24204224 PMCID: PMC3808450 DOI: 10.1371/journal.pcbi.1003272] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2013] [Accepted: 08/28/2013] [Indexed: 01/26/2023] Open
Abstract
The phenomenology and cellular mechanisms of cortical synaptic plasticity are becoming known in increasing detail, but the computational principles by which cortical plasticity enables the development of sensory representations are unclear. Here we describe a framework for cortical synaptic plasticity termed the "Convallis rule", mathematically derived from a principle of unsupervised learning via constrained optimization. Implementation of the rule caused a recurrent cortex-like network of simulated spiking neurons to develop rate representations of real-world speech stimuli, enabling classification by a downstream linear decoder. Applied to spike patterns used in in vitro plasticity experiments, the rule reproduced multiple results including, and extending beyond, STDP; STDP alone, however, produced poorer learning performance. The mathematical form of the rule is consistent with a dual coincidence detector mechanism that has been suggested by experiments in several synaptic classes of juvenile neocortex. Based on this confluence of normative, phenomenological, and mechanistic evidence, we suggest that the rule may approximate a fundamental computational principle of the neocortex.
Affiliation(s)
- Pierre Yger
- UCL Institute of Neurology and UCL Department of Neuroscience, Physiology, and Pharmacology, London, United Kingdom
- Kenneth D. Harris
- UCL Institute of Neurology and UCL Department of Neuroscience, Physiology, and Pharmacology, London, United Kingdom
|
276
|
Popovych OV, Yanchuk S, Tass PA. Self-organized noise resistance of oscillatory neural networks with spike timing-dependent plasticity. Sci Rep 2013; 3:2926. [PMID: 24113385 PMCID: PMC4070574 DOI: 10.1038/srep02926] [Citation(s) in RCA: 58] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2013] [Accepted: 09/25/2013] [Indexed: 01/01/2023] Open
Abstract
Intuitively one might expect independent noise to be a powerful tool for desynchronizing a population of synchronized neurons. We here show that, intriguingly, for oscillatory neural populations with adaptive synaptic weights governed by spike timing-dependent plasticity (STDP) the opposite is true. We found that the mean synaptic coupling in such systems increases dynamically in response to an increase in noise intensity, and that there is an optimal noise level at which the amount of synaptic coupling becomes maximal in a resonance-like manner, as found for stochastic and coherence resonances, although the mechanism in our case is different. This constitutes a noise-induced self-organization of the synaptic connectivity, which effectively counteracts the desynchronizing impact of independent noise over a wide range of noise intensities. Given the attempts to counteract the neural synchrony underlying tinnitus with noisers and maskers, our results may be of clinical relevance.
Affiliation(s)
- Oleksandr V. Popovych
- Institute of Neuroscience and Medicine – Neuromodulation (INM-7), Research Center Jülich, 52425 Jülich, Germany
- Serhiy Yanchuk
- Institute of Mathematics, Humboldt University of Berlin, 10099 Berlin, Germany
- Peter A. Tass
- Institute of Neuroscience and Medicine – Neuromodulation (INM-7), Research Center Jülich, 52425 Jülich, Germany
- Department of Neuromodulation, University of Cologne, 50924 Cologne, Germany
- Clinic for Stereotactic and Functional Neurosurgery, University of Cologne, 50924 Cologne, Germany
|
277
|
Modulation of distal calcium electrogenesis by neuropeptide Y₁ receptors inhibits neocortical long-term depression. J Neurosci 2013; 33:11184-93. [PMID: 23825421 DOI: 10.1523/jneurosci.5595-12.2013] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
In layer 5 neocortical pyramidal neurons, backpropagating action potentials (bAPs) firing at rates above a critical frequency (CF) induce supralinear Ca²⁺ influx and regenerative potentials in apical dendrites. Paired temporally with an EPSP, this Ca²⁺ influx can result in synaptic plasticity. We studied the actions of neuropeptide Y (NPY), an abundant neocortical neuropeptide, on Ca²⁺ influx in layer 5 pyramidal neurons of somatosensory neocortex in Sprague Dawley and Wistar rats, using a combination of somatic and dendritic intracellular recordings and simultaneous Ca²⁺ imaging. Ca²⁺ influx induced by trains of bAPs above a neuron's CF was inhibited by NPY, acting only at the distal dendrite, via Y₁ receptors. NPY does not affect evoked synaptic glutamate release, paired synaptic facilitation, or synaptic rundown in longer trains. Extracellular Cs⁺ did not prevent NPY's postsynaptic effects, suggesting it does not act via either G-protein-activated inwardly rectifying K⁺ conductance (G(IRK)) or hyperpolarization-activated, cyclic nucleotide-gated channels. NPY application suppresses the induction of the long-term depression (LTD) normally caused by pairing 100 EPSPs with bursts of 2 bAPs evoked at a supracritical frequency. These findings suggest that distal dendritic Ca²⁺ influx is necessary for LTD induction, and selective inhibition of this distal dendritic Ca²⁺ influx by NPY can thus regulate synaptic plasticity in layer 5 pyramidal neurons.
|
278
|
Linear transformation of thalamocortical input by intracortical excitation. Nat Neurosci 2013; 16:1324-30. [PMID: 23933750 PMCID: PMC3855439 DOI: 10.1038/nn.3494] [Citation(s) in RCA: 115] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2013] [Accepted: 07/15/2013] [Indexed: 12/13/2022]
Abstract
Neurons in thalamorecipient layers of sensory cortices integrate thalamocortical and intracortical inputs. Although their functional properties can be inherited from the convergence of thalamic inputs, the roles of intracortical circuits in thalamocortical transformation of sensory information remain unclear. Here, by reversibly silencing intracortical excitatory circuits with optogenetic activation of parvalbumin-positive inhibitory neurons in mouse primary visual cortex, we compared visually-evoked thalamocortical input with total excitation in the same layer 4 pyramidal neurons. We found that intracortical excitatory circuits preserve the orientation and direction tuning of thalamocortical excitation, with a linear amplification of thalamocortical signals by about threefold. The spatial receptive field of thalamocortical input is slightly elongated, and is expanded by intracortical excitation in an approximately proportional manner. Thus, intracortical excitatory circuits faithfully reinforce the representation of thalamocortical information, and may influence the size of the receptive field by recruiting additional inputs.
|
279
|
Abstract
Storing and recalling spiking sequences is a general problem the brain needs to solve. It is, however, unclear what type of biologically plausible learning rule is suited to learn a wide class of spatiotemporal activity patterns in a robust way. Here we consider a recurrent network of stochastic spiking neurons composed of both visible and hidden neurons. We derive a generic learning rule that is matched to the neural dynamics by minimizing an upper bound on the Kullback-Leibler divergence from the target distribution to the model distribution. The derived learning rule is consistent with spike-timing-dependent plasticity in that a presynaptic spike preceding a postsynaptic spike elicits potentiation, while the reverse timing leads to depression. Furthermore, the learning rule for synapses that target visible neurons can be matched to the recently proposed voltage-triplet rule. The learning rule for synapses that target hidden neurons is modulated by a global factor, which shares properties with astrocytes and gives rise to testable predictions.
|
280
|
Reato D, Bikson M, Parra LC. Long-term effects of weak electrical stimulation on active neuronal networks. BMC Neurosci 2013. [PMCID: PMC3704781 DOI: 10.1186/1471-2202-14-s1-p308] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
|
281
|
Natural firing patterns reduce sensitivity of synaptic plasticity to spike-timing. BMC Neurosci 2013. [PMCID: PMC3704802 DOI: 10.1186/1471-2202-14-s1-p304] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
|
282
|
Inhibitory interneurons decorrelate excitatory cells to drive sparse code formation in a spiking model of V1. J Neurosci 2013; 33:5475-85. [PMID: 23536063 DOI: 10.1523/jneurosci.4188-12.2013] [Citation(s) in RCA: 64] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Sparse coding models of natural scenes can account for several physiological properties of primary visual cortex (V1), including the shapes of simple cell receptive fields (RFs) and the highly kurtotic firing rates of V1 neurons. Current spiking network models of pattern learning and sparse coding require direct inhibitory connections between the excitatory simple cells, in conflict with the physiological distinction between excitatory (glutamatergic) and inhibitory (GABAergic) neurons (Dale's Law). At the same time, the computational role of inhibitory neurons in cortical microcircuit function has yet to be fully explained. Here we show that adding a separate population of inhibitory neurons to a spiking model of V1 provides conformance to Dale's Law, proposes a computational role for at least one class of interneurons, and accounts for certain observed physiological properties in V1. When trained on natural images, this excitatory-inhibitory spiking circuit learns a sparse code with Gabor-like RFs as found in V1 using only local synaptic plasticity rules. The inhibitory neurons enable sparse code formation by suppressing predictable spikes, which actively decorrelates the excitatory population. The model predicts that only a small number of inhibitory cells is required relative to excitatory cells and that excitatory and inhibitory input should be correlated, in agreement with experimental findings in visual cortex. We also introduce a novel local learning rule that measures stimulus-dependent correlations between neurons to support "explaining away" mechanisms in neural coding.
|
283
|
Uramoto T, Torikai H. A calcium-based simple model of multiple spike interactions in spike-timing-dependent plasticity. Neural Comput 2013; 25:1853-69. [PMID: 23607556 DOI: 10.1162/neco_a_00462] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Spike-timing-dependent plasticity (STDP) is a form of synaptic modification that depends on the relative timings of presynaptic and postsynaptic spikes. In this letter, we propose a calcium-based simple STDP model, described by ordinary differential equations with only three state variables: one representing the intracellular calcium concentration, one the fraction of NMDARs in the open state, and one the synaptic weight. We show that, in spite of its simplicity, the model can reproduce the properties of plasticity that have been experimentally measured in various brain areas (e.g., layer 2/3 and 5 visual cortical slices, hippocampal cultures, and layer 2/3 somatosensory cortical slices) with respect to various patterns of presynaptic and postsynaptic spikes. In addition, comparisons with other STDP models are made, and the significance and advantages of the proposed model are discussed.
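A stripped-down calcium-threshold sketch conveys the flavor of this model class. It keeps only two of the three variables (calcium and weight, omitting the NMDAR gating variable), and all thresholds and rates below are illustrative rather than the values from the letter.

```python
# Calcium-threshold plasticity sketch (illustrative constants): spikes
# drive a decaying calcium trace c; the weight w is potentiated while c
# exceeds an upper threshold and depressed in the intermediate range.
DT = 0.1                        # integration step, ms
TAU_CA = 20.0                   # calcium decay time constant, ms
THETA_D, THETA_P = 1.0, 1.3     # depression / potentiation thresholds
GAMMA_D, GAMMA_P = 0.002, 0.005

def simulate(spike_times_ms, t_end_ms, w0=0.5):
    c, w = 0.0, w0
    spike_steps = {round(t / DT) for t in spike_times_ms}
    for step in range(int(t_end_ms / DT)):
        if step in spike_steps:
            c += 1.0                  # calcium influx per spike
        c -= DT * c / TAU_CA          # exponential decay (forward Euler)
        if c > THETA_P:
            w += DT * GAMMA_P         # high calcium: potentiation
        elif c > THETA_D:
            w -= DT * GAMMA_D         # intermediate calcium: depression
    return w

# A 1 kHz four-spike burst accumulates calcium past the potentiation
# threshold; a single spike never crosses either threshold.
w_burst = simulate([0.0, 1.0, 2.0, 3.0], 50.0)
w_single = simulate([0.0], 50.0)
```

This calcium-accumulation logic is why such models naturally capture multiple-spike interactions that pair-based STDP misses.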
Affiliation(s)
- Takumi Uramoto
- Graduate School of Engineering Science, Osaka University, Toyonaka, Osaka 560-8531, Japan.
|
284
|
Ostojic S, Fusi S. Synaptic encoding of temporal contiguity. Front Comput Neurosci 2013; 7:32. [PMID: 23641210 PMCID: PMC3640208 DOI: 10.3389/fncom.2013.00032] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2013] [Accepted: 03/25/2013] [Indexed: 12/02/2022] Open
Abstract
Often we need to perform tasks in an environment that changes stochastically. In these situations it is important to learn the statistics of sequences of events in order to predict the future and the outcome of our actions. The statistical description of many of these sequences can be reduced to the set of probabilities that a particular event follows another event (temporal contiguity). Under these conditions, it is important to encode and store these transition probabilities in our memory. Here we show that for a large class of synaptic plasticity models, the distribution of synaptic strengths encodes transition probabilities. Specifically, when the synaptic dynamics depend on pairs of contiguous events and the synapses can remember multiple instances of the transitions, the average synaptic weights are a monotonic function of the transition probabilities. The synaptic weights converge to the probability-encoding distribution even when correlations between consecutive synaptic modifications are taken into account. We studied how this distribution depends on the number of synaptic states for a specific model of a multi-state synapse with hard bounds. In the case of bistable synapses, the average synaptic weights are a smooth function of the transition probabilities and the accuracy of the encoding depends on the learning rate. As the number of synaptic states increases, the average synaptic weights become a step function of the transition probabilities. We finally show that the information stored in the synaptic weights can be read out by a simple rate-based neural network. Our study shows that synapses encode transition probabilities under general assumptions, indicating that temporal contiguity is likely to be encoded and harnessed in almost every neural circuit in the brain.
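The monotonic relation between average weight and transition probability can be illustrated with a minimal soft-bounded synapse, nudged up whenever event j follows event i and down otherwise. This is a generic sketch with invented learning dynamics, not the authors' multi-state model.

```python
import random

def learn_transition(p_ij, eta=0.05, n_events=20000, seed=0):
    """Track P(j | i) in a soft-bounded weight; returns the mean weight
    over the last 5000 events (all parameters are illustrative)."""
    rng = random.Random(seed)
    w, tail = 0.5, []
    for step in range(n_events):
        if rng.random() < p_ij:      # event j follows event i
            w += eta * (1.0 - w)     # soft-bounded potentiation
        else:
            w -= eta * w             # soft-bounded depression
        if step >= n_events - 5000:
            tail.append(w)
    return sum(tail) / len(tail)

# The stationary mean weight tracks the transition probability.
w_low = learn_transition(0.2)
w_high = learn_transition(0.8)
```

The expected update is eta * (p_ij - w), so the stationary mean weight equals the transition probability, and the fluctuations around it shrink with the learning rate, matching the abstract's point about encoding accuracy.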
Affiliation(s)
- Srdjan Ostojic
- Department of Neuroscience, Center for Theoretical Neuroscience, Columbia University Medical Center New York, NY, USA ; Department Etudes Cognitives, CNRS, Group for Neural Theory, LNC INSERM U960, Ecole Normale Superieure Paris, France
|
285
|
Ko H, Cossell L, Baragli C, Antolik J, Clopath C, Hofer SB, Mrsic-Flogel TD. The emergence of functional microcircuits in visual cortex. Nature 2013; 496:96-100. [PMID: 23552948 PMCID: PMC4843961 DOI: 10.1038/nature12015] [Citation(s) in RCA: 268] [Impact Index Per Article: 24.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2012] [Accepted: 02/14/2013] [Indexed: 12/03/2022]
Abstract
Sensory processing occurs in neocortical microcircuits in which synaptic connectivity is highly structured and excitatory neurons form subnetworks that process related sensory information. However, the developmental mechanisms underlying the formation of functionally organized connectivity in cortical microcircuits remain unknown. Here we directly relate patterns of excitatory synaptic connectivity to visual response properties of neighbouring layer 2/3 pyramidal neurons in mouse visual cortex at different postnatal ages, using two-photon calcium imaging in vivo and multiple whole-cell recordings in vitro. Although neural responses were already highly selective for visual stimuli at eye opening, neurons responding to similar visual features were not yet preferentially connected, indicating that the emergence of feature selectivity does not depend on the precise arrangement of local synaptic connections. After eye opening, local connectivity reorganized extensively: more connections formed selectively between neurons with similar visual responses and connections were eliminated between visually unresponsive neurons, but the overall connectivity rate did not change. We propose a sequential model of cortical microcircuit development based on activity-dependent mechanisms of plasticity whereby neurons first acquire feature preference by selecting feedforward inputs before the onset of sensory experience--a process that may be facilitated by early electrical coupling between neuronal subsets--and then patterned input drives the formation of functional subnetworks through a redistribution of recurrent synaptic connections.
Affiliation(s)
- Ho Ko
- Department of Neuroscience, Physiology and Pharmacology, University College London, 21 University Street, London WC1E 6DE, UK
|
286
|
Bar-Ilan L, Gidon A, Segev I. The role of dendritic inhibition in shaping the plasticity of excitatory synapses. Front Neural Circuits 2013; 6:118. [PMID: 23565076 PMCID: PMC3615258 DOI: 10.3389/fncir.2012.00118] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2012] [Accepted: 12/19/2012] [Indexed: 11/17/2022] Open
Abstract
Using computational tools we explored the impact of local synaptic inhibition on the plasticity of excitatory synapses in dendrites. The latter critically depends on the intracellular concentration of calcium, which in turn, depends on membrane potential and thus on inhibitory activity in particular dendritic compartments. We systematically characterized the dependence of excitatory synaptic plasticity on dendritic morphology, loci and strength, as well as on the spatial distribution of inhibitory synapses and on the level of excitatory activity. Plasticity of excitatory synapses may attain three states: “protected” (unchanged), potentiated (long-term potentiation; LTP), or depressed (long-term depression; LTD). The transition between these three plasticity states could be finely tuned by synaptic inhibition with high spatial resolution. Strategic placement of inhibition could give rise to the co-existence of all three states over short dendritic branches. We compared the plasticity effect of the innervation patterns typical of different inhibitory subclasses—Chandelier, Basket, Martinotti, and Double Bouquet—in a detailed model of a layer 5 pyramidal cell. Our study suggests that dendritic inhibition plays a key role in shaping and fine-tuning excitatory synaptic plasticity in dendrites.
Affiliation(s)
- Lital Bar-Ilan
- Department of Neurobiology, The Hebrew University of Jerusalem Israel
|
287
|
Borisyuk R, Chik D, Kazanovich Y, da Silva Gomes J. Spiking neural network model for memorizing sequences with forward and backward recall. Biosystems 2013; 112:214-23. [PMID: 23562400 DOI: 10.1016/j.biosystems.2013.03.018] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2012] [Revised: 02/22/2013] [Accepted: 03/26/2013] [Indexed: 02/07/2023]
Abstract
We present an oscillatory network of conductance-based spiking neurons of Hodgkin-Huxley type as a model of memory storage and retrieval of sequences of events (or objects). The model is inspired by psychological and neurobiological evidence on sequential memories. The building block of the model is an oscillatory module which contains excitatory and inhibitory neurons with all-to-all connections. The connection architecture comprises two layers. A lower layer represents consecutive events during their storage and recall. This layer is composed of oscillatory modules. Plastic excitatory connections between the modules are implemented using an STDP-type learning rule for sequential storage. Excitatory neurons in the upper layer project star-like modifiable connections toward the excitatory lower-layer neurons. These neurons in the upper layer are used to tag sequences of events represented in the lower layer. Computer simulations demonstrate good performance of the model, including difficult cases in which different sequences contain overlapping events. We show that the model with an STDP-type or anti-STDP-type learning rule can be used to simulate forward and backward replay of neural spike sequences, respectively.
Affiliation(s)
- Roman Borisyuk
- School of Computing and Mathematics, University of Plymouth, UK.
|
288
|
Nessler B, Pfeiffer M, Buesing L, Maass W. Bayesian computation emerges in generic cortical microcircuits through spike-timing-dependent plasticity. PLoS Comput Biol 2013; 9:e1003037. [PMID: 23633941 PMCID: PMC3636028 DOI: 10.1371/journal.pcbi.1003037] [Citation(s) in RCA: 112] [Impact Index Per Article: 10.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2012] [Accepted: 03/04/2013] [Indexed: 11/24/2022] Open
Abstract
The principles by which networks of neurons compute, and how spike-timing dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, where pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP in combination with activity-dependent changes in the excitability of neurons. The fundamental components of this emergent Bayesian computation are priors that result from adaptation of neuronal excitability and implicit generative models for hidden causes that are created in the synaptic weights through STDP. In fact, a surprising result is that STDP is able to approximate a powerful principle for fitting such implicit generative models to high-dimensional spike inputs: Expectation Maximization. Our results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes. Furthermore it suggests networks of Bayesian computation modules as a new model for distributed information processing in the cortex.
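The core intuition, that plasticity in a winner-take-all motif fits an implicit generative model much like online Expectation Maximization, can be caricatured in a rate-based form. Everything below (network size, input statistics, learning rate, hard winner selection) is an assumption for illustration; the paper's actual model is a spiking network with STDP and adaptive excitability, not this clustering sketch.

```python
import random

random.seed(1)

# Two hidden "causes", each emitting 6-bit inputs around its own prototype.
def sample():
    proto = [1, 1, 1, 0, 0, 0] if random.randrange(2) == 0 else [0, 0, 0, 1, 1, 1]
    return [b if random.random() < 0.9 else 1 - b for b in proto]  # 10% bit noise

K, N, ETA = 2, 6, 0.05
w = [[random.random() for _ in range(N)] for _ in range(K)]

for _ in range(2000):
    x = sample()
    scores = [sum(wi * xi for wi, xi in zip(w[k], x)) for k in range(K)]
    win = scores.index(max(scores))      # hard winner-take-all (E-step analogue)
    # Only the winner updates, moving its weights toward the input (M-step
    # analogue); over time each unit's weights track the statistics of one cause.
    w[win] = [wi + ETA * (xi - wi) for wi, xi in zip(w[win], x)]

print([round(v, 1) for v in w[0]])
print([round(v, 1) for v in w[1]])
```

Typically the two units specialize, one with high weights on the first three inputs and one on the last three, i.e. the hidden causes have been separated without supervision.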
Affiliation(s)
- Bernhard Nessler
- Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria.
289
Babadi B, Abbott LF. Pairwise analysis can account for network structures arising from spike-timing dependent plasticity. PLoS Comput Biol 2013; 9:e1002906. [PMID: 23436986] [PMCID: PMC3578766] [DOI: 10.1371/journal.pcbi.1002906]
Abstract
Spike timing-dependent plasticity (STDP) modifies synaptic strengths based on timing information available locally at each synapse. Despite this, it induces global structures within a recurrently connected network. We study such structures both through simulations and by analyzing the effects of STDP on pair-wise interactions of neurons. We show how conventional STDP acts as a loop-eliminating mechanism and organizes neurons into in- and out-hubs. Loop-elimination increases when depression dominates and turns into loop-generation when potentiation dominates. STDP with a shifted temporal window such that coincident spikes cause depression enhances recurrent connections and functions as a strict buffering mechanism that maintains a roughly constant average firing rate. STDP with the opposite temporal shift functions as a loop eliminator at low rates and as a potent loop generator at higher rates. In general, studying pairwise interactions of neurons provides important insights about the structures that STDP can produce in large networks.
Affiliation(s)
- Baktash Babadi
- Center for Theoretical Neuroscience, Department of Neuroscience, Columbia University, New York, New York, United States of America.
290
Honnuraiah S, Narayanan R. A calcium-dependent plasticity rule for HCN channels maintains activity homeostasis and stable synaptic learning. PLoS One 2013; 8:e55590. [PMID: 23390543] [PMCID: PMC3563588] [DOI: 10.1371/journal.pone.0055590]
Abstract
Theoretical and computational frameworks for synaptic plasticity and learning have a long and cherished history, with few parallels within the well-established literature for plasticity of voltage-gated ion channels. In this study, we derive rules for plasticity in the hyperpolarization-activated cyclic nucleotide-gated (HCN) channels, and assess the synergy between synaptic and HCN channel plasticity in establishing stability during synaptic learning. To do this, we employ a conductance-based model for the hippocampal pyramidal neuron, and incorporate synaptic plasticity through the well-established Bienenstock-Cooper-Munro (BCM)-like rule for synaptic plasticity, wherein the direction and strength of the plasticity depend on the concentration of calcium influx. Under this framework, we derive a rule for HCN channel plasticity to establish homeostasis in synaptically-driven firing rate, and incorporate such plasticity into our model. In demonstrating that this rule for HCN channel plasticity helps maintain firing rate homeostasis after bidirectional synaptic plasticity, we observe a linear relationship between synaptic plasticity and HCN channel plasticity for maintaining firing rate homeostasis. Motivated by this linear relationship, we derive a calcium-dependent rule for HCN-channel plasticity, and demonstrate that firing rate homeostasis is maintained in the face of synaptic plasticity when moderate and high levels of cytosolic calcium influx induce depression and potentiation of the HCN-channel conductance, respectively. Additionally, we show that such synergy between synaptic and HCN-channel plasticity enhances the stability of synaptic learning through metaplasticity in the BCM-like synaptic plasticity profile. Finally, we demonstrate that the synergistic interaction between synaptic and HCN-channel plasticity preserves robustness of information transfer across the neuron under a rate-coding schema.
Our results establish specific physiological roles for experimentally observed plasticity in HCN channels accompanying synaptic plasticity in hippocampal neurons, and uncover potential links between HCN-channel plasticity and calcium influx, dynamic gain control and stable synaptic learning.
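The qualitative shape of the derived HCN rule can be written as a two-threshold function of the calcium level. The thresholds and rate below are illustrative assumptions, not the paper's fitted values.

```python
# Illustrative thresholds separating "no change", "moderate", and "high" calcium.
THETA_D, THETA_P = 0.3, 0.6   # arbitrary units; assumptions, not fitted values
ETA = 0.1

def dg_h(ca):
    """Change in HCN conductance g_h for a given cytosolic calcium level.

    Moderate calcium depresses g_h and high calcium potentiates it, mirroring
    the BCM-like synaptic rule. Because a larger g_h lowers excitability,
    potentiating g_h alongside synaptic potentiation is what restores
    firing-rate homeostasis.
    """
    if ca < THETA_D:
        return 0.0                      # sub-threshold calcium: no plasticity
    if ca < THETA_P:
        return -ETA * (ca - THETA_D)    # moderate calcium: depression of g_h
    return ETA * (ca - THETA_P)         # high calcium: potentiation of g_h
```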
Affiliation(s)
- Suraj Honnuraiah
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India
- Rishikesh Narayanan
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India
291
Pawlak V, Greenberg DS, Sprekeler H, Gerstner W, Kerr JND. Changing the responses of cortical neurons from sub- to suprathreshold using single spikes in vivo. eLife 2013; 2:e00012. [PMID: 23359858] [PMCID: PMC3552422] [DOI: 10.7554/elife.00012]
Abstract
Action potential (AP) patterns of sensory cortex neurons encode a variety of stimulus features, but how can a neuron change the feature to which it responds? Here, we show that in vivo a spike-timing-dependent plasticity (STDP) protocol, consisting of pairing a postsynaptic AP with visually driven presynaptic inputs, modifies a neuron's AP response in a bidirectional way that depends on the relative AP timing during pairing. Whereas postsynaptic APs repeatedly following presynaptic activation can convert subthreshold into suprathreshold responses, APs repeatedly preceding presynaptic activation reduce AP responses to visual stimulation. These changes were paralleled by restructuring of the neuron's response to surround stimulus locations and of its membrane-potential time course. Computational simulations could reproduce the observed subthreshold voltage changes only when presynaptic temporal jitter was included. Together, this shows that STDP rules can modify the output patterns of sensory neurons and that the timing of single APs plays a crucial role in sensory coding and plasticity. DOI: http://dx.doi.org/10.7554/eLife.00012.001
Nerve cells, called neurons, are one of the core components of the brain and form complex networks by connecting to other neurons via long, thin ‘wire-like’ processes called axons. Axons can extend across the brain, enabling neurons to form connections, or synapses, with thousands of others. It is through these complex networks that incoming information from sensory organs, such as the eye, is propagated through the brain and encoded. The basic unit of communication between neurons is the action potential, often called a ‘spike’, which propagates along the network of axons and, through a chemical process at synapses, communicates with the postsynaptic neurons that the axon is connected to.
These action potentials excite the neuron that they arrive at, and this excitatory process can generate a new action potential that then propagates along the axon to excite additional target neurons. In the visual areas of the cortex, neurons respond with action potentials when they ‘recognize’ a particular feature in a scene, a process called tuning. How a neuron becomes tuned to certain features in the world and not to others is unclear, as are the rules that enable a neuron to change what it is tuned to. What is clear, however, is that to understand this process is to understand the basis of sensory perception. Memory storage and formation are thought to occur at synapses. The efficiency of signal transmission between neurons can increase or decrease over time, and this process is often referred to as synaptic plasticity. But for these synaptic changes to be transmitted to target neurons, the changes must alter the number of action potentials. Although it has been shown in vitro that the efficiency of synaptic transmission (that is, the strength of the synapse) can be altered by changing the order in which the pre- and postsynaptic cells are activated (referred to as ‘spike-timing-dependent plasticity’), this has never been shown to have an effect on the number of action potentials generated in a single neuron in vivo. It is therefore unknown whether this process is functionally relevant. Now Pawlak et al. report that spike-timing-dependent plasticity in the visual cortex of anaesthetized rats can change the spiking of neurons in the visual cortex. They used a visual stimulus (a bar flashed up for half a second) to activate a presynaptic cell, and triggered a single action potential in the postsynaptic cell a very short time later. By repeatedly activating the cells in this way, they increased the strength of the synaptic connection between the two neurons.
After a small number of these pairing activations, presenting the visual stimulus alone to the presynaptic cell was enough to trigger an action potential (a suprathreshold response) in the postsynaptic neuron—even though this was not the case prior to the pairing. This study shows that timing rules known to change the strength of synaptic connections—and proposed to underlie learning and memory—have functional relevance in vivo, and that the timing of single action potentials can change the functional status of a cortical neuron. DOI:http://dx.doi.org/10.7554/eLife.00012.002
Affiliation(s)
- Verena Pawlak
- Network Imaging Group , Max Planck Institute for Biological Cybernetics , Tübingen , Germany
292
Palm G. Neural associative memories and sparse coding. Neural Netw 2013; 37:165-71. [DOI: 10.1016/j.neunet.2012.08.013]
293
van Rossum MCW, Shippi M, Barrett AB. Soft-bound synaptic plasticity increases storage capacity. PLoS Comput Biol 2012; 8:e1002836. [PMID: 23284281] [PMCID: PMC3527223] [DOI: 10.1371/journal.pcbi.1002836]
Abstract
Accurate models of synaptic plasticity are essential to understand the adaptive properties of the nervous system and for realistic models of learning and memory. Experiments have shown that synaptic plasticity depends not only on pre- and post-synaptic activity patterns, but also on the strength of the connection itself: weaker synapses are more easily strengthened than already strong ones. This so-called soft-bound plasticity automatically constrains the synaptic strengths. It is known that this has important consequences for the dynamics of plasticity and the synaptic weight distribution, but its impact on information storage is unknown. In this modeling study we introduce an information-theoretic framework to analyse memory storage in an online learning setting. We show that soft-bound plasticity increases a variety of performance criteria by about 18% over hard-bound plasticity, and likely maximizes the storage capacity of synapses.
It is generally believed that our memories are stored in the synaptic connections between neurons. Numerous experimental studies have therefore examined when and how the synaptic connections change. In parallel, many computational studies have examined the properties of memory and synaptic plasticity, aiming to better understand human memory and allow for neural network models of the brain. However, the plasticity rules used in most studies are highly simplified and do not take into account the rich behaviour found in experiments. For instance, it has been observed in experiments that it is hard to make strong synapses even stronger. Here we show that this saturation of plasticity enhances the number of memories that can be stored and introduce a general framework to calculate information storage in online learning paradigms.
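The distinction between hard- and soft-bound potentiation can be made concrete with a toy weight update. The bound, step size, and iteration count below are illustrative; the quoted 18% figure comes from the paper's information-theoretic analysis, not from this sketch.

```python
W_MAX, ETA = 1.0, 0.5

def potentiate_hard(w):
    """Hard bound: a fixed-size step, clipped at the ceiling."""
    return min(W_MAX, w + ETA)

def potentiate_soft(w):
    """Soft bound: the step shrinks as w approaches W_MAX, so strong
    synapses are harder to strengthen further."""
    return w + ETA * (W_MAX - w)

w_hard = w_soft = 0.0
for _ in range(4):
    w_hard = potentiate_hard(w_hard)
    w_soft = potentiate_soft(w_soft)

print(round(w_hard, 3), round(w_soft, 3))   # prints: 1.0 0.938
```

The hard-bound weight pins to the ceiling after two steps, erasing any trace of further events, while the soft-bound weight keeps approaching it, which is the saturation behaviour the study links to higher storage capacity.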
Affiliation(s)
- Mark C W van Rossum
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom.
294
Use of multi-electrode array recordings in studies of network synaptic plasticity in both time and space. Neurosci Bull 2012; 28:409-22. [PMID: 22833039] [DOI: 10.1007/s12264-012-1251-5]
Abstract
Simultaneous multisite recording using multi-electrode arrays (MEAs) in cultured and acutely-dissociated brain slices and other tissues is an emerging technique in the field of network electrophysiology. Over the past 40 years, great efforts have been made by both scientists and commercial concerns, to advance this technique. The MEA technique has been widely applied to many regions of the brain, retina, heart and smooth muscle in various studies at the network level. The present review starts from the development of MEA techniques and their uses in brain preparations, and then specifically concentrates on the use of MEA recordings in studies of synaptic plasticity at the network level in both the temporal and spatial domains. Because the MEA technique helps bridge the gap between single-cell recordings and behavioral assays, its wide application will undoubtedly shed light on the mechanisms underlying brain functions and dysfunctions at the network level that remained largely unknown due to the technical difficulties before it matured.
295
Feldman DE. The spike-timing dependence of plasticity. Neuron 2012; 75:556-71.
Abstract
In spike-timing-dependent plasticity (STDP), the order and precise temporal interval between presynaptic and postsynaptic spikes determine the sign and magnitude of long-term potentiation (LTP) or depression (LTD). STDP is widely utilized in models of circuit-level plasticity, development, and learning. However, spike timing is just one of several factors (including firing rate, synaptic cooperativity, and depolarization) that govern plasticity induction, and its relative importance varies across synapses and activity regimes. This review summarizes this broader view of plasticity, including the forms and cellular mechanisms of the spike-timing dependence of plasticity and the evidence that spike timing is an important determinant of plasticity in vivo.
Affiliation(s)
- Daniel E Feldman
- Department of Molecular and Cell Biology, and Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA 94720-3200, USA.
296
Cutsuridis V. Bursts shape the NMDA-R mediated spike timing dependent plasticity curve: role of burst interspike interval and GABAergic inhibition. Cogn Neurodyn 2012; 6:421-41. [PMID: 24082963] [PMCID: PMC3438326] [DOI: 10.1007/s11571-012-9205-1]
Abstract
Spike timing dependent plasticity (STDP) is a synaptic learning rule where the relative timing between the presynaptic and postsynaptic action potentials determines the sign and strength of synaptic plasticity. In its basic form, STDP is asymmetric, incorporating both persistent increases and persistent decreases in synaptic strength. This basic form, however, is not a fixed property and depends on dendritic location: an asymmetric curve is observed in the distal dendrites, whereas a symmetrical one is observed in the proximal ones. A recent computational study has shown that the transition from asymmetry to symmetry is due to inhibition under certain conditions. Synapses have also been observed to be unreliable at generating plasticity when excitatory postsynaptic potentials and single spikes are paired at low frequencies. Bursts of spikes, however, are reliably signaled because transmitter release is facilitated. This article presents a two-compartment model of the CA1 pyramidal cell. The model is neurophysiologically plausible, with its dynamics resulting from the interplay of many ionic and synaptic currents. Plasticity is measured by a deterministic Ca(2+) dynamics model which tracks the instantaneous calcium level and its time course in the dendrite and changes the strength of the synapse accordingly. The model is validated to match the asymmetrical form of STDP arising from the pairing of presynaptic (dendritic) and postsynaptic (somatic) spikes, as observed experimentally. With the parameter set unchanged, the model investigates how pairing bursts with single spikes and with other bursts, in the presence or absence of inhibition, shapes the STDP curve. The model predicts that inhibition strength and frequency are not the only factors in the asymmetry-to-symmetry switch of the STDP curve; burst interspike interval is another. This study is an important first step towards understanding how STDP is affected under natural firing patterns in vivo.
297
Knoblauch A, Hauser F, Gewaltig MO, Körner E, Palm G. Does spike-timing-dependent synaptic plasticity couple or decouple neurons firing in synchrony? Front Comput Neurosci 2012; 6:55. [PMID: 22936909] [PMCID: PMC3424530] [DOI: 10.3389/fncom.2012.00055]
Abstract
Spike synchronization is thought to have a constructive role for feature integration, attention, associative learning, and the formation of bidirectionally connected Hebbian cell assemblies. By contrast, theoretical studies on spike-timing-dependent plasticity (STDP) report an inherently decoupling influence of spike synchronization on synaptic connections of coactivated neurons. For example, bidirectional synaptic connections as found in cortical areas could be reproduced only by assuming realistic models of STDP and rate coding. We resolve this conflict by theoretical analysis and simulation of various simple and realistic STDP models that provide a more complete characterization of conditions when STDP leads to either coupling or decoupling of neurons firing in synchrony. In particular, we show that STDP consistently couples synchronized neurons if key model parameters are matched to physiological data: First, synaptic potentiation must be significantly stronger than synaptic depression for small (positive or negative) time lags between presynaptic and postsynaptic spikes. Second, spike synchronization must be sufficiently imprecise, for example, within a time window of 5-10 ms instead of 1 ms. Third, axonal propagation delays should not be much larger than dendritic delays. Under these assumptions synchronized neurons will be strongly coupled leading to a dominance of bidirectional synaptic connections even for simple STDP models and low mean firing rates at the level of spontaneous activity.
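The three conditions can be probed numerically by averaging a pair-based STDP window over jittered synchronous spike pairs. The window amplitudes, time constant, and delay below are illustrative assumptions chosen to satisfy the stated conditions (potentiation dominant at small lags, an axonal delay of a few milliseconds), not parameters from the paper.

```python
import math
import random

random.seed(0)

# Pair-based STDP window with potentiation dominant near zero lag (condition 1).
A_PLUS, A_MINUS, TAU = 1.0, 0.5, 20.0    # illustrative values (ms time constant)
AXONAL_DELAY = 2.0                       # ms; shifts the effective lag (condition 3)

def stdp_dw(dt):
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU)
    return -A_MINUS * math.exp(dt / TAU)

def mean_dw(jitter_ms, n=200_000):
    """Expected weight change for a synchronously firing pair whose spike
    times carry Gaussian jitter; the axonal delay pushes the effective
    post-minus-pre lag toward the depression side of the window."""
    return sum(stdp_dw(random.gauss(0.0, jitter_ms) - AXONAL_DELAY)
               for _ in range(n)) / n

# Precise synchrony (1 ms) yields net depression (decoupling), while imprecise
# synchrony (8 ms, condition 2) yields net potentiation (coupling).
print(mean_dw(1.0) < 0 < mean_dw(8.0))   # prints: True
```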
298
Long-term memory search across the visual brain. Neural Plast 2012; 2012:392695. [PMID: 22900206] [PMCID: PMC3409559] [DOI: 10.1155/2012/392695]
Abstract
Signal transmission from the human retina to visual cortex and the connectivity of visual brain areas are relatively well understood. How specific visual perceptions transform into corresponding long-term memories remains unknown. Here, I review recent Blood Oxygenation Level-Dependent functional Magnetic Resonance Imaging (BOLD fMRI) studies in humans together with molecular biology studies (animal models) aiming to understand how the retinal image is transformed into so-called visual (retinotopic) maps. The broken object paradigm has been chosen in order to illustrate the complexity of multisensory perception of simple objects subject to visual, rather than semantic, memory encoding. The author explores how amygdala projections to the visual cortex affect memory formation and proposes the choice of experimental techniques needed to explain our massive visual memory capacity. Maintenance of visual long-term memories is suggested to require recycling of GluR2-containing α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid receptors (AMPARs) and β2-adrenoreceptors at the postsynaptic membrane, which critically depends on the catalytic activity of the N-ethylmaleimide-sensitive factor (NSF) and the protein kinase PKMζ.
299
Spectral analysis of input spike trains by spike-timing-dependent plasticity. PLoS Comput Biol 2012; 8:e1002584. [PMID: 22792056] [PMCID: PMC3390410] [DOI: 10.1371/journal.pcbi.1002584]
Abstract
Spike-timing-dependent plasticity (STDP) has been observed in many brain areas such as sensory cortices, where it is hypothesized to structure synaptic connections between neurons. Previous studies have demonstrated how STDP can capture spiking information at short timescales using specific input configurations, such as coincident spiking, spike patterns and oscillatory spike trains. However, the corresponding computation in the case of arbitrary input signals is still unclear. This paper provides an overarching picture of the algorithm inherent to STDP, tying together many previous results for commonly used models of pairwise STDP. For a single neuron with plastic excitatory synapses, we show how STDP performs a spectral analysis on the temporal cross-correlograms between its afferent spike trains. The postsynaptic responses and STDP learning window determine kernel functions that specify how the neuron “sees” the input correlations. We thus denote this unsupervised learning scheme as ‘kernel spectral component analysis’ (kSCA). In particular, the whole input correlation structure must be considered since all plastic synapses compete with each other. We find that kSCA is enhanced when weight-dependent STDP induces gradual synaptic competition. For a spiking neuron with a “linear” response and pairwise STDP alone, we find that kSCA resembles principal component analysis (PCA). However, plain STDP does not isolate correlation sources in general, e.g., when they are mixed among the input spike trains. In other words, it does not perform independent component analysis (ICA). Tuning the neuron to a single correlation source can be achieved when STDP is paired with a homeostatic mechanism that reinforces the competition between synaptic inputs. Our results suggest that neuronal networks equipped with STDP can process signals encoded in the transient spiking activity at the timescales of tens of milliseconds for usual STDP. 
Tuning feature extraction of sensory stimuli is an important function for synaptic plasticity models. A widely studied example is the development of orientation preference in the primary visual cortex, which can emerge using moving bars in the visual field. A crucial point is the decomposition of stimuli into basic information tokens, e.g., selecting individual bars even though they are presented in overlapping pairs (vertical and horizontal). Among classical unsupervised learning models, independent component analysis (ICA) is capable of isolating basic tokens, whereas principal component analysis (PCA) cannot. This paper focuses on spike-timing-dependent plasticity (STDP), whose functional implications for neural information processing have been intensively studied both theoretically and experimentally in the last decade. Following recent studies demonstrating that STDP can perform ICA for specific cases, we show how STDP relates to PCA or ICA, and in particular explains the conditions under which it switches between them. Here information at the neuronal level is assumed to be encoded in temporal cross-correlograms of spike trains. We find that a linear spiking neuron equipped with pairwise STDP requires additional mechanisms, such as a homeostatic regulation of its output firing, in order to separate mixed correlation sources and thus perform ICA.
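The PCA limit for a linear neuron can be illustrated with Oja's rule, a classical rate-based Hebbian rule with multiplicative decay. This is only an analogy for the "linear response plus pairwise STDP" regime discussed above, not the kSCA model itself; the input statistics and learning rate are assumptions.

```python
import random

random.seed(0)

# Correlated 2-D inputs whose leading principal component lies along (1, 1).
def sample():
    s = random.gauss(0.0, 1.0)                   # shared correlation source
    return [s + random.gauss(0.0, 0.3), s + random.gauss(0.0, 0.3)]

w, eta = [1.0, 0.0], 0.01
for _ in range(20_000):
    x = sample()
    y = sum(wi * xi for wi, xi in zip(w, x))     # linear neuron response
    # Oja's rule: Hebbian term y*x minus a decay y^2*w that normalizes |w|;
    # the weight vector converges toward the leading eigenvector of the
    # input covariance, i.e., the first principal component.
    w = [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

print([round(wi, 2) for wi in w])   # approximately [0.71, 0.71]
```

Separating mixed sources (the ICA case) requires the extra competition discussed in the abstract, e.g. a homeostatic constraint on the output rate; plain Oja-like dynamics stop at the principal component.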
300
Hunzinger JF, Chan VH, Froemke RC. Learning complex temporal patterns with resource-dependent spike timing-dependent plasticity. J Neurophysiol 2012; 108:551-66. [PMID: 22496526] [DOI: 10.1152/jn.01150.2011]
Abstract
Studies of spike timing-dependent plasticity (STDP) have revealed that long-term changes in the strength of a synapse may be modulated substantially by temporal relationships between multiple presynaptic and postsynaptic spikes. Whereas long-term potentiation (LTP) and long-term depression (LTD) of synaptic strength have been modeled as distinct or separate functional mechanisms, here, we propose a new shared resource model. A functional consequence of our model is fast, stable, and diverse unsupervised learning of temporal multispike patterns with a biologically consistent spiking neural network. Due to interdependencies between LTP and LTD, dendritic delays, and proactive homeostatic aspects of the model, neurons are equipped to learn to decode temporally coded information within spike bursts. Moreover, neurons learn spike timing with few exposures in substantial noise and jitter. Surprisingly, despite having only one parameter, the model also accurately predicts in vitro observations of STDP in more complex multispike trains, as well as rate-dependent effects. We discuss candidate commonalities in natural long-term plasticity mechanisms.