1
Braun W, Memmesheimer RM. High-frequency oscillations and sequence generation in two-population models of hippocampal region CA1. PLoS Comput Biol 2022; 18:e1009891. PMID: 35176028; PMCID: PMC8890743; DOI: 10.1371/journal.pcbi.1009891.
Abstract
Hippocampal sharp wave/ripple oscillations are a prominent pattern of collective activity, consisting of a strong overall increase of activity with superimposed (140–200 Hz) ripple oscillations. Despite its prominence and its experimentally demonstrated importance for memory consolidation, the mechanisms underlying its generation are to date not understood. Several models assume that recurrent networks of inhibitory cells alone can explain the generation and main characteristics of the ripple oscillations. Recent experiments, however, indicate that in vivo the pattern requires, in addition to inhibitory basket cells, the activity of the local population of excitatory pyramidal cells. Here, we study a model for networks in the hippocampal region CA1 incorporating such a local excitatory population of pyramidal neurons. We start by investigating its ability to generate ripple oscillations using extensive simulations. Using biologically plausible parameters, we find that short pulses of external excitation triggering excitatory cell spiking are required for sharp wave/ripple generation with oscillation patterns similar to in vivo observations. Our model has plausible values for single-neuron, synapse and connectivity parameters, random connectivity and no strong feedforward drive to the inhibitory population. Specifically, whereas temporally broad excitation can lead to high-frequency oscillations in the ripple range, sparse pyramidal cell activity is only obtained with pulse-like external CA3 excitation. Further simulations indicate that such short pulses could originate from dendritic spikes in the apical or basal dendrites of CA1 pyramidal cells, triggered by coincident spike arrivals from hippocampal region CA3.
Finally, we show that replay of sequences by pyramidal neurons and ripple oscillations can arise intrinsically in CA1 due to structured connectivity that gives rise to alternating excitatory pulse and inhibitory gap coding; the latter denotes phases of silence in specific basket cell groups, which induce selective disinhibition of groups of pyramidal neurons. This general mechanism for sequence generation leads to sparse pyramidal cell and dense basket cell spiking, does not rely on synfire chain-like feedforward excitation, and may be relevant for other brain regions as well.
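The pulse-triggered interplay of excitatory and inhibitory cells described above can be caricatured in a few lines. The following sketch is a toy leaky integrate-and-fire network with made-up parameters, not the authors' calibrated CA1 model: a brief external pulse is delivered to the excitatory population only, and recurrent inhibition shapes the resulting burst.

```python
import numpy as np

def simulate_ei_network(n_e=200, n_i=50, t_max=100.0, dt=0.1,
                        pulse_t=50.0, pulse_amp=15.0, seed=0):
    """Toy two-population LIF network: a brief external pulse to the
    excitatory (pyramidal) cells recruits fast feedback inhibition.
    All parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n = n_e + n_i
    v = rng.uniform(-70.0, -60.0, n)               # membrane potentials (mV)
    v_rest, v_thresh, v_reset = -70.0, -50.0, -70.0
    tau = 10.0                                      # membrane time constant (ms)
    p_conn = 0.2                                    # random connectivity
    w = (rng.random((n, n)) < p_conn).astype(float)
    w[:, :n_e] *= 0.8                               # excitatory jumps (mV)
    w[:, n_e:] *= -2.0                              # inhibitory jumps (mV)
    np.fill_diagonal(w, 0.0)
    spikes = []                                     # (time, neuron id)
    for step in range(int(t_max / dt)):
        t = step * dt
        i_ext = np.zeros(n)
        if pulse_t <= t < pulse_t + 2.0:            # 2 ms pulse to E cells only
            i_ext[:n_e] = pulse_amp
        v += dt / tau * (v_rest - v) + i_ext * dt
        fired = v >= v_thresh
        if fired.any():
            spikes.extend((t, i) for i in np.flatnonzero(fired))
            v += w[:, fired].sum(axis=1)            # propagate synaptic jumps
            v[fired] = v_reset
    return spikes
```

Before the pulse the network is silent; the pulse ignites excitatory spiking, which in turn drives the inhibitory population.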
Affiliation(s)
- Wilhelm Braun
- Neural Network Dynamics and Computation, Institute of Genetics, University of Bonn, Bonn, Germany
- Institute of Computational Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Raoul-Martin Memmesheimer
- Neural Network Dynamics and Computation, Institute of Genetics, University of Bonn, Bonn, Germany
2
Lynch N, Mallmann-Trenn F. Learning hierarchically-structured concepts. Neural Netw 2021; 143:798-817. PMID: 34488015; DOI: 10.1016/j.neunet.2021.07.033.
Abstract
We use a recently developed synchronous Spiking Neural Network (SNN) model to study the problem of learning hierarchically-structured concepts. We introduce an abstract data model that describes simple hierarchical concepts. We define a feed-forward layered SNN model, with learning modeled using Oja's local learning rule, a well-known biologically plausible rule for adjusting synapse weights. We define what it means for such a network to recognize hierarchical concepts; our notion of recognition is robust, in that it tolerates a bounded amount of noise. Then, we present a learning algorithm by which a layered network may learn to recognize hierarchical concepts according to our robust definition. We analyze correctness and performance rigorously; the amount of time required to learn each concept, after learning all of the sub-concepts, is approximately O((1/(ηk))(ℓmax log(k) + 1/ɛ + b log(k))), where k is the number of sub-concepts per concept, ℓmax is the maximum hierarchical depth, η is the learning rate, ɛ describes the amount of uncertainty allowed in robust recognition, and b describes the amount of weight decrease for "irrelevant" edges. An interesting feature of this algorithm is that it allows the network to learn sub-concepts in a highly interleaved manner. This algorithm assumes that the concepts are presented in a noise-free way; we also extend these results to accommodate noise in the learning process. Finally, we give a simple lower bound saying that, in order to recognize concepts with hierarchical depth two with noise-tolerance, a neural network should have at least two layers. The results in this paper represent first steps in the theoretical study of hierarchical concepts using SNNs. The cases studied here are basic, but they suggest many directions for extensions to more elaborate and realistic cases.
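Oja's rule, the plasticity mechanism named above, is compact enough to state in code. This sketch (parameter values are illustrative assumptions, not the paper's setup) shows the single-synapse update and its classic consequence: the weight vector converges to the first principal component of the inputs, with unit norm.

```python
import numpy as np

def oja_step(w, x, eta=0.01):
    """One Oja update: a Hebbian term eta*y*x plus a decay eta*y**2*w
    that keeps the weight vector implicitly normalized."""
    y = float(w @ x)                 # linear neuron output
    return w + eta * y * (x - y * w)

# repeated presentations drive w toward the dominant input direction
rng = np.random.default_rng(1)
w = rng.normal(size=2)
for _ in range(3000):
    x = rng.normal(size=2) * np.array([2.0, 0.5])  # variance mostly on axis 0
    w = oja_step(w, x)
```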
Affiliation(s)
- Nancy Lynch
- Massachusetts Institute of Technology, Cambridge, MA, USA
3
Polyakov F. Affine differential geometry and smoothness maximization as tools for identifying geometric movement primitives. Biol Cybern 2017; 111:5-24. PMID: 27822891; DOI: 10.1007/s00422-016-0705-7.
Abstract
Neuroscientific studies of drawing-like movements usually analyze the neural representation of either geometric (e.g., direction, shape) or temporal (e.g., speed) parameters of trajectories, rather than the trajectory's representation as a whole. This work is about identifying geometric building blocks of movements by unifying different empirically supported mathematical descriptions that characterize the relationship between geometric and temporal aspects of biological motion. Movement primitives supposedly facilitate the efficiency of movements' representation in the brain and comply with criteria for biological movements such as kinematic smoothness and geometric constraint. The minimum-jerk model formalizes the criterion of maximal trajectory smoothness of order 3. I derive a class of differential equations obeyed by movement paths whose nth-order maximally smooth trajectories accumulate path measurement at a constant rate. A constant rate of accumulating equi-affine arc complies with the 2/3 power-law model. Candidate primitive shapes, identified as the equations' solutions for arcs in different geometries in the plane and in space, are presented. A connection between geometric invariance, motion smoothness, compositionality and performance of the compromised motor control system is proposed within a single invariance-smoothness framework. The derived class of differential equations is a novel tool for discovering candidates for geometric movement primitives.
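The role of the equi-affine arc can be illustrated numerically. In this sketch (my own construction for illustration, not the paper's derivation), the 2/3 power-law speed v = γ·κ^(-1/3) is prescribed along an ellipse, and the rate of equi-affine arc accumulation, v·κ^(1/3), comes out constant and equal to γ.

```python
import numpy as np

# curvature of an ellipse x = a*cos(phi), y = b*sin(phi)
a, b, gamma = 2.0, 1.0, 1.5
phi = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
kappa = a * b / (a**2 * np.sin(phi)**2 + b**2 * np.cos(phi)**2) ** 1.5

# 2/3 power law: speed is gamma * curvature**(-1/3)
v = gamma * kappa ** (-1.0 / 3.0)

# rate of accumulating equi-affine arc, d(sigma)/dt = v * kappa**(1/3)
equi_affine_rate = v * kappa ** (1.0 / 3.0)
```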
Affiliation(s)
- Felix Polyakov
- Department of Mathematics, Ariel University, Ariel, Israel.
- Department of Mathematics, Bar Ilan University, Ramat Gan, Israel.
4
Trengove C, Diesmann M, van Leeuwen C. Dynamic effective connectivity in cortically embedded systems of recurrently coupled synfire chains. J Comput Neurosci 2015; 40:1-26. PMID: 26560334; PMCID: PMC4762935; DOI: 10.1007/s10827-015-0581-5.
Abstract
As a candidate mechanism of neural representation, large numbers of synfire chains can efficiently be embedded in a balanced recurrent cortical network model. Here we study a model in which multiple synfire chains of variable strength are randomly coupled together to form a recurrent system. The system can be implemented both as a large-scale network of integrate-and-fire neurons and as a reduced model. The latter has binary-state pools as basic units but is otherwise isomorphic to the large-scale model, and provides an efficient tool for studying its behavior. Both the large-scale system and its reduced counterpart are able to sustain ongoing endogenous activity in the form of synfire waves, the proliferation of which is regulated by negative feedback caused by collateral noise. Within this equilibrium, diverse repertoires of ongoing activity are observed, including meta-stability and multiple steady states. These states arise in concert with an effective connectivity structure (ECS). The ECS admits a family of effective connectivity graphs (ECGs), parametrized by the mean global activity level. Of these graphs, the strongly connected components and their associated out-components account to a large extent for the observed steady states of the system. These results imply a notion of dynamic effective connectivity as governing neural computation with synfire chains, and related forms of cortical circuitry with complex topologies.
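The flavor of the reduced model, binary-state pools passing activity forward, can be sketched as follows. This is a minimal toy with assumed parameters, omitting the paper's recurrent coupling and background noise: a pool fires a random fraction of its neurons exactly when the preceding pool's activity was suprathreshold.

```python
import numpy as np

def propagate_wave(n_pools=20, pool_size=100, p_fire=0.9,
                   threshold=0.5, seed=0):
    """Toy binary-pool synfire chain: activity hops one pool per step,
    drawn as a binomial fraction whenever the upstream pool exceeds
    threshold (restorative, all-or-none propagation)."""
    rng = np.random.default_rng(seed)
    activity = np.zeros(n_pools)
    activity[0] = 1.0                      # ignite the first pool
    history = [activity.copy()]
    for _ in range(n_pools - 1):
        new = np.zeros(n_pools)
        for i in range(1, n_pools):
            if activity[i - 1] >= threshold:
                new[i] = rng.binomial(pool_size, p_fire) / pool_size
        activity = new
        history.append(activity.copy())
    return np.array(history)
```

The wavefront appears on the diagonal of the returned array: pool t is active at step t, and the wave reaches the end of the chain without dying out.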
Affiliation(s)
- Chris Trengove
- Perceptual Dynamics Laboratory, University of Leuven, Leuven, Belgium.
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany; Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Cees van Leeuwen
- Perceptual Dynamics Laboratory, University of Leuven, Leuven, Belgium; TU Kaiserslautern, Kaiserslautern, Germany
5
St Clair WB, Noelle DC. Implications of polychronous neuronal groups for the continuity of mind. Cogn Process 2015; 16:319-23. PMID: 25630854; DOI: 10.1007/s10339-015-0645-5.
Abstract
Is conceptual space continuous? The answer to this question depends on how concepts are represented in the brain. Vector space representations, which ground conceptual states in the instantaneous firing rates of neurons, have successfully captured cognitive dynamics in a broad range of domains. There is a growing body of evidence, however, that conceptual information is encoded in spatiotemporal patterns of neural spikes, sometimes called polychronous neuronal groups (PNGs). The use of PNGs to represent conceptual states, rather than employing a continuous vector space, introduces new challenges, including issues of temporally extended representations, meaning through symbol grounding, compositionality, and representational similarity. In this article, we explore how PNGs support discontinuous transitions between concepts. While the continuous dynamics of vector space approaches require such transitions to activate intermediate and blended concepts, PNGs offer the means to change the activation of concepts discretely, introducing a form of conceptual dynamics unavailable to vector space models.
6
Petrovici MA, Vogginger B, Müller P, Breitwieser O, Lundqvist M, Muller L, Ehrlich M, Destexhe A, Lansner A, Schüffny R, Schemmel J, Meier K. Characterization and compensation of network-level anomalies in mixed-signal neuromorphic modeling platforms. PLoS One 2014; 9:e108590. PMID: 25303102; PMCID: PMC4193761; DOI: 10.1371/journal.pone.0108590.
Abstract
Advancing the size and complexity of neural network models leads to an ever-increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices, more specifically limited hardware resources, limited parameter configurability and parameter variations due to fixed-pattern noise and trial-to-trial variability. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond the one required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks.
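The simplest of the distortions listed above, fixed-pattern noise, and the spirit of its compensation can be sketched in a few lines. This is a hypothetical toy with an assumed noiseless calibration measurement, not the ESS workflow: each synapse's static gain error is estimated once and divided out of the programmed weight.

```python
import numpy as np

rng = np.random.default_rng(42)
target_w = np.full(100, 1.0)               # weights the model asks for
gain = 1.0 + rng.normal(0.0, 0.2, 100)     # static per-synapse gain error
realized = target_w * gain                 # what uncalibrated hardware delivers

# calibration: measure each synapse's gain once (assumed noiseless here),
# then program the inverse-scaled weight so the realized value matches
measured_gain = realized / target_w
calibrated = (target_w / measured_gain) * gain
```

In practice the gain estimate is itself noisy and the calibration only shrinks, rather than eliminates, the fixed-pattern spread.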
Affiliation(s)
- Mihai A. Petrovici
- Ruprecht-Karls-Universität Heidelberg, Kirchhoff Institute for Physics, Heidelberg, Germany
- Bernhard Vogginger
- Technische Universität Dresden, Institute of Circuits and Systems, Dresden, Germany
- Paul Müller
- Ruprecht-Karls-Universität Heidelberg, Kirchhoff Institute for Physics, Heidelberg, Germany
- Oliver Breitwieser
- Ruprecht-Karls-Universität Heidelberg, Kirchhoff Institute for Physics, Heidelberg, Germany
- Mikael Lundqvist
- Department of Computational Biology, School of Computer Science and Communication, Stockholm University and Royal Institute of Technology, Stockholm, Sweden
- Lyle Muller
- CNRS, Unité de Neuroscience, Information et Complexité, Gif sur Yvette, France
- Matthias Ehrlich
- Technische Universität Dresden, Institute of Circuits and Systems, Dresden, Germany
- Alain Destexhe
- CNRS, Unité de Neuroscience, Information et Complexité, Gif sur Yvette, France
- Anders Lansner
- Department of Computational Biology, School of Computer Science and Communication, Stockholm University and Royal Institute of Technology, Stockholm, Sweden
- René Schüffny
- Technische Universität Dresden, Institute of Circuits and Systems, Dresden, Germany
- Johannes Schemmel
- Ruprecht-Karls-Universität Heidelberg, Kirchhoff Institute for Physics, Heidelberg, Germany
- Karlheinz Meier
- Ruprecht-Karls-Universität Heidelberg, Kirchhoff Institute for Physics, Heidelberg, Germany
7
Abeles M, Diesmann M, Flash T, Geisel T, Herrmann M, Teicher M. Compositionality in neural control: an interdisciplinary study of scribbling movements in primates. Front Comput Neurosci 2013; 7:103. PMID: 24062679; PMCID: PMC3771313; DOI: 10.3389/fncom.2013.00103.
Abstract
This article discusses the compositional structure of hand movements by analyzing and modeling neural and behavioral data obtained from experiments where a monkey (Macaca fascicularis) performed scribbling movements induced by a search task. Using geometrically based approaches to movement segmentation, it is shown that the hand trajectories are composed of elementary segments that are primarily parabolic in shape. The segments could be categorized into a small number of classes on the basis of decreasing intra-class variance over the course of training. A separate classification of the neural data employing a hidden Markov model showed a coincidence of the neural states with the behavioral categories. An additional analysis of both types of data by a data mining method provided evidence that the neural activity patterns underlying the behavioral primitives were formed by sets of specific and precise spike patterns. A geometric description of the movement trajectories, together with precise neural timing data indicates a compositional variant of a realistic synfire chain model. This model reproduces the typical shapes and temporal properties of the trajectories; hence the structure and composition of the primitives may reflect meaningful behavior.
Affiliation(s)
- Moshe Abeles
- Gonda Brain Research Center, Bar Ilan University, Ramat Gan, Israel; Department of Physiology, The Hebrew University of Jerusalem, Jerusalem, Israel
8
Pfeil T, Grübl A, Jeltsch S, Müller E, Müller P, Petrovici MA, Schmuker M, Brüderle D, Schemmel J, Meier K. Six networks on a universal neuromorphic computing substrate. Front Neurosci 2013; 7:11. PMID: 23423583; PMCID: PMC3575075; DOI: 10.3389/fnins.2013.00011.
Abstract
In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.
Affiliation(s)
- Thomas Pfeil
- Kirchhoff-Institute for Physics, Universität Heidelberg, Heidelberg, Germany
9
High-capacity embedding of synfire chains in a cortical network model. J Comput Neurosci 2012; 34:185-209. PMID: 22878688; PMCID: PMC3605496; DOI: 10.1007/s10827-012-0413-9.
Abstract
Synfire chains, sequences of pools linked by feedforward connections, support the propagation of precisely timed spike sequences, or synfire waves. An important question remains, how synfire chains can efficiently be embedded in cortical architecture. We present a model of synfire chain embedding in a cortical scale recurrent network using conductance-based synapses, balanced chains, and variable transmission delays. The network attains substantially higher embedding capacities than previous spiking neuron models and allows all its connections to be used for embedding. The number of waves in the model is regulated by recurrent background noise. We computationally explore the embedding capacity limit, and use a mean field analysis to describe the equilibrium state. Simulations confirm the mean field analysis over broad ranges of pool sizes and connectivity levels; the number of pools embedded in the system trades off against the firing rate and the number of waves. An optimal inhibition level balances the conflicting requirements of stable synfire propagation and limited response to background noise. A simplified analysis shows that the present conductance-based synapses achieve higher contrast between the responses to synfire input and background noise compared to current-based synapses, while regulation of wave numbers is traced to the use of variable transmission delays.
10
Detecting synfire chains in parallel spike data. J Neurosci Methods 2012; 206:54-64. PMID: 22361572; DOI: 10.1016/j.jneumeth.2012.02.003.
Abstract
The synfire chain model of brain organization has received much theoretical attention since its introduction (Abeles, 1982, 1991). However, there has been no convincing experimental demonstration of synfire chains, due partly to limitations of recording technology but also to the lack of appropriate analytic methods for large-scale recordings of parallel spike trains. We have previously published one such method, based on the intersection of the neural populations active at two different times (Schrader et al., 2008). In the present paper we extend this analysis to deal with higher firing rates and noise levels, and develop two additional tools based on properties of repeating firing patterns. All three measures show characteristic signatures if synfire chains underlie the recorded data. However, we demonstrate that the detection of repeating firing patterns alone (as used in several papers) is not enough to infer the presence of synfire chains; positive results from all three measures are needed.
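The population-intersection idea can be sketched in simplified form (a toy reimplementation with fabricated example data, not the published method): count the neurons shared between the populations active in two time bins, and repeated synfire waves show up as off-diagonal bands of high counts.

```python
import numpy as np

def intersection_matrix(active):
    """active[t]: set of neuron ids firing in time bin t. Entry (i, j)
    counts neurons common to bins i and j; a repeating spatiotemporal
    sequence appears as an off-diagonal band of high counts."""
    n = len(active)
    m = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            m[i, j] = len(active[i] & active[j])
    return m

# toy data: a three-pool firing sequence repeats at t = 0 and t = 5
seq = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]
active = seq + [set(), set()] + seq
m = intersection_matrix(active)
```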
11
Asai Y, Villa AEP. Integration and transmission of distributed deterministic neural activity in feed-forward networks. Brain Res 2011; 1434:17-33. PMID: 22071564; DOI: 10.1016/j.brainres.2011.10.012.
Abstract
A ten-layer feed-forward network characterized by diverging/converging patterns of projection between successive layers of regular spiking (RS) neurons is activated by an external spatiotemporal input pattern fed to Layer 1, in the presence of stochastic background activities fed to all layers. We used three dynamical systems to derive the external input spike trains carrying the temporal information, and three types of neuron models for the network: networks formed by neurons with exponential integrate-and-fire dynamics (RS-EIF, Fourcaud-Trocmé et al., 2003), by simple spiking neurons (RS-IZH, Izhikevich, 2004), or by multiple-timescale adaptive threshold neurons (RS-MAT, Kobayashi et al., 2009), given five intensities of background activity. The assessment of the temporal structure embedded in the output spike trains was carried out by detecting the preferred firing sequences for the reconstruction of de-noised spike trains (Asai and Villa, 2008). We confirmed that the RS-MAT model is likely to be more efficient in integrating and transmitting the temporal structure embedded in the external input. We observed that this structure could be propagated not only up to the 10th layer, but in some cases was better retained beyond the 4th downstream layer. This study suggests that diverging/converging network structures could, through the propagation of synfire activity, play a key role in the transmission of complex temporal patterns of discharges associated with deterministic nonlinear activity. This article is part of a Special Issue entitled Neural Coding.
Affiliation(s)
- Yoshiyuki Asai
- Okinawa Institute of Science and Technology, Okinawa, Japan.
12
An imperfect dopaminergic error signal can drive temporal-difference learning. PLoS Comput Biol 2011; 7:e1001133. PMID: 21589888; PMCID: PMC3093351; DOI: 10.1371/journal.pcbi.1001133.
Abstract
An open problem in the field of computational neuroscience is how to link synaptic plasticity to system-level learning. A promising framework in this context is temporal-difference (TD) learning. Experimental evidence that supports the hypothesis that the mammalian brain performs temporal-difference learning includes the resemblance of the phasic activity of the midbrain dopaminergic neurons to the TD error and the discovery that cortico-striatal synaptic plasticity is modulated by dopamine. However, as the phasic dopaminergic signal does not reproduce all the properties of the theoretical TD error, it is unclear whether it is capable of driving behavior adaptation in complex tasks. Here, we present a spiking temporal-difference learning model based on the actor-critic architecture. The model dynamically generates a dopaminergic signal with realistic firing rates and exploits this signal to modulate the plasticity of synapses as a third factor. The predictions of our proposed plasticity dynamics are in good agreement with experimental results with respect to dopamine, pre- and post-synaptic activity. An analytical mapping from the parameters of our proposed plasticity dynamics to those of the classical discrete-time TD algorithm reveals that the biological constraints of the dopaminergic signal entail a modified TD algorithm with self-adapting learning parameters and an adapting offset. We show that the neuronal network is able to learn a task with sparse positive rewards as fast as the corresponding classical discrete-time TD algorithm. However, the performance of the neuronal network is impaired with respect to the traditional algorithm on a task with both positive and negative rewards and breaks down entirely on a task with purely negative rewards. Our model demonstrates that the asymmetry of a realistic dopaminergic signal enables TD learning when learning is driven by positive rewards but not when driven by negative rewards. 
What are the physiological changes that take place in the brain when we solve a problem or learn a new skill? It is commonly assumed that behavior adaptations are realized on the microscopic level by changes in synaptic efficacies. However, this is hard to verify experimentally due to the difficulties of identifying the relevant synapses and monitoring them over long periods during a behavioral task. To address this question computationally, we develop a spiking neuronal network model of actor-critic temporal-difference learning, a variant of reinforcement learning for which neural correlates have already been partially established. The network learns a complex task by means of an internally generated reward signal constrained by recent findings on the dopaminergic system. Our model combines top-down and bottom-up modelling approaches to bridge the gap between synaptic plasticity and system-level learning. It paves the way for further investigations of the dopaminergic system in reward learning in the healthy brain and in pathological conditions such as Parkinson's disease, and can be used as a module in functional models based on brain-scale circuitry.
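The relation between the TD error and the asymmetry discussed above can be illustrated with a tabular toy (a drastic simplification of the spiking actor-critic model; all parameters are assumptions): the error is rectified so that negative errors are only weakly signaled, which leaves learning from positive rewards intact.

```python
import numpy as np

def td_chain(n_states=5, episodes=300, alpha=0.2, gamma=0.9, clip=-0.1):
    """Tabular TD(0) on a one-way chain with a single terminal reward.
    The TD error is rectified at `clip` to mimic a dopaminergic signal
    that encodes negative prediction errors only weakly."""
    v = np.zeros(n_states)
    for _ in range(episodes):
        for s in range(n_states - 1):
            terminal = s == n_states - 2
            r = 1.0 if terminal else 0.0
            v_next = 0.0 if terminal else v[s + 1]
            delta = r + gamma * v_next - v[s]   # TD error
            v[s] += alpha * max(delta, clip)    # asymmetric update
    return v

values = td_chain()
```

With only positive rewards the errors stay positive, so clipping changes nothing and the values converge to the usual discounted profile; a task with frequent negative rewards would instead be learned poorly, mirroring the paper's finding.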
13
Hanuschkin A, Herrmann JM, Morrison A, Diesmann M. Compositionality of arm movements can be realized by propagating synchrony. J Comput Neurosci 2010; 30:675-97. PMID: 20953686; PMCID: PMC3108016; DOI: 10.1007/s10827-010-0285-9.
Abstract
We present a biologically plausible spiking neuronal network model of free monkey scribbling that reproduces experimental findings on cortical activity and the properties of the scribbling trajectory. The model is based on the idea that synfire chains can encode movement primitives. Here, we map the propagation of activity in a chain to a linearly evolving preferred velocity, which results in parabolic segments that fulfill the two-thirds power law. Connections between chains that match the final velocity of one encoded primitive to the initial velocity of the next allow the composition of random sequences of primitives with smooth transitions. The model provides an explanation for the segmentation of the trajectory and the experimentally observed deviations of the trajectory from the parabolic shape at primitive transition sites. Furthermore, the model predicts low frequency oscillations (<10 Hz) of the motor cortex local field potential during ongoing movements and increasing firing rates of non-specific motor cortex neurons before movement onset.
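The mapping from a linearly evolving preferred velocity to parabolic, power-law-compliant segments can be verified directly (an independent numerical sanity check, not the network model): with constant acceleration, the product of speed and κ^(1/3) is constant along the segment, which is exactly the two-thirds power law.

```python
import numpy as np

# linearly evolving velocity v(t) = v0 + a*t traces a parabolic segment
t = np.linspace(0.0, 1.0, 500)
v0 = np.array([1.0, 0.0])
a = np.array([0.0, 2.0])
vel = v0[None, :] + t[:, None] * a[None, :]
speed = np.linalg.norm(vel, axis=1)

# planar curvature with constant acceleration: |v x a| / speed**3
cross = np.abs(vel[:, 0] * a[1] - vel[:, 1] * a[0])
kappa = cross / speed**3

# two-thirds power law: speed * kappa**(1/3) is constant along the segment
power_law_gain = speed * kappa ** (1.0 / 3.0)
```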
Affiliation(s)
- Alexander Hanuschkin
- Functional Neural Circuits Group, Faculty of Biology, Schänzlestrasse 1, 79104, Freiburg, Germany.