26. Krishnamurthy P, Silberberg G, Lansner A. A cortical attractor network with Martinotti cells driven by facilitating synapses. PLoS One 2012; 7:e30752. [PMID: 22523533] [PMCID: PMC3327695] [DOI: 10.1371/journal.pone.0030752]
Abstract
Pyramidal cells significantly outnumber the inhibitory interneurons in the neocortex, while the diversity of interneuron types is much more pronounced. One acknowledged key role of inhibition is to control the rate and patterning of pyramidal cell firing via negative feedback, but the diversity of inhibitory pathways is most likely matched by a corresponding diversity of functional roles. An important distinguishing feature of cortical interneurons is the variability of the short-term plasticity properties of the synapses they receive from pyramidal cells. The Martinotti cell type has recently come under scrutiny due to the distinctly facilitating nature of these synapses, which distinguishes Martinotti cells from basket cells and other inhibitory interneurons typically targeted by depressing synapses. A key aim of the work reported here has been to pinpoint the functional role of this variability. We first set out to reproduce quantitatively, based on in vitro data, the disynaptic inhibitory microcircuit connecting two pyramidal cells via one or a few Martinotti cells. In a second step, we embedded this microcircuit in a previously developed attractor memory network model of neocortical layers 2/3. This model network demonstrated that basket cells, with their characteristic depressing synapses, are the first to discharge when the network enters an attractor state and that Martinotti cells respond with a delay, thereby shifting the excitation-inhibition balance and acting to terminate the attractor state. A parameter sensitivity analysis suggested that Martinotti cells might in fact play a dominant role in setting the attractor dwell time, and thus the cortical speed of processing, with cellular adaptation and synaptic depression having a less prominent role than previously thought.
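The facilitating versus depressing synapse dynamics central to this abstract are commonly described by the Tsodyks-Markram short-term plasticity formalism. The sketch below is an illustrative event-driven implementation of that standard model, not the paper's fitted circuit; all parameter values (`U`, `tau_f`, `tau_d`) are assumptions chosen only to show the two regimes.

```python
import math

def tm_amplitudes(isis, U, tau_f, tau_d):
    """Event-driven Tsodyks-Markram short-term plasticity.

    For each presynaptic spike, the relative PSP amplitude is u * R, where
    u (utilization) builds up with facilitation time constant tau_f and
    R (available resources) recovers from depression with time constant tau_d.
    `isis` is a list of inter-spike intervals in seconds.
    """
    u, R = U, 1.0
    amps = [u * R]                           # response to the first spike
    for dt in isis:
        ef, ed = math.exp(-dt / tau_f), math.exp(-dt / tau_d)
        R = R * (1.0 - u) * ed + 1.0 - ed    # resources consumed, then recover
        u = u * ef + U * (1.0 - u * ef)      # facilitation decays back toward U
        amps.append(u * R)
    return amps

# Facilitating (Martinotti-like: low U, slow facilitation) vs depressing
# (basket-like: high U) responses to a 20 Hz train; numbers are illustrative.
facil = tm_amplitudes([0.05] * 7, U=0.1, tau_f=0.5, tau_d=0.1)
depr = tm_amplitudes([0.05] * 7, U=0.6, tau_f=0.05, tau_d=0.8)
```

With these parameters the facilitating train grows spike by spike while the depressing train shrinks, reproducing the delayed-recruitment asymmetry the abstract attributes to Martinotti versus basket cells.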
27. Berthet P, Lansner A. An abstract model of the basal ganglia, reward learning and action selection. BMC Neurosci 2011. [PMCID: PMC3240288] [DOI: 10.1186/1471-2202-12-s1-p189]
28. Silverstein D, Lansner A. Scaling of a biophysical neocortical attractor model using Parallel NEURON on the Blue Gene/P. BMC Neurosci 2011. [PMCID: PMC3240291] [DOI: 10.1186/1471-2202-12-s1-p191]
29. Benjaminsson S, Herman P, Lansner A. Odor segmentation and identification in an abstract large-scale model of the mammalian olfactory system. BMC Neurosci 2011. [PMCID: PMC3240287] [DOI: 10.1186/1471-2202-12-s1-p188]
30. Krishnamurthy P, Silberberg G, Lansner A. A cortical attractor network with dynamic synapses. BMC Neurosci 2011. [PMCID: PMC3240286] [DOI: 10.1186/1471-2202-12-s1-p187]
31.
32. Lansner A. Perceptual and memory functions in a cortex-inspired attractor network model. BMC Neurosci 2011. [PMCID: PMC3240167] [DOI: 10.1186/1471-2202-12-s1-k2]
33. Holst A, Lansner A. Errata: A Flexible and Fault Tolerant Query-Reply System Based on a Bayesian Neural Network. Int J Neural Syst 1997. [DOI: 10.1142/s012906579700061x]
34. Lansner A, Ekeberg Ö. A one-layer feedback artificial neural network with a Bayesian learning rule. Int J Neural Syst 1989. [DOI: 10.1142/s0129065789000499]
Abstract
A probabilistic artificial neural network is presented. It is of a one-layer, feedback-coupled type with graded units. The learning rule is derived from Bayes's rule. Learning is regarded as collecting statistics and recall as a statistical inference process. Units correspond to events and connections come out as compatibility coefficients in a logarithmic combination rule. The input to a unit via connections from other active units affects the a posteriori belief in the event in question. The new model is compared to an earlier binary model with respect to storage capacity, noise tolerance, etc. in a content addressable memory (CAM) task. The new model is a real time network and some results on the reaction time for associative recall are given. The scaling of learning and relaxation operations is considered together with issues related to representation of information in one-layer artificial neural networks. An extension with complex units is discussed.
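The learning rule this abstract describes, collecting statistics and turning them into log-domain "compatibility coefficients" combined with a prior belief, can be sketched as follows. This is an illustrative reconstruction from the abstract, not the paper's exact formulation; the regularization constant `eps` and the logistic squashing used during recall are assumptions.

```python
import numpy as np

def bcpnn_weights(patterns, eps=1e-6):
    """Derive Bayesian weights from stored binary patterns.

    Connections come out as compatibility coefficients: the log-ratio of
    pairwise co-activation probability to the product of unit probabilities.
    `eps` regularizes empty counts (an assumption; the original model may
    handle zero counts differently).
    """
    X = np.asarray(patterns, dtype=float)     # shape (n_patterns, n_units)
    p_i = X.mean(axis=0) + eps                # unit activation probabilities
    p_ij = (X.T @ X) / X.shape[0] + eps       # pairwise co-activation probabilities
    W = np.log(p_ij / np.outer(p_i, p_i))     # compatibility coefficients
    b = np.log(p_i)                           # bias: prior log-probability
    return W, b

def recall(W, b, cue, steps=5):
    """Iterative recall: support = prior + weighted input, squashed to (0, 1)."""
    s = np.asarray(cue, dtype=float)
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-(b + W @ s)))   # graded units
    return s
```

Cueing the network with a fragment of a stored pattern raises the belief in that pattern's units while strongly suppressing units from competing patterns, which is the content-addressable memory behavior the abstract evaluates.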
35. Auffarth B, Kaplan B, Lansner A. Map formation in the olfactory bulb by axon guidance of olfactory neurons. Front Syst Neurosci 2011; 5:84. [PMID: 22013417] [PMCID: PMC3190187] [DOI: 10.3389/fnsys.2011.00084]
Abstract
The organization of representations in the brain has been observed to locally reflect subspaces of inputs relevant to behavioral or perceptual feature combinations, as in areas receptive to lower- and higher-order features in the visual system. The early olfactory system has developed highly plastic mechanisms, and convergent evidence indicates that projections from primary neurons converge onto the glomerular level of the olfactory bulb (OB) to form a code composed of continuous spatial zones that are differentially active for particular physico-chemical feature combinations, some of which are known to trigger behavioral responses. In a model study of the early human olfactory system, we derive a glomerular organization based on a set of real-world, biologically relevant stimuli, a distribution of receptors that each respond to a set of odorants within similar ranges of molecular properties, and a mechanism of activity-based axon guidance. Apart from demonstrating activity-dependent glomerular formation and reproducing the relationship between glomerular recruitment and concentration, we show that glomerular responses reflect similarities of human odor category perceptions and, further, that a spatial code provides a better correlation than a distributed population code. These results are consistent with evidence of functional compartmentalization in the OB and could suggest a function for the bulb in the encoding of perceptual dimensions.
36. Kaplan B, Benjaminsson S, Lansner A. A large-scale model of the three first stages of the mammalian olfactory system implemented with spiking neurons. BMC Neurosci 2011. [PMCID: PMC3240284] [DOI: 10.1186/1471-2202-12-s1-p185]
37. Silverstein DN, Lansner A. Is attentional blink a byproduct of neocortical attractors? Front Comput Neurosci 2011; 5:13. [PMID: 21625630] [PMCID: PMC3096845] [DOI: 10.3389/fncom.2011.00013]
Abstract
This study proposes a computational model for attentional blink or “blink of the mind,” a phenomenon in which a human subject misses perception of a later expected visual pattern when two expected visual patterns are presented less than 500 ms apart. A neocortical patch modeled as an attractor network is stimulated with a sequence of 14 patterns 100 ms apart, two of which are expected targets. Patterns that become active attractors are considered recognized. A neocortical patch is represented as a square matrix of hypercolumns, each containing a set of minicolumns with synaptic connections within and across both minicolumns and hypercolumns. Each minicolumn consists of locally connected layer 2/3 pyramidal cells with interacting basket cells and layer 4 pyramidal cells for input stimulation. All neurons are implemented using the Hodgkin–Huxley multi-compartmental cell formalism and include calcium dynamics, and they interact via saturating and depressing AMPA/NMDA and GABAA synapses. Stored patterns are encoded with global connectivity of minicolumns across hypercolumns, and active patterns compete as a result of lateral inhibition in the network. Stored patterns were stimulated over time intervals to create attractor interference measurable with synthetic spike traces. This setup corresponds to item presentations in human visual attentional blink studies. Stored target patterns were depolarized while distractor patterns were hyperpolarized to represent expectation of items in working memory. Simulations replicated the basic attentional blink phenomenon and showed a reduced blink when targets were more salient. Studies on the inhibitory effect of benzodiazepines on attentional blink in human subjects were compared with neocortical simulations in which the GABAA receptor conductance and decay time were increased. Simulations showed increases in attentional blink duration, agreeing with observations in human studies. In addition, a sensitivity analysis was performed on key parameters of the model, including the Ca2+-gated K+ channel conductance, synaptic depression, the GABAA channel conductance, and the NMDA/AMPA ratio of charge entry.
38. Brüderle D, Petrovici MA, Vogginger B, Ehrlich M, Pfeil T, Millner S, Grübl A, Wendt K, Müller E, Schwartz MO, de Oliveira DH, Jeltsch S, Fieres J, Schilling M, Müller P, Breitwieser O, Petkov V, Muller L, Davison AP, Krishnamurthy P, Kremkow J, Lundqvist M, Muller E, Partzsch J, Scholze S, Zühl L, Mayr C, Destexhe A, Diesmann M, Potjans TC, Lansner A, Schüffny R, Schemmel J, Meier K. A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems. Biol Cybern 2011; 104:263-296. [PMID: 21618053] [DOI: 10.1007/s00422-011-0435-9]
Abstract
In this article, we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim to establish this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
39. Lundqvist M, Herman P, Lansner A. Theta and gamma power increases and alpha/beta power decreases with memory load in an attractor network model. J Cogn Neurosci 2011; 23:3008-20. [PMID: 21452933] [DOI: 10.1162/jocn_a_00029]
Abstract
Changes in oscillatory brain activity are strongly correlated with performance in cognitive tasks, and modulations in specific frequency bands are associated with working memory tasks. Mesoscale network models allow the study of oscillations as an emergent feature of neuronal activity. Here we extend a previously developed attractor network model, shown to faithfully reproduce single-cell activity during retention and memory recall, with synaptic augmentation. This enables the network to function as a multi-item working memory by cyclic reactivation of up to six items. The reactivation happens at theta frequency, consistent with recent experimental findings, with increasing theta power for each additional item loaded into the network's memory. Furthermore, each memory reactivation is associated with gamma oscillations. Thus, single-cell spike trains as well as gamma oscillations in local groups are nested in the theta cycle. The network also exhibits an idling rhythm in the alpha/beta band associated with a noncoding global attractor. Taken together, the resulting effect is increasing theta and gamma power and decreasing alpha/beta power with growing working memory load, rendering the network mechanisms involved a plausible explanation for this often-reported behavior.
40. Fonollosa J, Gutierrez-Galvez A, Lansner A, Martinez D, Rospars J, Beccherelli R, Perera A, Pearce T, Vershure P, Persaud K, Marco S. Biologically Inspired Computation for Chemical Sensing. Procedia Comput Sci 2011. [DOI: 10.1016/j.procs.2011.09.066]
41. Benjaminsson S, Fransson P, Lansner A. A novel model-free data analysis technique based on clustering in a mutual information space: application to resting-state FMRI. Front Syst Neurosci 2010; 4. [PMID: 20721313] [PMCID: PMC2922939] [DOI: 10.3389/fnsys.2010.00034]
Abstract
Non-parametric, data-driven analysis techniques can be used to study datasets with few assumptions about the data and underlying experiment. Variations of independent component analysis (ICA) have been the methods most used on fMRI data, e.g., in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis at the voxel level. It has good scalability properties, and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to place the voxels in a lower-dimensional space reflecting the dependency relations based on the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This opens the way for rapid analysis and visualization of the data at different spatial levels, as well as for automatically finding a suitable number of decomposition components.
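The pipeline this abstract outlines, pairwise mutual information between voxel time courses, conversion to a distance matrix, and embedding in a lower-dimensional space where clustering is performed, can be sketched as below. The histogram MI estimator, the max-minus-MI distance conversion, and classical MDS are simplifying assumptions for illustration, not necessarily the paper's exact choices.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of the mutual information between two time series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)           # marginal of x
    py = pxy.sum(axis=0, keepdims=True)           # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mi_space(voxels, bins=8, dims=2):
    """Pairwise MI -> distance matrix -> classical MDS embedding.

    `voxels` has shape (n_voxels, n_timepoints); returns (n_voxels, dims)
    coordinates in which statistically dependent voxels lie close together,
    ready for any standard clustering step.
    """
    n = voxels.shape[0]
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            M[i, j] = M[j, i] = mutual_information(voxels[i], voxels[j], bins)
    D = M.max() - M                                # high MI -> small distance
    np.fill_diagonal(D, 0.0)
    # Classical MDS: double-center squared distances, then eigendecompose.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dims]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
```

Running any clustering algorithm on the returned coordinates then groups voxels by statistical dependency, which for resting-state data is what recovers the networks.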
42. Benjaminsson S, Lansner A. Adaptive sensor drift counteraction by a modular neural network. Neurosci Res 2010. [DOI: 10.1016/j.neures.2010.07.2508]
43. Çürüklü B, Lansner A. Configuration-specific facilitation phenomena explained by layer 2/3 summation pools in V1. BMC Neurosci 2009. [DOI: 10.1186/1471-2202-10-s1-p177]
44. Lansner A. Associative memory models: from the cell-assembly theory to biophysically detailed cortex simulations. Trends Neurosci 2009; 32:178-86. [PMID: 19187979] [DOI: 10.1016/j.tins.2008.12.002]
Abstract
The second half of the past century saw the emergence of a theory of cortical associative memory function originating in Donald Hebb's hypotheses on activity-dependent synaptic plasticity and cell-assembly formation and dynamics. This conceptual framework has today developed into a theory of attractor memory that brings together many experimental observations from different sources and levels of investigation into computational models displaying information-processing capabilities such as efficient associative memory and holistic perception. Here, we outline a development that might eventually lead to a neurobiologically grounded theory of cortical associative memory.
45. Benjaminsson S, Fransson P, Lansner A. A novel model-free fMRI data analysis technique based on clustering in a mutual information space. Front Neuroinform 2009. [DOI: 10.3389/conf.neuro.11.2009.08.028]
46. Johansson C, Lansner A. Implementing plastic weights in neural networks using low precision arithmetic. Neurocomputing 2009. [DOI: 10.1016/j.neucom.2008.04.007]
47. Sandström M, Proschinger T, Lansner A. Fuzzy interval representation of olfactory stimulus concentration in an olfactory glomerulus model. BMC Neurosci 2008. [DOI: 10.1186/1471-2202-9-s1-p123]
48. Djurfeldt M, Ekeberg O, Lansner A. Large-scale modeling - a tool for conquering the complexity of the brain. Front Neuroinform 2008; 2:1. [PMID: 18974793] [PMCID: PMC2525974] [DOI: 10.3389/neuro.11.001.2008]
Abstract
Is there any hope of achieving a thorough understanding of higher functions such as perception, memory, thought and emotion, or is the stunning complexity of the brain a barrier that will limit such efforts for the foreseeable future? In this perspective, we discuss methods for handling complexity and approaches to model building, and we point to detailed large-scale models as a new contribution to the toolbox of the computational neuroscientist. We highlight some aspects that distinguish large-scale models and some of the technological challenges they entail.
49. Brette R, Rudolph M, Carnevale T, Hines M, Beeman D, Bower JM, Diesmann M, Morrison A, Goodman PH, Harris FC, Zirpe M, Natschläger T, Pecevski D, Ermentrout B, Djurfeldt M, Lansner A, Rochel O, Vieville T, Muller E, Davison AP, El Boustani S, Destexhe A. Simulation of networks of spiking neurons: a review of tools and strategies. J Comput Neurosci 2007; 23:349-98. [PMID: 17629781] [PMCID: PMC2638500] [DOI: 10.1007/s10827-007-0038-6]
Abstract
We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We then give an overview of the different simulators and simulation environments presently available (restricted to those that are freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource that facilitates identifying the appropriate integration strategy and simulation tool for a given modeling problem related to spiking neural networks.
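One of the strategy distinctions this review benchmarks, clock-driven versus event-driven integration, can be illustrated with a minimal clock-driven leaky integrate-and-fire neuron: the membrane state is updated on a fixed time grid, so input spike times are rounded to grid points, which is exactly the timing-precision issue discussed above. This is a generic textbook sketch, not any benchmarked simulator's code; all parameter values are illustrative.

```python
def lif_clock_driven(spike_times_in, t_end=0.1, dt=1e-4, tau=0.02,
                     v_th=1.0, v_reset=0.0, w=0.3):
    """Clock-driven leaky integrate-and-fire neuron with delta-current input.

    `spike_times_in` are input spike times (s); returns output spike times.
    Input spikes are rounded to the simulation grid (the precision loss
    inherent to clock-driven schemes).
    """
    v, out = 0.0, []
    spikes = set(int(round(t / dt)) for t in spike_times_in)
    for step in range(int(t_end / dt)):
        v += dt * (-v / tau)            # membrane leak (forward Euler)
        if step in spikes:
            v += w                      # instantaneous synaptic kick
        if v >= v_th:
            out.append(step * dt)       # spike time quantized to the grid
            v = v_reset
    return out
```

An event-driven scheme would instead jump analytically between input spikes, preserving exact spike timing; for plasticity rules sensitive to spike timing, the review compares precisely these two choices.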
50. Johansson C, Lansner A. Imposing biological constraints onto an abstract neocortical attractor network model. Neural Comput 2007; 19:1871-96. [PMID: 17521282] [DOI: 10.1162/neco.2007.19.7.1871]
Abstract
In this letter, we study an abstract model of neocortex based on its modularization into mini- and hypercolumns. We discuss a full-scale instance of this model and connect its network properties to the underlying biological properties of neurons in cortex. In particular, we discuss how the biological constraints put on the network determine the network's performance in terms of storage capacity. We show that a network instantiating the model scales well given the biologically constrained parameters on activity and connectivity, which makes this network interesting also as an engineered system. In this model, the minicolumns are grouped into hypercolumns that can be active or quiescent, and the model predicts that only a few percent of the hypercolumns should be active at any one time. With this model, we show that at least 20 to 30 pyramidal neurons should be aggregated into a minicolumn and at least 50 to 60 minicolumns should be grouped into a hypercolumn in order to achieve high storage capacity.
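The modular activity constraint this abstract describes, hypercolumns that are either active or quiescent with competition among their minicolumns, can be sketched as a normalization step applied to unit support values. The hard winner-take-all and the peak-support selection rule below are deliberate simplifications of the model's competition dynamics; function and parameter names are illustrative, not the paper's.

```python
import numpy as np

def hypercolumn_wta(support, n_hyper, n_mini, k_active):
    """Impose sparse modular activity on a vector of unit support values.

    Units are grouped into `n_hyper` hypercolumns of `n_mini` minicolumns.
    Only the `k_active` hypercolumns with the strongest peak support become
    active (the model predicts only a few percent active at any one time),
    and within each active hypercolumn a single winning minicolumn fires.
    """
    s = np.asarray(support, dtype=float).reshape(n_hyper, n_mini)
    out = np.zeros_like(s)
    peaks = s.max(axis=1)                            # best minicolumn per hypercolumn
    active = np.argsort(peaks)[::-1][:k_active]      # strongest hypercolumns stay on
    out[active, s[active].argmax(axis=1)] = 1.0      # one winner per active hypercolumn
    return out.ravel()
```

Applied after each recall step of an attractor network, this keeps activity at the sparse, modular levels the storage-capacity analysis assumes.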