51. Willmore BDB, Mazer JA, Gallant JL. Sparse coding in striate and extrastriate visual cortex. J Neurophysiol 2011; 105:2907-19. [PMID: 21471391; DOI: 10.1152/jn.00594.2010]
Abstract
Theoretical studies of mammalian cortex argue that efficient neural codes should be sparse. However, theoretical and experimental studies have used different definitions of the term "sparse", leading to three assumptions about the nature of sparse codes. First, codes that have high lifetime sparseness require few action potentials. Second, lifetime-sparse codes are also population-sparse. Third, neural codes are optimized to maximize lifetime sparseness. Here, we examine these assumptions in detail and test their validity in primate visual cortex. We show that lifetime and population sparseness are not necessarily correlated and that a code may have high lifetime sparseness regardless of how many action potentials it uses. We measure lifetime sparseness during presentation of natural images in three areas of macaque visual cortex, V1, V2, and V4. We find that lifetime sparseness does not increase across the visual hierarchy. This suggests that the neural code is not simply optimized to maximize lifetime sparseness. We also find that firing rates during a challenging visual task are higher than theoretical values based on metabolic limits and that responses in V1, V2, and V4 are well-described by exponential distributions. These findings are consistent with the hypothesis that neurons are optimized to maximize information transmission subject to metabolic constraints on mean firing rate.
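The distinction between lifetime and population sparseness can be made concrete with the Treves-Rolls/Vinje-Gallant sparseness index commonly used in this literature. A minimal sketch on a synthetic response matrix (the data here are simulated, not the paper's recordings):

```python
import numpy as np

def sparseness(r):
    """Vinje-Gallant sparseness: 0 for uniform responses, 1 for maximally sparse."""
    r = np.asarray(r, dtype=float)
    n = r.size
    a = (r.sum() / n) ** 2 / (np.square(r).sum() / n)
    return (1.0 - a) / (1.0 - 1.0 / n)

rng = np.random.default_rng(0)
# rows = neurons, cols = stimuli; exponential rates, as the paper reports for V1/V2/V4
rates = rng.exponential(scale=5.0, size=(50, 200))

lifetime = [sparseness(row) for row in rates]      # per neuron, across stimuli
population = [sparseness(col) for col in rates.T]  # per stimulus, across neurons
```

For exponentially distributed rates the index sits near 0.5, independent of the mean rate, which illustrates the paper's point that lifetime sparseness says nothing by itself about how many action potentials a code uses.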
Affiliation(s)
- Ben D B Willmore
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom.
52. Assisi C, Stopfer M, Bazhenov M. Using the structure of inhibitory networks to unravel mechanisms of spatiotemporal patterning. Neuron 2011; 69:373-86. [PMID: 21262473; DOI: 10.1016/j.neuron.2010.12.019]
Abstract
Neuronal networks exhibit a rich dynamical repertoire, a consequence of both the intrinsic properties of neurons and the structure of the network. It has been hypothesized that inhibitory interneurons corral principal neurons into transiently synchronous ensembles that encode sensory information and subserve behavior. How does the structure of the inhibitory network facilitate such spatiotemporal patterning? We established a relationship between an important structural property of a network, its colorings, and the dynamics it constrains. Using a model of the insect antennal lobe, we show that our description allows the explicit identification of the groups of inhibitory interneurons that switch, during odor stimulation, between activity and quiescence in a coordinated manner determined by features of the network structure. This description optimally matches the perspective of the downstream neurons looking for synchrony in ensembles of presynaptic cells and allows a low-dimensional description of seemingly complex high-dimensional network activity.
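The structural property at the heart of this paper, a graph coloring of the inhibitory network, can be illustrated with the standard greedy algorithm on a toy connectivity graph (the graph below is invented for illustration; the paper's analysis is far richer):

```python
def greedy_coloring(adj):
    """Assign each node the smallest color not used by its already-colored neighbors."""
    colors = {}
    for node in sorted(adj):
        taken = {colors[nb] for nb in adj[node] if nb in colors}
        c = 0
        while c in taken:
            c += 1
        colors[node] = c
    return colors

# Hypothetical inhibitory network: nodes are interneurons, edges are mutual inhibition.
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4}, 4: {3, 5}, 5: {4},
}
colors = greedy_coloring(adj)
# Nodes sharing a color never inhibit each other, so they are free to be active together,
# which is the link between colorings and coordinated switching the abstract describes.
```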
Affiliation(s)
- Collins Assisi
- Department of Cell Biology and Neuroscience, University of California, Riverside, Riverside, CA 92521, USA.
53.
Abstract
Neurons in the input layer of primary visual cortex in primates develop edge-like receptive fields. One approach to understanding the emergence of this response posits that neural activity must efficiently represent sensory data with respect to the statistics of natural scenes. Furthermore, it is believed that such an efficient coding is achieved using a competition across neurons so as to generate a sparse representation, that is, where a relatively small number of neurons are simultaneously active. Indeed, different models of sparse coding, coupled with Hebbian learning and homeostasis, have been proposed that successfully match the observed emergent response. However, the specific role of homeostasis in learning such sparse representations is still largely unknown. By quantitatively assessing the efficiency of the neural representation during learning, we derive a cooperative homeostasis mechanism that optimally tunes the competition between neurons within the sparse coding algorithm. We apply this homeostasis while learning small patches taken from natural images and compare its efficiency with state-of-the-art algorithms. Results show that while different sparse coding algorithms give similar coding results, homeostasis provides an optimal balance for the representation of natural images within the population of neurons. Competition in sparse coding is optimized when it is fair. By contributing to optimizing statistical competition across neurons, homeostasis is crucial in providing a more efficient solution to the emergence of independent components.
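One way to see why "fair" competition matters is a toy winner-take-all coder with a multiplicative homeostatic gain. This is a deliberately simplified stand-in for the paper's algorithm, and every parameter is illustrative:

```python
import numpy as np

def run(n_steps=4000, homeostasis=True, seed=1):
    rng = np.random.default_rng(seed)
    n_atoms, dim = 20, 16
    # dictionary with deliberately unequal atom norms -> unfair competition
    D = rng.normal(size=(n_atoms, dim)) * rng.uniform(0.5, 2.0, size=(n_atoms, 1))
    gain = np.ones(n_atoms)
    rate = np.full(n_atoms, 1.0 / n_atoms)   # running estimate of each atom's win rate
    counts = np.zeros(n_atoms)
    for _ in range(n_steps):
        x = rng.normal(size=dim)
        k = np.argmax(gain * np.abs(D @ x))  # winner-take-all competition
        counts[k] += 1
        win = np.zeros(n_atoms)
        win[k] = 1.0
        rate = 0.99 * rate + 0.01 * win
        if homeostasis:
            # damp over-used atoms, boost under-used ones toward a uniform win rate
            gain *= np.exp(0.05 * (1.0 - n_atoms * rate))
    return counts / n_steps

usage_fair = run(homeostasis=True)
usage_unfair = run(homeostasis=False)
```

Without the gain term, large-norm atoms monopolize the code; with it, selection frequencies equalize, which is the homeostatic balancing the abstract describes.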
Affiliation(s)
- Laurent U Perrinet
- Institut de Neurosciences Cognitives de Méditerranée, CNRS/University of Provence, 13402 Marseille Cedex 20, France.
54. Linhares A, Chada DM, Aranha CN. The emergence of Miller's magic number on a sparse distributed memory. PLoS One 2011; 6:e15592. [PMID: 21246049; PMCID: PMC3016408; DOI: 10.1371/journal.pone.0015592]
Abstract
Human memory is limited in the number of items held in mind at once, a limit known as "Miller's magic number". We study the emergence of such limits as a result of the statistics of large bitvectors used to represent items in memory, given two postulates: i) the Sparse Distributed Memory; and ii) chunking through averaging. Potential implications for theoretical neuroscience are discussed.
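The "chunking through averaging" postulate can be sketched with high-dimensional bitvectors: a chunk is the bitwise majority of its items, and each item resembles the chunk less and less as the chunk grows. This is a toy reconstruction of the idea, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 10_000   # long random bitvectors, as in sparse distributed memory

def chunk(items):
    """Bundle items into a single vector by per-bit majority vote."""
    votes = items.sum(axis=0)
    out = (2 * votes > len(items)).astype(np.int8)
    ties = 2 * votes == len(items)
    out[ties] = rng.integers(0, 2, ties.sum())   # break ties randomly
    return out

def similarity(a, b):
    return float((a == b).mean())   # 0.5 is chance level for random vectors

sims = []
for k in (3, 5, 7, 9, 11):
    items = rng.integers(0, 2, size=(k, dim), dtype=np.int8)
    c = chunk(items)
    sims.append(float(np.mean([similarity(c, it) for it in items])))
# item-to-chunk similarity decays toward chance as the chunk absorbs more items,
# which is one statistical route to a capacity limit of a few items per chunk
```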
55.
Abstract
The remarkable performance of the olfactory system in classifying and categorizing the complex olfactory environment is built upon several basic neural circuit motifs. These include forms of inhibition that may play comparable roles in widely divergent species. In this issue of Neuron, a new study by Stokes and Isaacson sheds light on how elementary types of inhibition dynamically interact.
Affiliation(s)
- Maxim Bazhenov
- Department of Cell Biology and Neuroscience, University of California, Riverside, CA 92521, USA
56. Generating sparse and selective third-order responses in the olfactory system of the fly. Proc Natl Acad Sci U S A 2010; 107:10713-8. [PMID: 20498080; DOI: 10.1073/pnas.1005635107]
Abstract
In the antennal lobe of Drosophila, information about odors is transferred from olfactory receptor neurons (ORNs) to projection neurons (PNs), which then send axons to neurons in the lateral horn of the protocerebrum (LHNs) and to Kenyon cells (KCs) in the mushroom body. The transformation from ORN to PN responses can be described by a normalization model similar to what has been used in modeling visually responsive neurons. We study the implications of this transformation for the generation of LHN and KC responses under the hypothesis that LHN responses are highly selective and therefore suitable for driving innate behaviors, whereas KCs provide a more general sparse representation of odors suitable for forming learned behavioral associations. Our results indicate that the transformation from ORN to PN firing rates in the antennal lobe equalizes the magnitudes of, and decorrelates, responses to different odors through feedforward nonlinearities and lateral suppression within the circuitry of the antennal lobe. We study how these two components affect LHN and KC responses.
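The core of such a normalization model is divisive suppression of each PN's drive by the total ORN activity. A minimal sketch in the spirit of visual-cortex normalization models; the functional form and all parameter values below are illustrative placeholders, not the paper's fits:

```python
import numpy as np

def pn_response(orn, rmax=165.0, sigma=12.0, m=0.05, n=1.5):
    """Divisive normalization: a PN's own drive is expansively transformed,
    then suppressed by summed ORN activity (lateral suppression)."""
    orn = np.asarray(orn, dtype=float)
    suppress = (m * orn.sum()) ** n
    return rmax * orn ** n / (sigma ** n + orn ** n + suppress)

weak = pn_response(np.array([10.0, 5.0, 2.0, 1.0]))     # a faint odor
strong = pn_response(np.array([100.0, 50.0, 20.0, 10.0]))  # the same pattern, 10x stronger
```

Although the strong input is ten times the weak one, the peak PN outputs differ far less: the normalization equalizes response magnitudes across odors, as the abstract describes.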
57. Functional consequences of correlated excitatory and inhibitory conductances in cortical networks. J Comput Neurosci 2010; 28:579-94. [PMID: 20490645; DOI: 10.1007/s10827-010-0240-9]
Abstract
Neurons in the neocortex receive a large number of excitatory and inhibitory synaptic inputs. Excitation and inhibition dynamically balance each other, with inhibition lagging excitation by only a few milliseconds. To characterize the functional consequences of such correlated excitation and inhibition, we studied models in which this correlation structure is induced by feedforward inhibition (FFI). Simple circuits show that an effective FFI changes the integrative behavior of neurons such that only synchronous inputs can elicit spikes, causing the responses to be sparse and precise. Further, effective FFI increases the selectivity for propagation of synchrony through a feedforward network, thereby increasing the stability to background activity. Last, we show that recurrent random networks with effective inhibition are more likely to exhibit dynamical network activity states like those observed in vivo. Thus, when a feedforward signal path is embedded in such a recurrent network, the stabilizing effect of effective inhibition creates a suitable substrate for signal propagation. In conclusion, correlated excitation and inhibition support the notion that synchronous spiking may be important for cortical processing.
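The coincidence-detection effect of FFI can be sketched with a leaky integrator in which every excitatory input is mirrored, a couple of milliseconds later, by matched inhibition. All constants below are illustrative, not taken from the paper's models:

```python
import numpy as np

def lif_with_ffi(spike_times, t_end=100.0, dt=0.1, delay=2.0,
                 w_exc=3.0, w_inh=3.0, tau=10.0, v_th=15.0):
    """Leaky integrator: each excitatory input of weight w_exc is followed
    `delay` ms later by feedforward inhibition of weight w_inh (arbitrary units)."""
    steps = int(t_end / dt)
    drive = np.zeros(steps)
    for t in spike_times:
        i_e = int(round(t / dt))
        i_i = int(round((t + delay) / dt))
        if i_e < steps:
            drive[i_e] += w_exc
        if i_i < steps:
            drive[i_i] -= w_inh
    v, n_spikes = 0.0, 0
    for i in range(steps):
        v += dt * (-v / tau) + drive[i]
        if v >= v_th:          # threshold crossing -> spike, then reset
            n_spikes += 1
            v = 0.0
    return n_spikes

sync_spikes = lif_with_ffi([20.0] * 6)                       # six coincident inputs
async_spikes = lif_with_ffi([10.0, 25.0, 40.0, 55.0, 70.0, 85.0])  # same inputs, spread out
```

Six synchronous inputs cross threshold before their inhibition arrives; the same six inputs delivered asynchronously are each cancelled within the FFI window and never elicit a spike.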
58. Ito I, Bazhenov M, Ong RCY, Raman B, Stopfer M. Frequency transitions in odor-evoked neural oscillations. Neuron 2010; 64:692-706. [PMID: 20005825; DOI: 10.1016/j.neuron.2009.10.004]
Abstract
In many species, sensory stimuli elicit the oscillatory synchronization of groups of neurons. What determines the properties of these oscillations? In the olfactory system of the moth, we found that odors elicited oscillatory synchronization through a neural mechanism like that described in locust and Drosophila. During responses to long odor pulses, oscillations suddenly slowed as net olfactory receptor neuron (ORN) output decreased; thus, stimulus intensity appeared to determine oscillation frequency. However, changing the concentration of the odor had little effect upon oscillatory frequency. Our recordings in vivo and computational models based on these results suggested that the main effect of increasing odor concentration was to recruit additional, less well-tuned ORNs whose firing rates were tightly constrained by adaptation and saturation. Thus, in the periphery, concentration is encoded mainly by the size of the responsive ORN population, and oscillation frequency is set by the adaptation and saturation of this response.
Affiliation(s)
- Iori Ito
- National Institute of Child Health and Human Development, National Institutes of Health, Bethesda, MD 20892, USA
59.
Abstract
In both insect and vertebrate olfactory systems, only two synapses separate the sensory periphery from brain areas required for memory formation and the organisation of behaviour. In the Drosophila olfactory system, which is anatomically very similar to its vertebrate counterpart, there has been substantial recent progress in understanding the flow of information from experiments using molecular genetic, electrophysiological and optical imaging techniques. In this review, we shall focus on how olfactory information is processed and transformed in order to extract behaviourally relevant information. We follow the progress from olfactory receptor neurons, through the first processing area, the antennal lobe, to higher olfactory centres. We address both the underlying anatomy and mechanisms that govern the transformation of neural activity. We emphasise our emerging understanding of how different elementary computations, including signal averaging, gain control, decorrelation and integration, may be mapped onto different circuit elements.
Affiliation(s)
- Nicolas Y Masse
- Division of Neurobiology, MRC Laboratory of Molecular Biology, Cambridge CB2 0QH, UK
60. Synchronization enhances synaptic efficacy through spike timing-dependent plasticity in the olfactory system. Neurocomputing 2009. [DOI: 10.1016/j.neucom.2009.08.003]
61. Ujfalussy B, Kiss T, Erdi P. Parallel computational subunits in dentate granule cells generate multiple place fields. PLoS Comput Biol 2009; 5:e1000500. [PMID: 19750211; PMCID: PMC2730574; DOI: 10.1371/journal.pcbi.1000500]
Abstract
A fundamental question in understanding neuronal computations is how dendritic events influence the output of the neuron. Different forms of integration of neighbouring and distributed synaptic inputs, isolated dendritic spikes and local regulation of synaptic efficacy suggest that individual dendritic branches may function as independent computational subunits. In the present paper, we study how these local computations influence the output of the neuron. Using a simple cascade model, we demonstrate that triggering somatic firing by a relatively small dendritic branch requires the amplification of local events by dendritic spiking and synaptic plasticity. The moderately branching dendritic tree of granule cells seems optimal for this computation since larger dendritic trees favor local plasticity by isolating dendritic compartments, while reliable detection of individual dendritic spikes in the soma requires a low branch number. Finally, we demonstrate that these parallel dendritic computations could contribute to the generation of multiple independent place fields of hippocampal granule cells.

Neurons were originally divided into three morphologically distinct compartments: the dendrites receive the synaptic input, the soma integrates it and communicates the output of the cell to other neurons via the axon. Although several lines of evidence have challenged this oversimplified view, neurons are still considered to be the basic information processing units of the nervous system, as their output reflects the computations performed by the entire dendritic tree. In the present study, the authors build a simplified computational model and calculate that, in certain neurons, relatively small dendritic branches are able to independently trigger somatic firing. Therefore, in these cells, an action potential mirrors the activity of a small dendritic subunit rather than the input arriving at the whole dendritic tree. These neurons can be regarded as a network of a few independent integrator units connected to a common output unit. The authors demonstrate that a moderately branched dendritic tree of hippocampal granule cells may be optimized for these parallel computations. Finally, the authors show that these parallel dendritic computations could explain some aspects of the location-dependent activity of hippocampal granule cells.
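The cascade idea, independent sigmoidal subunits whose outputs the soma sums and thresholds, can be caricatured in a few lines. The thresholds and gains below are invented for illustration:

```python
import numpy as np

def subunit(drive, thresh=1.0, gain=4.0):
    """Sigmoidal dendritic branch: amplifies summed input near a local spike threshold."""
    return 1.0 / (1.0 + np.exp(-gain * (drive - thresh)))

def soma_fires(branch_inputs, soma_thresh=0.8):
    """Two-layer cascade: each branch integrates its own synapses independently;
    the soma sums the branch outputs and thresholds the total."""
    total = sum(subunit(sum(b)) for b in branch_inputs)
    return total > soma_thresh

# the same total synaptic input, concentrated on one branch vs spread across three
focused = soma_fires([[0.5, 0.5, 0.6], [0.0], [0.0]])
spread = soma_fires([[0.5], [0.5], [0.6]])
```

One strongly driven branch crosses its local threshold, is amplified, and fires the soma; the identical input divided among branches leaves every subunit subthreshold and the cell silent. This is the sense in which an output spike mirrors one small subunit rather than the whole tree.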
Affiliation(s)
- Balázs Ujfalussy
- Department of Biophysics, KFKI Research Institute for Particle and Nuclear Physics of the Hungarian Academy of Sciences, Budapest, Hungary.
62. Demmer H, Kloppenburg P. Intrinsic membrane properties and inhibitory synaptic input of Kenyon cells as mechanisms for sparse coding? J Neurophysiol 2009; 102:1538-50. [DOI: 10.1152/jn.00183.2009]
Abstract
The insect mushroom bodies (MBs) are multimodal signal processing centers and are essential for olfactory learning. Electrophysiological recordings from the MBs' principal component neurons, the Kenyon cells (KCs), showed a sparse representation of olfactory signals. It has been proposed that the intrinsic and synaptic properties of the KC circuitry combine to reduce the firing of action potentials and to generate relatively brief windows for synaptic integration in the KCs, thus causing them to operate as coincidence detectors. To better understand the ionic mechanisms that mediate the KC intrinsic firing properties, we used whole cell patch-clamp recordings from KCs in the adult, intact brain of Periplaneta americana to analyze voltage- and/or Ca2+-dependent inward (ICa, INa) and outward currents [IA, IK(V), IK,ST, IO(Ca)]. In general the currents had properties similar to those of currents in other insect neurons. Certain functional parameters of ICa and IO(Ca), however, had unusually high values, allowing them to assist sparse coding. ICa had a low activation threshold and a very high current density compared with those of ICa in other insect neurons. Together these parameters make ICa suitable for boosting and sharpening the excitatory postsynaptic potentials as reported in previous studies. IO(Ca) also had a large current density and a very depolarized activation threshold. In combination, the large ICa and IO(Ca) are likely to mediate the strong spike frequency adaptation. These intrinsic properties of the KCs are likely to be supported by their tonic, inhibitory synaptic input, which was revealed by specific GABA antagonists and which contributes significantly to the hyperpolarized membrane potential at rest.
63. Huerta R, Nowotny T. Fast and robust learning by reinforcement signals: explorations in the insect brain. Neural Comput 2009; 21:2123-51. [PMID: 19538091; DOI: 10.1162/neco.2009.03-08-733]
Abstract
We propose a model for pattern recognition in the insect brain. Starting from a well-known body of knowledge about the insect brain, we investigate which of the potentially present features may be useful for learning input patterns rapidly and stably. The plasticity underlying pattern recognition is situated in the insect mushroom bodies and requires an error signal to associate the stimulus with a proper response. As a proof of concept, we used our model insect brain to classify the well-known MNIST database of handwritten digits, a popular benchmark for classifiers. We show that the structural organization of the insect brain appears to be suitable for both fast learning of new stimuli and reasonable performance in stationary conditions. Furthermore, it is extremely robust to damage to the brain structures involved in sensory processing. Finally, we suggest that spatiotemporal dynamics can improve the level of confidence in a classification decision. The proposed approach allows testing the effect of hypothesized mechanisms rather than speculating on their benefit for system performance or confidence in its responses.
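The architecture can be caricatured as a random expansion with k-winner-take-all sparsening (the "Kenyon cell" layer) and an error-gated Hebbian readout. This toy version classifies synthetic prototypes rather than MNIST, and every parameter is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
dim_in, dim_kc, n_classes = 64, 500, 4

def mb_code(x, proj, k=25):
    """Mushroom-body-style coding: random expansion, then k-winner-take-all."""
    h = proj @ x
    code = np.zeros(dim_kc)
    code[np.argsort(h)[-k:]] = 1.0   # only the k most driven 'Kenyon cells' fire
    return code

proj = (rng.random((dim_kc, dim_in)) < 0.1).astype(float)  # sparse random connectivity
W = np.zeros((n_classes, dim_kc))                          # plastic readout

protos = rng.random((n_classes, dim_in))                   # toy stimulus prototypes
def sample(c):
    return protos[c] + 0.1 * rng.standard_normal(dim_in)

for _ in range(400):                 # training gated by a binary error signal
    c = int(rng.integers(n_classes))
    kc = mb_code(sample(c), proj)
    pred = int(np.argmax(W @ kc))
    if pred != c:                    # reinforcement: strengthen correct, weaken wrong
        W[c] += kc
        W[pred] -= kc

acc = float(np.mean([int(np.argmax(W @ mb_code(sample(c), proj))) == c
                     for c in range(n_classes) for _ in range(25)]))
```

Learning only on errors, as here, is one simple way an error signal can associate a sparse stimulus code with a response.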
Affiliation(s)
- Ramón Huerta
- Institute for Nonlinear Science, University of California San Diego, La Jolla CA 92093-0402, U.S.A
- Thomas Nowotny
- Centre for Computational Neuroscience and Robotics, Department of Informatics, University of Sussex, Falmer, Brighton, BN1 9QJ, U.K
64. Farkhooi F, Muller E, Nawrot MP. Sequential sparsing by successive adapting neural populations. BMC Neurosci 2009. [DOI: 10.1186/1471-2202-10-s1-o10]
65.
Abstract
The dentate hilus has been extensively studied in relation to its potential role in memory and in temporal lobe epilepsy. Little is known, however, about the synapses formed between the two major cell types in this region, glutamatergic mossy cells and hilar interneurons, or the organization of local circuits involving these cells. Using triple and quadruple simultaneous intracellular recordings in rat hippocampal slices, we find that mossy cells evoke EPSPs with high failure rates onto hilar neurons. Mossy cells show profound synapse specificity; 87.5% of their intralamellar connections are onto hilar interneurons. Hilar interneurons also show synapse specificity and preferentially inhibit mossy cells; 81% of inhibitory hilar synapses are onto mossy cells. Hilar IPSPs have low failure rates, are blocked by the GABA(A) receptor antagonist gabazine, and exhibit short-term depression when tested at 17 Hz. Surprisingly, more than half (57%) of the mossy cell synapses we found onto interneurons were part of reciprocal excitatory/inhibitory local circuit motifs. Neither the high degree of target cell specificity, nor the significant enrichment of structured polysynaptic local circuit motifs, could be explained by nonrandom sampling or somatic proximity. Intralamellar hilar synapses appear to function primarily by integrating synchronous inputs and presynaptic burst discharges, allowing hilar cells to respond over a large dynamic range of input strengths. The reciprocal mossy cell/interneuron local circuit motifs we find enriched in the hilus may generate sparse neural representations involved in hippocampal memory operations.
66. Subunit-dependent postsynaptic expression of kainate receptors on hippocampal interneurons in area CA1. J Neurosci 2009; 29:563-74. [PMID: 19144856; DOI: 10.1523/jneurosci.4788-08.2009]
Abstract
Kainate receptors (KARs) contribute to postsynaptic excitation in only a select subset of neurons. To define the parameters that specify the postsynaptic expression of KARs, we examined the contribution of KARs to EPSCs on hippocampal interneurons in area CA1. Interneurons in stratum radiatum/lacunosum-moleculare express KARs both with and without the GluR5 subunit, but KAR-mediated EPSCs are generated mainly, if not entirely, by GluR5-containing KARs. Extrasynaptic glutamate spillover profoundly recruits AMPA receptors (AMPARs) with little effect on KARs, indicating that KARs are targeted at the synapse more precisely than AMPARs. However, spontaneous EPSCs with a conventional AMPAR component did not have a resolvable contribution of KARs, suggesting that the KARs that contribute to the evoked EPSCs are at a distinct set of synapses. GluR5-containing KARs on interneurons in stratum oriens do not contribute substantially to the EPSC. We conclude that KARs are localized to synapses by cell type-, synapse-, and subunit-selective mechanisms.
67. Krofczik S, Menzel R, Nawrot MP. Rapid odor processing in the honeybee antennal lobe network. Front Comput Neurosci 2009; 2:9. [PMID: 19221584; PMCID: PMC2636688; DOI: 10.3389/neuro.10.009.2008]
Abstract
In their natural environment, many insects need to identify and evaluate behaviorally relevant odorants on a rich and dynamic olfactory background. Behavioral studies have demonstrated that bees recognize learned odors within <200 ms, indicating a rapid processing of olfactory input in the sensory pathway. We studied the role of the honeybee antennal lobe network in constructing a fast and reliable code of odor identity using in vivo intracellular recordings of individual projection neurons (PNs) and local interneurons (LNs). We found a complementary ensemble code where odor identity is encoded in the spatio-temporal pattern of response latencies as well as in the pattern of activated and inactivated PN firing. This coding scheme rapidly reaches a stable representation within 50-150 ms after stimulus onset. Testing an odor mixture versus its individual compounds revealed different representations in the two morphologically distinct types of lateral- and median PNs (l- and m-PNs). Individual m-PNs mixture responses were dominated by the most effective compound (elemental representation) whereas l-PNs showed suppressed responses to the mixture but not to its individual compounds (synthetic representation). The onset of inhibition in the membrane potential of l-PNs coincided with the responses of putative inhibitory interneurons that responded significantly faster than PNs. Taken together, our results suggest that processing within the LN network of the AL is an essential component of constructing the antennal lobe population code.
Affiliation(s)
- Sabine Krofczik
- Institut für Biologie – Neurobiologie, Freie Universität Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Randolf Menzel
- Institut für Biologie – Neurobiologie, Freie Universität Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Martin P. Nawrot
- Institut für Biologie – Neurobiologie, Freie Universität Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
68.

69. Bazhenov M, Rulkov NF, Timofeev I. Effect of synaptic connectivity on long-range synchronization of fast cortical oscillations. J Neurophysiol 2008; 100:1562-75. [PMID: 18632897; DOI: 10.1152/jn.90613.2008]
Abstract
Cortical gamma oscillations in the 20- to 80-Hz range are associated with attentiveness and sensory perception and have strong connections to both cognitive processing and temporal binding of sensory stimuli. These gamma oscillations become synchronized within a few milliseconds over distances spanning a few millimeters in spite of synaptic delays. In this study using in vivo recordings and large-scale cortical network models, we reveal a critical role played by the network geometry in achieving precise long-range synchronization in the gamma frequency band. Our results indicate that the presence of many independent synaptic pathways in a two-dimensional network facilitates precise phase synchronization of fast gamma band oscillations with nearly zero phase delays between remote network sites. These findings predict a common mechanism of precise oscillatory synchronization in neuronal networks.
Affiliation(s)
- M Bazhenov
- Department of Cell Biology and Neuroscience, University of California, Riverside, Riverside, CA 92521, USA.
70.

71. Oscillations and synchrony in large-scale cortical network models. J Biol Phys 2008; 34:279-99. [PMID: 19669478; DOI: 10.1007/s10867-008-9079-y]
Abstract
Intrinsic neuronal and circuit properties control the responses of large ensembles of neurons by creating spatiotemporal patterns of activity that are used for sensory processing, memory formation, and other cognitive tasks. The modeling of such systems requires computationally efficient single-neuron models capable of displaying realistic response properties. We developed a set of reduced models based on difference equations (map-based models) to simulate the intrinsic dynamics of biological neurons. These phenomenological models were designed to capture the main response properties of specific types of neurons while ensuring realistic model behavior across a sufficient dynamic range of inputs. This approach allows for fast simulations and efficient parameter space analysis of networks containing hundreds of thousands of neurons of different types using a conventional workstation. Drawing on results obtained using large-scale networks of map-based neurons, we discuss spatiotemporal cortical network dynamics as a function of parameters that affect synaptic interactions and intrinsic states of the neurons.
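Map-based models of this kind include the two-dimensional Rulkov map, which produces spiking and bursting from two difference equations at a fraction of the cost of conductance-based models. The parameter values below are typical textbook choices, not necessarily those used in the paper:

```python
import numpy as np

def rulkov(alpha=4.5, sigma=0.14, mu=0.001, n_steps=20_000):
    """Rulkov map: fast variable x plays the role of membrane potential;
    slow variable y drives burst onset and termination."""
    x, y = -1.0, -2.9
    xs = np.empty(n_steps)
    for n in range(n_steps):
        x_new = alpha / (1.0 + x * x) + y   # fast subsystem
        y = y - mu * (x + 1.0 - sigma)      # slow subsystem (mu << 1)
        x = x_new
        xs[n] = x
    return xs

trace = rulkov()
spikes = int(np.sum((trace[1:] > 0) & (trace[:-1] <= 0)))  # upward zero crossings
```

Because each time step is a single algebraic update rather than an ODE solve, networks of hundreds of thousands of such units are tractable on a workstation, which is the efficiency argument the abstract makes.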
72. Finelli LA, Haney S, Bazhenov M, Stopfer M, Sejnowski TJ. Synaptic learning rules and sparse coding in a model sensory system. PLoS Comput Biol 2008; 4:e1000062. [PMID: 18421373; PMCID: PMC2278376; DOI: 10.1371/journal.pcbi.1000062]
Abstract
Neural circuits exploit numerous strategies for encoding information. Although the functional significance of individual coding mechanisms has been investigated, ways in which multiple mechanisms interact and integrate are not well understood. The locust olfactory system, in which dense, transiently synchronized spike trains across ensembles of antennal lobe (AL) neurons are transformed into a sparse representation in the mushroom body (MB; a region associated with memory), provides a well-studied preparation for investigating the interaction of multiple coding mechanisms. Recordings made in vivo from the insect MB demonstrated highly specific responses to odors in Kenyon cells (KCs). Typically, only a few KCs from the recorded population of neurons responded reliably when a specific odor was presented. Different odors induced responses in different KCs. Here, we explored with a biologically plausible model the possibility that a form of plasticity may control and tune synaptic weights of inputs to the mushroom body to ensure the specificity of KCs' responses to familiar or meaningful odors. We found that plasticity at the synapses between the AL and the MB efficiently regulated the delicate tuning necessary to selectively filter the intense AL oscillatory output and condense it to a sparse representation in the MB. Activity-dependent plasticity drove the observed specificity, reliability, and expected persistence of odor representations, suggesting a role for plasticity in information processing and making a testable prediction about synaptic plasticity at AL-MB synapses.

The way in which the brain encodes, processes, transforms, and stores sensory information is a fundamental question in systems neuroscience. One challenge is to understand how neural oscillations, synchrony, population coding, and sparseness interact in the process of transforming and transferring information. Another question is how synaptic plasticity, the ability of synapses to change their strength, interacts efficiently with these different coding strategies to support learning and information storage. We approached these questions, rarely accessible to direct experimental investigation, in the olfactory system of the locust, a well-studied example. Here, the neurons in the antennal lobe carry neural representations of odor identity using dense, spatially distributed, oscillatory synchronized patterns of neural activity. Odor information cannot be interpreted by considering their activity independently. On the contrary, in the mushroom body (the next processing region, involved in the storage and retrieval of olfactory memories and analogous to the olfactory cortex), odor representations are sparse and carried by more selective neurons. Sparse information coding by ensembles of neurons provides several important advantages including high memory capacity, low overlap between stored objects, and easy information retrieval. How is this sparseness achieved? Here, with a rigorous computational model of the olfactory system, we demonstrate that plasticity at the input afferents to the mushroom body can efficiently mediate the delicate tuning necessary to selectively filter intense sensory input, condensing it to the sparse responses observed in the mushroom body. Our results suggest a general mechanism for plasticity-enabled sparse representations in other sensory systems, such as the visual system. Overall, we illustrate a potential central role for plasticity in the transfer of information across different coding strategies within neural systems.
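Synaptic plasticity of the activity-dependent kind studied here is commonly modeled as spike-timing-dependent: the sign and size of the weight change depend on the pre/post spike interval. A standard exponential-window form, with constants that are illustrative rather than the paper's fitted values:

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Additive STDP window: pre-before-post (dt > 0) potentiates, post-before-pre
    (dt < 0) depresses; both effects decay exponentially with the interval."""
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms > 0,
                    a_plus * np.exp(-dt_ms / tau),
                    -a_minus * np.exp(dt_ms / tau))

dts = np.array([-40.0, -10.0, 5.0, 30.0])   # pre/post intervals in ms
dw = stdp_dw(dts)
```

Under such a rule, only inputs that reliably arrive just before a KC spike, e.g. those locked to the AL's oscillatory output, keep their weights, which is one concrete way plasticity can sparsify the AL-to-MB projection.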
Affiliation(s)
- Luca A Finelli
- Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, California, United States of America.