1. Tamosiunaite M, Tetzlaff C, Wörgötter F. Unsupervised learning of perceptual feature combinations. PLoS Comput Biol 2024; 20:e1011926. PMID: 38442095; PMCID: PMC10942261; DOI: 10.1371/journal.pcbi.1011926.
Abstract
In many situations it is behaviorally relevant for an animal to respond to co-occurrences of perceptual, possibly polymodal features, while these features alone may have no importance. Thus, it is crucial for animals to learn such feature combinations even though they may occur with variable intensity and frequency. Here, we present a novel unsupervised learning mechanism that is largely independent of these contingencies and allows neurons in a network to achieve specificity for different feature combinations. This is achieved by a novel correlation-based (Hebbian) learning rule, which allows for linear weight growth and which is combined with a mechanism for gradually reducing the learning rate as soon as the neuron's response becomes feature-combination specific. In a set of control experiments, we show that other existing advanced learning rules cannot satisfactorily form ordered multi-feature representations. In addition, we show that networks that use this type of learning always stabilize and converge to subsets of neurons with different feature-combination specificity. Neurons with this property may thus serve as an initial stage for the processing of ecologically relevant real-world situations for an animal.
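The abstract's core idea (a correlation-based update whose learning rate shrinks as the neuron becomes combination-specific) can be caricatured in a few lines. This is a toy sketch, not the authors' actual rule: the input statistics, threshold, and normalization below are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_steps = 4, 5000
w = np.full(n_inputs, 0.1)    # synaptic weights
eta = 0.05                    # learning rate, reduced as specificity grows

for _ in range(n_steps):
    # features 0 and 1 co-occur; features 2 and 3 appear independently
    x = (rng.random(n_inputs) < 0.2).astype(float)
    if x[0]:
        x[1] = 1.0
    y = float(w @ x > 0.15)                 # thresholded response
    # Hebbian update normalized by |w|, giving roughly linear weight growth
    w += eta * y * x / np.linalg.norm(w)
    # shrink the learning rate as the response becomes combination-specific
    selectivity = (w[0] + w[1]) / w.sum()
    eta = 0.05 * (1.0 - selectivity)
```

Because the co-occurring pair drives the neuron together far more often than the independent features do, the weights for that pair come to dominate, at which point the decaying learning rate freezes the specificity in.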
Affiliation(s)
- Minija Tamosiunaite
- Department for Computational Neuroscience, Third Physics Institute, University of Göttingen, Göttingen, Germany
- Vytautas Magnus University, Faculty of Informatics, Kaunas, Lithuania
- Christian Tetzlaff
- Computational Synaptic Physiology, Department for Neuro- and Sensory Physiology, University Medical Center Göttingen, Göttingen, Germany
- Campus Institute Data Science, Göttingen, Germany
- Florentin Wörgötter
- Department for Computational Neuroscience, Third Physics Institute, University of Göttingen, Göttingen, Germany
2. Kern FB, Chao ZC. Short-term neuronal and synaptic plasticity act in synergy for deviance detection in spiking networks. PLoS Comput Biol 2023; 19:e1011554. PMID: 37831721; PMCID: PMC10599548; DOI: 10.1371/journal.pcbi.1011554.
Abstract
Sensory areas of cortex respond more strongly to infrequent stimuli when these violate previously established regularities, a phenomenon known as deviance detection (DD). Previous modeling work has mainly attempted to explain DD on the basis of synaptic plasticity. However, a large fraction of cortical neurons also exhibit firing rate adaptation, an underexplored potential mechanism. Here, we investigate DD in a spiking neuronal network model with two types of short-term plasticity, fast synaptic short-term depression (STD) and slower threshold adaptation (TA). We probe the model with an oddball stimulation paradigm and assess DD by evaluating the network responses. We find that TA is sufficient to elicit DD. It achieves this by habituating neurons near the stimulation site that respond earliest to the frequently presented standard stimulus (local fatigue), which diminishes the response and promotes the recovery (global fatigue) of the wider network. Further, we find a synergy effect between STD and TA, where they interact with each other to achieve greater DD than the sum of their individual effects. We show that this synergy is caused by the local fatigue added by STD, which inhibits the global response to the frequently presented stimulus, allowing greater recovery of TA-mediated global fatigue and making the network more responsive to the deviant stimulus. Finally, we show that the magnitude of DD strongly depends on the timescale of stimulation. We conclude that highly predictable information can be encoded in strong local fatigue, which allows greater global recovery and subsequent heightened sensitivity for DD.
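As a toy illustration of how threshold adaptation alone can yield deviance detection, consider two rate units whose thresholds rise with their own activity and recover slowly. All names and constants here are illustrative; the paper's model is a spiking network, not this two-unit caricature.

```python
import numpy as np

def run_oddball(sequence, tau_adapt=20.0, gain=1.0):
    """Threshold adaptation (TA): each unit's threshold rises with its own
    activity and decays back slowly, so the frequently driven unit habituates."""
    theta = np.zeros(2)                 # adaptive thresholds, one per unit
    responses = []
    for stim in sequence:               # stim is 0 (standard) or 1 (deviant)
        drive = np.zeros(2)
        drive[stim] = 1.0
        rate = np.maximum(gain * drive - theta, 0.0)   # thresholded response
        theta += rate / tau_adapt                      # local fatigue
        theta -= theta / (5 * tau_adapt)               # slow recovery
        responses.append(rate[stim])
    return np.array(responses)

seq = [0] * 50 + [1]                    # many standards, then one deviant
r = run_oddball(seq)
# the rare deviant escapes the fatigue accumulated by the standard channel,
# so r[-1] (deviant response) exceeds r[-2] (habituated standard response)
```

The standard channel settles at a reduced, adapted response, while the deviant channel, never having fired, responds at full strength: a minimal deviance-detection signature.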
Affiliation(s)
- Felix Benjamin Kern
- International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan
- Zenas C. Chao
- International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan
3. Sörensen LKA, Bohté SM, de Jong D, Slagter HA, Scholte HS. Mechanisms of human dynamic object recognition revealed by sequential deep neural networks. PLoS Comput Biol 2023; 19:e1011169. PMID: 37294830; DOI: 10.1371/journal.pcbi.1011169.
Abstract
Humans can quickly recognize objects in a dynamically changing world. This ability is showcased by the fact that observers succeed at recognizing objects in rapidly changing image sequences, at up to 13 ms/image. To date, the mechanisms that govern dynamic object recognition remain poorly understood. Here, we developed deep learning models for dynamic recognition and compared different computational mechanisms, contrasting feedforward and recurrent, single-image and sequential processing as well as different forms of adaptation. We found that only models that integrate images sequentially via lateral recurrence mirrored human performance (N = 36) and were predictive of trial-by-trial responses across image durations (13-80 ms/image). Importantly, models with sequential lateral-recurrent integration also captured how human performance changes as a function of image presentation durations, with models processing images for a few time steps capturing human object recognition at shorter presentation durations and models processing images for more time steps capturing human object recognition at longer presentation durations. Furthermore, augmenting such a recurrent model with adaptation markedly improved dynamic recognition performance and accelerated its representational dynamics, thereby predicting human trial-by-trial responses using fewer processing resources. Together, these findings provide new insights into the mechanisms rendering object recognition so fast and effective in a dynamic visual world.
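The sequential lateral-recurrent integration the model comparison points to can be sketched as a hypothetical two-class readout that carries evidence from frame to frame through a recurrent state. Names and numbers are illustrative, not the paper's trained networks.

```python
import numpy as np

rng = np.random.default_rng(2)

def recurrent_readout(frames, w_lateral=0.7):
    """Integrate a rapid image sequence via lateral recurrence: the state h
    accumulates per-frame evidence instead of classifying frames in isolation."""
    h = np.zeros(2)
    for x in frames:                      # x: per-frame evidence for 2 classes
        h = np.tanh(w_lateral * h + x)    # recurrent integration step
    return int(np.argmax(h))

# eight noisy frames, each only weakly favoring class 0
frames = [np.array([0.4, 0.0]) + 0.05 * rng.standard_normal(2)
          for _ in range(8)]
label = recurrent_readout(frames)
```

A single noisy frame is ambiguous, but the recurrent state pools the weak per-frame signal across the sequence, mirroring why sequential integration helps at short presentation durations.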
Affiliation(s)
- Lynn K A Sörensen
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
- Amsterdam Brain & Cognition (ABC), University of Amsterdam, Amsterdam, Netherlands
- Sander M Bohté
- Machine Learning Group, Centrum Wiskunde & Informatica, Amsterdam, Netherlands
- Swammerdam Institute of Life Sciences (SILS), University of Amsterdam, Amsterdam, Netherlands
- Bernoulli Institute, Rijksuniversiteit Groningen, Groningen, Netherlands
- Dorina de Jong
- Istituto Italiano di Tecnologia, Center for Translational Neurophysiology of Speech and Communication, (CTNSC), Ferrara, Italy
- Università di Ferrara, Dipartimento di Scienze Biomediche e Chirurgico Specialistiche, Ferrara, Italy
- Heleen A Slagter
- Department of Experimental and Applied Psychology, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Institute of Brain and Behaviour Amsterdam, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- H Steven Scholte
- Department of Psychology, University of Amsterdam, Amsterdam, Netherlands
- Amsterdam Brain & Cognition (ABC), University of Amsterdam, Amsterdam, Netherlands
4. Reconstruction of sparse recurrent connectivity and inputs from the nonlinear dynamics of neuronal networks. J Comput Neurosci 2023; 51:43-58. PMID: 35849304; DOI: 10.1007/s10827-022-00831-x.
Abstract
Reconstructing the recurrent structural connectivity of neuronal networks is a challenge crucial to address in characterizing neuronal computations. While directly measuring the detailed connectivity structure is generally prohibitive for large networks, we develop a novel framework for reverse-engineering large-scale recurrent network connectivity matrices from neuronal dynamics by utilizing the widespread sparsity of neuronal connections. We derive a linear input-output mapping that underlies the irregular dynamics of a model network composed of both excitatory and inhibitory integrate-and-fire neurons with pulse coupling, thereby relating network inputs to evoked neuronal activity. Using this embedded mapping and experimentally feasible measurements of the firing rate as well as voltage dynamics in response to a relatively small ensemble of random input stimuli, we efficiently reconstruct the recurrent network connectivity via compressive sensing techniques. Through analogous analysis, we then recover high dimensional natural stimuli from evoked neuronal network dynamics over a short time horizon. This work provides a generalizable methodology for rapidly recovering sparse neuronal network data and underlines the natural role of sparsity in facilitating the efficient encoding of network data in neuronal dynamics.
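The compressive-sensing step can be illustrated with a generic sparse-recovery toy: recover a sparse "connectivity" vector from far fewer random linear measurements than unknowns. The random matrix below merely stands in for the paper's derived input-output mapping, and orthogonal matching pursuit is one standard recovery algorithm (the paper does not necessarily use this exact solver).

```python
import numpy as np

rng = np.random.default_rng(1)

n, m, k = 200, 60, 4                     # unknowns, measurements, sparsity
x_true = np.zeros(n)                     # sparse "connectivity" vector
x_true[rng.choice(n, k, replace=False)] = rng.uniform(0.5, 1.5, k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random stimuli -> measurements
y = A @ x_true

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most correlated
    with the residual, then refit the coefficients on the selected support."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
```

With 60 measurements of a 4-sparse vector in 200 dimensions, the sparse solution is recovered essentially exactly, which is the leverage the paper exploits to avoid probing every connection directly.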
5. Barkdoll K, Lu Y, Barranca VJ. New insights into binocular rivalry from the reconstruction of evolving percepts using model network dynamics. Front Comput Neurosci 2023; 17:1137015. PMID: 37034441; PMCID: PMC10079880; DOI: 10.3389/fncom.2023.1137015.
Abstract
When the two eyes are presented with highly distinct stimuli, the resulting visual percept generally switches every few seconds between the two monocular images in an irregular fashion, giving rise to a phenomenon known as binocular rivalry. While a host of theoretical studies have explored potential mechanisms for binocular rivalry in the context of evoked model dynamics in response to simple stimuli, here we investigate binocular rivalry directly through complex stimulus reconstructions based on the activity of a two-layer neuronal network model with competing downstream pools driven by disparate monocular stimuli composed of image pixels. To estimate the dynamic percept, we derive a linear input-output mapping rooted in the non-linear network dynamics and iteratively apply compressive sensing techniques for signal recovery. Utilizing a dominance metric, we are able to identify when percept alternations occur and use data collected during each dominance period to generate a sequence of percept reconstructions. We show that despite the approximate nature of the input-output mapping and the significant reduction in neurons downstream relative to stimulus pixels, the dominant monocular image is well-encoded in the network dynamics and improvements are garnered when realistic spatial receptive field structure is incorporated into the feedforward connectivity. Our model demonstrates gamma-distributed dominance durations and well obeys Levelt's four laws for how dominance durations change with stimulus strength, agreeing with key recurring experimental observations often used to benchmark rivalry models. In light of evidence that individuals with autism exhibit relatively slow percept switching in binocular rivalry, we corroborate the ubiquitous hypothesis that autism manifests from reduced inhibition in the brain by systematically probing our model alternation rate across choices of inhibition strength. We exhibit sufficient conditions for producing binocular rivalry in the context of natural scene stimuli, opening a clearer window into the dynamic brain computations that vary with the generated percept and a potential path toward further understanding neurological disorders.
6. Deng Y, Liu B, Huang Z, Liu X, He S, Li Q, Guo D. Fractional Spiking Neuron: Fractional Leaky Integrate-and-Fire Circuit Described with Dendritic Fractal Model. IEEE Trans Biomed Circuits Syst 2022; 16:1375-1386. PMID: 36315548; DOI: 10.1109/tbcas.2022.3218294.
Abstract
As dendrites are essential parts of neurons, they are crucial factors in enabling neuronal activity to follow multiple-timescale dynamics, which ultimately affects information processing and cognition. However, in common SNNs (spiking neural networks), the hardware-based LIF (leaky integrate-and-fire) circuit simulates only the single-timescale dynamics of the soma, without accounting for dendritic morphology, which may limit the capability of simulated neurons to process information. This study proposes a dendritic fractal model mainly for quantifying the morphological effects of dendritic branching and length. To realize this model, we design multiple analog fractional-order circuits (AFCs) whose extended structures and parameters match the dendritic features. Introducing AFCs into FLIF (fractional leaky integrate-and-fire) neuron circuits then demonstrates the same multiple-timescale dynamics of spiking patterns as biological neurons, including spiking adaptation, inter-spike variability with a power-law distribution, first-spike latency, and intrinsic memory. This further enhances the fidelity of neuron models and provides a more accurate basis for understanding the mechanisms of neural computation and cognition.
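A rough software analogue of the fractional (long-memory) dynamics such circuits implement is a Grünwald-Letnikov discretization of a fractional LIF neuron. This is a hypothetical numerical sketch under stated assumptions (constant drive, simple spike-and-reset), not the paper's analog circuit; all parameters are illustrative.

```python
import numpy as np

def flif_spikes(alpha=0.8, dt=0.1, T=200.0, I=1.3, v_th=1.0, tau=10.0):
    """Fractional LIF via the Gruenwald-Letnikov scheme:
    d^alpha V / dt^alpha = -V/tau + I, with spike-and-reset at v_th.
    For alpha = 1 this reduces to the ordinary Euler-discretized LIF."""
    n = int(T / dt)
    c = np.ones(n)                       # GL binomial weights
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (1.0 + alpha) / j)
    V = np.zeros(n)
    spikes = []
    for k in range(1, n):
        history = c[1:k + 1] @ V[k - 1::-1]          # long-memory term
        V[k] = dt**alpha * (-V[k - 1] / tau + I) - history
        if V[k] >= v_th:
            spikes.append(k * dt)
            V[k] = 0.0                               # reset
    return spikes

spikes = flif_spikes()
```

The history term couples every update to the full voltage trace, which is what gives fractional-order neurons their multiple-timescale behaviors (adaptation, first-spike latency, intrinsic memory) without any explicit adaptation variable.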
7. Manneschi L, Gigante G, Vasilaki E, Del Giudice P. Signal neutrality, scalar property, and collapsing boundaries as consequences of a learned multi-timescale strategy. PLoS Comput Biol 2022; 18:e1009393. PMID: 35930590; PMCID: PMC9462745; DOI: 10.1371/journal.pcbi.1009393.
Abstract
We postulate that three fundamental elements underlie a decision-making process: perception of time passing, information processing on multiple timescales, and reward maximisation. We build a simple reinforcement learning agent upon these principles and train it on a random-dot-like task. Our results, in line with the experimental data, demonstrate three emerging signatures. (1) Signal neutrality: insensitivity to the signal coherence in the interval preceding the decision. (2) Scalar property: the mean of the response times varies widely across signal coherences, yet the shape of the distributions stays almost unchanged. (3) Collapsing boundaries: the “effective” decision-making boundary changes over time in a manner reminiscent of the theoretical optimum. Removing either the perception of time or the multiple timescales from the model abolishes these distinguishing signatures. Our results suggest an alternative explanation for signal neutrality: we propose that it is not part of motor planning but of the decision-making process itself, emerging from information processing on multiple timescales.
Affiliation(s)
- Luca Manneschi
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
- Guido Gigante
- Istituto Superiore di Sanità, Rome, Italy
- INFN, Sezione di Roma, Rome, Italy
- Eleni Vasilaki
- Department of Computer Science, University of Sheffield, Sheffield, United Kingdom
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland
- Paolo Del Giudice
- Istituto Superiore di Sanità, Rome, Italy
- INFN, Sezione di Roma, Rome, Italy
8. Barranca VJ, Bhuiyan A, Sundgren M, Xing F. Functional Implications of Dale's Law in Balanced Neuronal Network Dynamics and Decision Making. Front Neurosci 2022; 16:801847. PMID: 35295091; PMCID: PMC8919085; DOI: 10.3389/fnins.2022.801847.
Abstract
The notion that a neuron transmits the same set of neurotransmitters at all of its post-synaptic connections, typically known as Dale's law, is well supported throughout the majority of the brain and is assumed in almost all theoretical studies investigating the mechanisms for computation in neuronal networks. Dale's law has numerous functional implications in fundamental sensory processing and decision-making tasks, and it plays a key role in the current understanding of the structure-function relationship in the brain. However, since exceptions to Dale's law have been discovered for certain neurons and because other biological systems with complex network structure incorporate individual units that send both positive and negative feedback signals, we investigate the functional implications of network model dynamics that violate Dale's law by allowing each neuron to send out both excitatory and inhibitory signals to its neighbors. We show how balanced network dynamics, in which large excitatory and inhibitory inputs are dynamically adjusted such that input fluctuations produce irregular firing events, are theoretically preserved for a single population of neurons violating Dale's law. We further leverage this single-population network model in the context of two competing pools of neurons to demonstrate that effective decision-making dynamics are also produced, agreeing with experimental observations from honeybee dynamics in selecting a food source and artificial neural networks trained in optimal selection. Through direct comparison with the classical two-population balanced neuronal network, we argue that the one-population network demonstrates more robust balanced activity for systems with fewer computational units, such as honeybee colonies, whereas the two-population network exhibits a more rapid response to temporal variations in network inputs, as required by the brain. We expect this study will shed light on the role of neurons violating Dale's law found in experiments as well as shared design principles across biological systems that perform complex computations.
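The balance argument can be illustrated generically: with K connections per neuron and zero-mean mixed-sign synapses of size 1/sqrt(K), large excitatory and inhibitory contributions cancel on average while fluctuations stay of order one. This is a toy calculation in the spirit of the abstract, not the paper's network model; all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# One population in which each neuron makes both excitatory and inhibitory
# connections (violating Dale's law): mixed-sign synapses with balanced
# 1/sqrt(K) scaling.
N, K = 2000, 500                      # neurons, connections per neuron
rates = rng.uniform(0.5, 1.5, N)      # presynaptic firing rates

def net_input(K):
    pre = rng.choice(N, K, replace=False)     # random presynaptic partners
    signs = rng.choice([-1.0, 1.0], K)        # mixed-sign outputs
    w = signs / np.sqrt(K)                    # balanced scaling
    return w @ rates[pre]

inputs = np.array([net_input(K) for _ in range(1000)])
# mean input is near zero; its standard deviation is O(1), so firing is
# driven by fluctuations, i.e. the balanced (irregular) regime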
9. Mezzasalma SA, Grassi L, Grassi M. Physical and chemical properties of carbon nanotubes in view of mechanistic neuroscience investigations. Some outlook from condensed matter, materials science and physical chemistry. Mater Sci Eng C Mater Biol Appl 2021; 131:112480. PMID: 34857266; DOI: 10.1016/j.msec.2021.112480.
Abstract
The open border between non-living and living matter, suggested by the increasingly emerging fields of nanoscience interfaced with biological systems, requires detailed knowledge of nanomaterial properties. An account of the wide spectrum of relevant phenomena, spanning the physical chemistry of interfaces, materials science, solid-state physics at the nanoscale and bioelectrochemistry, is therefore needed for a comprehensive application of carbon nanotubes interfaced with neuron cells. This review points out a number of conceptual tools to further address the ongoing advances in coupling neuronal networks with (carbon) nanotube meshworks, and to deepen the basic issues that govern how a biological cell or tissue interacts with a nanomaterial. Emphasis is given to the properties and roles of carbon nanotube systems at the relevant spatiotemporal scales of individual molecules, junctions and molecular layers, as well as to the point of view of a condensed matter or materials scientist. Carbon nanotube interactions with the blood-brain barrier, drug delivery, biocompatibility and functionalization issues are also covered.
Affiliation(s)
- Stefano A Mezzasalma
- Ruder Bošković Institute, Materials Physics Division, Bijeniška cesta 54, 10000 Zagreb, Croatia; Lund Institute for advanced Neutron and X-ray Science (LINXS), Lund University, IDEON Building, Delta 5, Scheelevägen 19, 223 70 Lund, Sweden.
- Lucia Grassi
- Department of Engineering and Architecture, Trieste University, via Valerio 6, I-34127 Trieste, Italy
- Mario Grassi
- Department of Engineering and Architecture, Trieste University, via Valerio 6, I-34127 Trieste, Italy.
11. Schmitt LM, Erb J, Tune S, Rysop AU, Hartwigsen G, Obleser J. Predicting speech from a cortical hierarchy of event-based time scales. Sci Adv 2021; 7:eabi6070. PMID: 34860554; PMCID: PMC8641937; DOI: 10.1126/sciadv.abi6070.
Abstract
How do predictions in the brain incorporate the temporal unfolding of context in our natural environment? We here provide evidence for a neural coding scheme that sparsely updates contextual representations at the boundary of events. This yields a hierarchical, multilayered organization of predictive language comprehension. Training artificial neural networks to predict the next word in a story at five stacked time scales and then using model-based functional magnetic resonance imaging, we observe an event-based “surprisal hierarchy” evolving along a temporoparietal pathway. Along this hierarchy, surprisal at any given time scale gated bottom-up and top-down connectivity to neighboring time scales. In contrast, surprisal derived from continuously updated context influenced temporoparietal activity only at short time scales. Representing context in the form of increasingly coarse events constitutes a network architecture for making predictions that is both computationally efficient and contextually diverse.
Affiliation(s)
- Lea-Maria Schmitt
- Department of Psychology, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Center of Brain, Behavior and Metabolism, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Julia Erb
- Department of Psychology, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Center of Brain, Behavior and Metabolism, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Sarah Tune
- Department of Psychology, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Center of Brain, Behavior and Metabolism, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Anna U. Rysop
- Lise Meitner Research Group Cognition and Plasticity, Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstraße 1 A, 04103 Leipzig, Germany
- Gesa Hartwigsen
- Lise Meitner Research Group Cognition and Plasticity, Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstraße 1 A, 04103 Leipzig, Germany
- Jonas Obleser
- Department of Psychology, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
- Center of Brain, Behavior and Metabolism, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Germany
12. Romero-Sosa JL, Motanis H, Buonomano DV. Differential Excitability of PV and SST Neurons Results in Distinct Functional Roles in Inhibition Stabilization of Up States. J Neurosci 2021; 41:7182-7196. PMID: 34253625; PMCID: PMC8387123; DOI: 10.1523/jneurosci.2830-20.2021.
Abstract
Up states are the best-studied example of an emergent neural dynamic regime. Computational models based on a single class of inhibitory neurons indicate that Up states reflect bistable dynamic systems in which positive feedback is stabilized by strong inhibition and predict a paradoxical effect in which increased drive to inhibitory neurons results in decreased inhibitory activity. To date, however, computational models have not incorporated empirically defined properties of parvalbumin (PV) and somatostatin (SST) neurons. Here we first experimentally characterized the frequency-current (F-I) curves of pyramidal (Pyr), PV, and SST neurons from mice of either sex, and confirmed a sharp difference between the thresholds and slopes of PV and SST neurons. The empirically defined F-I curves were incorporated into a three-population computational model that simulated the empirically derived firing rates of pyramidal, PV, and SST neurons. Simulations revealed that the intrinsic properties were sufficient to predict that PV neurons are primarily responsible for generating the nontrivial fixed points representing Up states. Simulations and analytical methods demonstrated that while the paradoxical effect is not obligatory in a model with two classes of inhibitory neurons, it is present in most regimes. Finally, experimental tests validated predictions of the model that the Pyr ↔ PV inhibitory loop is stronger than the Pyr ↔ SST loop.

SIGNIFICANCE STATEMENT Many cortical computations, such as working memory, rely on the local recurrent excitatory connections that define cortical circuit motifs. Up states are among the best-studied examples of neural dynamic regimes that rely on recurrent excitation. However, this positive feedback must be held in check by inhibition. To address the relative contributions of PV and SST neurons, we characterized the intrinsic input-output differences between these classes of inhibitory neurons and, using experimental and theoretical methods, show that the higher threshold and gain of PV neurons lead to a dominant role in network stabilization.
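The paradoxical effect the abstract refers to can be reproduced in a stripped-down two-population inhibition-stabilized rate model. The paper's model has three populations with empirically fitted F-I curves; the threshold-linear gain, weights, and inputs below are illustrative choices that merely place the toy model in the inhibition-stabilized regime.

```python
def simulate(I_I, T=40.0, dt=0.02):
    """Rate model of an inhibition-stabilized network (ISN): strong
    recurrent excitation (w_EE > 1) held in check by inhibition."""
    w_EE, w_EI, w_IE, w_II = 2.0, 2.5, 3.0, 2.0
    I_E = 1.0
    r_E = r_I = 0.0
    f = lambda x: max(x, 0.0)            # threshold-linear F-I curve
    for _ in range(int(T / dt)):
        dE = -r_E + f(w_EE * r_E - w_EI * r_I + I_E)
        dI = -r_I + f(w_IE * r_E - w_II * r_I + I_I)
        r_E += dt * dE
        r_I += dt * dI
    return r_E, r_I

_, r_I_base = simulate(I_I=0.5)
_, r_I_more = simulate(I_I=0.8)
# paradoxical effect: extra drive to the inhibitory population *lowers*
# its steady-state rate, because the excitatory population withdraws
# recurrent drive even faster
```

Solving the steady state by hand for these weights gives r_I = 5/9 at I_I = 0.5 and a smaller r_I at I_I = 0.8, which is the signature used experimentally to identify inhibition stabilization.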
Affiliation(s)
- Juan L Romero-Sosa
- Department of Neurobiology, Integrative Center for Learning and Memory, University of California, Los Angeles, Los Angeles, California 90095
- Department of Psychology, University of California, Los Angeles, Los Angeles, California 90095
- Helen Motanis
- Department of Neurobiology, Integrative Center for Learning and Memory, University of California, Los Angeles, Los Angeles, California 90095
- Department of Neurosurgery, University of California, Los Angeles, Los Angeles, California 90095
- Dean V Buonomano
- Department of Neurobiology, Integrative Center for Learning and Memory, University of California, Los Angeles, Los Angeles, California 90095
- Department of Psychology, University of California, Los Angeles, Los Angeles, California 90095
13. Badman RP, Hills TT, Akaishi R. Multiscale Computation and Dynamic Attention in Biological and Artificial Intelligence. Brain Sci 2020; 10:E396. PMID: 32575758; PMCID: PMC7348831; DOI: 10.3390/brainsci10060396.
Abstract
Biological and artificial intelligence (AI) are often defined by their capacity to achieve a hierarchy of short-term and long-term goals that require incorporating information over time and space at both local and global scales. More advanced forms of this capacity involve the adaptive modulation of integration across scales, which resolves computational inefficiency and explore-exploit dilemmas at the same time. Research in neuroscience and AI has made progress towards understanding architectures that achieve this. Insight into biological computations comes from phenomena such as decision inertia, habit formation, information search, risky choices and foraging. Across these domains, the brain is equipped with mechanisms (such as the dorsal anterior cingulate and dorsolateral prefrontal cortex) that can represent and modulate across scales, both with top-down control processes and by local-to-global consolidation as information progresses from sensory to prefrontal areas. Paralleling these biological architectures, progress in AI is marked by innovations in dynamic multiscale modulation, moving from recurrent and convolutional neural networks (with fixed scalings) to attention, transformers, dynamic convolutions, and consciousness priors (which modulate scale to input and increase scale breadth). The use and development of these multiscale innovations in robotic agents, game AI, and natural language processing (NLP) are pushing the boundaries of AI achievements. By juxtaposing biological and artificial intelligence, the present work underscores the critical importance of multiscale processing to general intelligence, as well as highlighting innovations and differences between the future of biological and artificial intelligence.
Affiliation(s)
- Rei Akaishi
- Center for Brain Science, RIKEN, Saitama 351-0198, Japan
14
Stapmanns J, Kühn T, Dahmen D, Luu T, Honerkamp C, Helias M. Self-consistent formulations for stochastic nonlinear neuronal dynamics. Phys Rev E 2020; 101:042124. [PMID: 32422832 DOI: 10.1103/physreve.101.042124]
Abstract
Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular, classical bifurcation theory cannot be applied. We formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better-known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez, and Wschebor approximation from the derivative expansion to the vertex expansion. The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics.
Affiliation(s)
- Jonas Stapmanns
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- Tobias Kühn
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Thomas Luu
- Institut für Kernphysik (IKP-3), Institute for Advanced Simulation (IAS-4) and Jülich Center for Hadron Physics, Jülich Research Centre, Jülich, Germany
- Carsten Honerkamp
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany; JARA-FIT, Jülich Aachen Research Alliance-Fundamentals of Future Information Technology, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
15
Biophysically grounded mean-field models of neural populations under electrical stimulation. PLoS Comput Biol 2020; 16:e1007822. [PMID: 32324734 PMCID: PMC7200022 DOI: 10.1371/journal.pcbi.1007822]
Abstract
Electrical stimulation of neural systems is a key tool for understanding neural dynamics and ultimately for developing clinical treatments. Many applications of electrical stimulation affect large populations of neurons. However, computational models of large networks of spiking neurons are inherently hard to simulate and analyze. We evaluate a reduced mean-field model of excitatory and inhibitory adaptive exponential integrate-and-fire (AdEx) neurons which can be used to efficiently study the effects of electrical stimulation on large neural populations. The rich dynamical properties of this basic cortical model are described in detail and validated using large network simulations. Bifurcation diagrams reflecting the network's state reveal asynchronous up- and down-states, bistable regimes, and oscillatory regions corresponding to fast excitation-inhibition and slow excitation-adaptation feedback loops. The biophysical parameters of the AdEx neuron can be coupled to an electric field with realistic field strengths, which can then be propagated up to the population description. We show how, on the edge of a bifurcation, direct electrical inputs cause network state transitions, such as turning oscillations of the population rate on and off. Oscillatory input can frequency-entrain and phase-lock endogenous oscillations. Relatively weak electric field strengths, on the order of 1 V/m, are able to produce these effects, indicating that field effects are strongly amplified in the network. The effects of time-varying external stimulation are well predicted by the mean-field model, further underpinning the utility of low-dimensional neural mass models.
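The population model above is built from adaptive exponential integrate-and-fire (AdEx) units. As a rough illustration of that single-neuron building block (a plain Euler integration with the standard Brette-Gerstner parameter set, not the paper's mean-field code), one might write:

```python
import math

def simulate_adex(i_ext_pa, t_ms=200.0, dt=0.05):
    """Count spikes of one AdEx neuron driven by a constant current (pA).

    Parameters are the standard Brette-Gerstner values; the paper couples
    such units into excitatory and inhibitory populations.
    """
    c_m, g_l, e_l = 281.0, 30.0, -70.6          # pF, nS, mV
    v_t, d_t = -50.4, 2.0                       # threshold and slope factor, mV
    tau_w, a, b, v_r = 144.0, 4.0, 80.5, -70.6  # ms, nS, pA, mV
    v, w, spikes = e_l, 0.0, 0
    for _ in range(int(t_ms / dt)):
        dv = (-g_l * (v - e_l) + g_l * d_t * math.exp((v - v_t) / d_t)
              - w + i_ext_pa) / c_m
        dw = (a * (v - e_l) - w) / tau_w
        v += dt * dv
        w += dt * dw
        if v >= v_t + 5.0 * d_t:                # spike: reset and adapt
            v, w = v_r, w + b
            spikes += 1
    return spikes

# With no input the neuron stays at rest; a sufficiently strong step current
# produces repetitive firing, slowed over time by the adaptation variable w.
```

The adaptation variable `w` is what generates the slow excitation-adaptation feedback loop mentioned in the abstract.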
16
Mondal A, Sharma SK, Upadhyay RK, Mondal A. Firing activities of a fractional-order FitzHugh-Rinzel bursting neuron model and its coupled dynamics. Sci Rep 2019; 9:15721. [PMID: 31673009 PMCID: PMC6823374 DOI: 10.1038/s41598-019-52061-4]
Abstract
Fractional-order dynamics of excitable systems can be physically described as a memory-dependent phenomenon and can produce diverse and fascinating oscillatory patterns for certain types of neuron models. To address these characteristics, we consider a nonlinear fast-slow FitzHugh-Rinzel (FH-R) model that exhibits elliptic bursting at a fixed set of parameters with a constant input current. Generalizing this classical-order model yields a wide range of neuronal responses (regular spiking, fast spiking, bursting, mixed-mode oscillations, etc.) relevant to understanding single-neuron dynamics. So far, it is not completely understood to what extent fractional-order dynamics may redesign the firing properties of excitable systems. Using stability and bifurcation analysis, we investigate how the classical-order system changes its complex dynamics and how bursting transitions to other oscillations depending on the fractional exponent (0 < α ≤ 1). This occurs due to the memory trace of the fractional-order dynamics. The firing frequency of the fractional-order FH-R model is lower than that of the classical-order model, although a first-spike latency is present. Further, we investigate the responses of coupled FH-R neurons with small coupling strengths, which synchronize at specific fractional orders. These dynamical characteristics suggest various neurocomputational features that can be induced in this fractional-order system, enriching its functional neuronal mechanisms.
Affiliation(s)
- Argha Mondal
- Computational Neuroscience Center, University of Washington, Seattle, Washington, USA
- Sanjeev Kumar Sharma
- Department of Mathematics & Computing, Indian Institute of Technology (Indian School of Mines), Dhanbad, 826004, India
- Ranjit Kumar Upadhyay
- Department of Mathematics & Computing, Indian Institute of Technology (Indian School of Mines), Dhanbad, 826004, India
- Arnab Mondal
- Department of Mathematics & Computing, Indian Institute of Technology (Indian School of Mines), Dhanbad, 826004, India
17
Inferring and validating mechanistic models of neural microcircuits based on spike-train data. Nat Commun 2019; 10:4933. [PMID: 31666513 PMCID: PMC6821748 DOI: 10.1038/s41467-019-12572-0]
Abstract
The interpretation of neuronal spike train recordings often relies on abstract statistical models that allow for principled parameter estimation and model selection but provide only limited insights into underlying microcircuits. In contrast, mechanistic models are useful to interpret microcircuit dynamics, but are rarely quantitatively matched to experimental data due to methodological challenges. Here we present analytical methods to efficiently fit spiking circuit models to single-trial spike trains. Using derived likelihood functions, we statistically infer the mean and variance of hidden inputs, neuronal adaptation properties and connectivity for coupled integrate-and-fire neurons. Comprehensive evaluations on synthetic data, validations using ground truth in-vitro and in-vivo recordings, and comparisons with existing techniques demonstrate that parameter estimation is very accurate and efficient, even for highly subsampled networks. Our methods bridge statistical, data-driven and theoretical, model-based neurosciences at the level of spiking circuits, for the purpose of a quantitative, mechanistic interpretation of recorded neuronal population activity.

It is difficult to fit mechanistic, biophysically constrained circuit models to spike train data from in vivo extracellular recordings. Here the authors present analytical methods that enable efficient parameter estimation for integrate-and-fire circuit models and inference of the underlying connectivity structure in subsampled networks.
18
Barranca VJ, Zhou D. Compressive Sensing Inference of Neuronal Network Connectivity in Balanced Neuronal Dynamics. Front Neurosci 2019; 13:1101. [PMID: 31680835 PMCID: PMC6811502 DOI: 10.3389/fnins.2019.01101]
Abstract
Determining the structure of a network is of central importance to understanding its function in both neuroscience and applied mathematics. However, recovering the structural connectivity of neuronal networks remains a fundamental challenge both theoretically and experimentally. While neuronal networks operate in certain dynamical regimes, which may influence their connectivity reconstruction, there is widespread experimental evidence of a balanced neuronal operating state in which strong excitatory and inhibitory inputs are dynamically adjusted such that neuronal voltages primarily remain near resting potential. Utilizing the dynamics of model neurons in such a balanced regime in conjunction with the ubiquitous sparse connectivity structure of neuronal networks, we develop a compressive sensing theoretical framework for efficiently reconstructing network connections by measuring individual neuronal activity in response to a relatively small ensemble of random stimuli injected over a short time scale. By tuning the network dynamical regime, we determine that the highest fidelity reconstructions are achievable in the balanced state. We hypothesize the balanced dynamics observed in vivo may therefore be a result of evolutionary selection for optimal information encoding and expect the methodology developed to be generalizable for alternative model networks as well as experimental paradigms.
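At its core, the reconstruction step described above is an l1-regularized linear inverse problem: responses to a small random stimulus ensemble, combined with sparsity, pin down the connections. The sketch below is a generic sparse-recovery demo using plain iterative soft-thresholding (ISTA) on synthetic data, not the authors' network-dynamics pipeline:

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=3000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L      # gradient step on the quadratic term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5                       # neurons, random stimuli, true in-degree
w = np.zeros(n)                            # sparse incoming-weight vector of one neuron
w[rng.choice(n, size=k, replace=False)] = rng.normal(1.0, 0.2, size=k)
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))  # random stimulus ensemble
y = A @ w                                  # measured (noise-free) responses
w_hat = ista(A, y)                         # sparse estimate of the k connections
```

With m = 60 measurements, far fewer than the n = 200 unknowns, the sparse weight vector is typically recovered accurately, which is the compressive sensing point of the abstract.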
Affiliation(s)
- Victor J Barranca
- Department of Mathematics and Statistics, Swarthmore College, Swarthmore, PA, United States
- Douglas Zhou
- School of Mathematical Sciences, Shanghai Jiao Tong University, Shanghai, China; Ministry of Education Key Laboratory of Scientific and Engineering Computing, Shanghai Jiao Tong University, Shanghai, China; Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
19
Lim S. Mechanisms underlying sharpening of visual response dynamics with familiarity. eLife 2019; 8:44098. [PMID: 31393260 PMCID: PMC6711664 DOI: 10.7554/elife.44098]
Abstract
Experience-dependent modifications of synaptic connections are thought to change patterns of network activity and stimulus tuning with learning. However, only a few studies have explored how synaptic plasticity shapes the response dynamics of cortical circuits. Here, we investigated the mechanism underlying the sharpening of both stimulus selectivity and response dynamics with familiarity observed in monkey inferotemporal cortex. A broader distribution of activities and stronger oscillations in the response dynamics after learning provide evidence for synaptic plasticity in recurrent connections modifying the strength of positive feedback. Its interplay with slow negative feedback via firing-rate adaptation is critical in sharpening response dynamics. Analysis of changes in temporal patterns also enables us to disentangle recurrent and feedforward synaptic plasticity and provides a measure of the strength of recurrent synaptic plasticity. Overall, this work highlights the importance of analyzing changes in dynamics as well as network patterns to further reveal the mechanisms of visual learning.
Affiliation(s)
- Sukbin Lim
- Neural Science, NYU Shanghai, Shanghai, China; NYU-ECNU Institute of Brain and Cognitive Science, NYU Shanghai, Shanghai, China
20
Muscinelli SP, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol 2019; 15:e1007122. [PMID: 31181063 PMCID: PMC6586367 DOI: 10.1371/journal.pcbi.1007122]
Abstract
While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of "resonant chaos", characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.
Affiliation(s)
- Samuel P. Muscinelli
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, École polytechnique fédérale de Lausanne, Station 15, CH-1015 Lausanne EPFL, Switzerland
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
21
Hertäg L, Sprekeler H. Amplifying the redistribution of somato-dendritic inhibition by the interplay of three interneuron types. PLoS Comput Biol 2019; 15:e1006999. [PMID: 31095556 PMCID: PMC6541306 DOI: 10.1371/journal.pcbi.1006999]
Abstract
GABAergic interneurons play an important role in shaping the activity of excitatory pyramidal cells (PCs). How the various inhibitory cell types contribute to neuronal information processing, however, is not resolved. Here, we propose a functional role for a widespread network motif consisting of parvalbumin- (PV), somatostatin- (SOM) and vasoactive intestinal peptide (VIP)-expressing interneurons. Following the idea that PV and SOM interneurons control the distribution of somatic and dendritic inhibition onto PCs, we suggest that mutual inhibition between VIP and SOM cells translates weak inputs to VIP interneurons into large changes of somato-dendritic inhibition of PCs. Using a computational model, we show that the neuronal and synaptic properties of the circuit support this hypothesis. Moreover, we demonstrate that the SOM-VIP motif allows transient inputs to persistently switch the circuit between two processing modes, in which top-down inputs onto apical dendrites of PCs are either integrated or cancelled.

Neurons in the brain can be classified as excitatory or inhibitory based on whether they activate or deactivate the cells to which they send signals. Compared to their excitatory counterparts, inhibitory neurons present themselves as a wild diversity of cell classes. It is broadly believed that these classes serve different purposes, but as of now those purposes are poorly understood. In this article, we suggest how an intricate interplay of three inhibitory cell classes can control whether internal signals, such as predictions, memory signals or motor commands, are taken into account when sensory signals are interpreted. Using a mathematical model and computer simulations, we show that such internal signals can be shut down by regulating which inhibitory cell types are active, and that the interaction of different cell classes allows weak control signals to do so.
Affiliation(s)
- Loreen Hertäg
- Modelling of Cognitive Processes, Berlin Institute of Technology, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
- Henning Sprekeler
- Modelling of Cognitive Processes, Berlin Institute of Technology, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
22
Deviation from the matching law reflects an optimal strategy involving learning over multiple timescales. Nat Commun 2019; 10:1466. [PMID: 30931937 PMCID: PMC6443814 DOI: 10.1038/s41467-019-09388-3]
Abstract
Behavior deviating from our normative expectations often appears irrational. For example, even though behavior following the so-called matching law can maximize reward in a stationary foraging task, actual behavior commonly deviates from matching. Such behavioral deviations are interpreted as a failure of the subject; here we instead suggest that they reflect an adaptive strategy, suitable for uncertain, non-stationary environments. To test this, we analyzed the behavior of primates performing a dynamic foraging task. In such a non-stationary environment, learning on both fast and slow timescales is beneficial: fast learning allows the animal to react to sudden changes, at the price of large fluctuations (variance) in the estimates of task-relevant variables. Slow learning reduces the fluctuations but introduces a bias that causes systematic behavioral deviations. Our behavioral analysis shows that the animals solved this bias-variance tradeoff by combining learning on both fast and slow timescales, suggesting that learning on multiple timescales can be a biologically plausible mechanism for optimizing decisions under uncertainty.

Recent experience can only provide limited information to guide decisions in a volatile environment. Here, the authors report that the choices made by a monkey in a dynamic foraging task can be better explained by a model that combines learning on both fast and slow timescales.
23
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092 PMCID: PMC6445477 DOI: 10.1371/journal.pcbi.1006893]
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. Altogether, our study demonstrates that the timescales of different biophysical processes have different effects at the network level, so that slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
Affiliation(s)
- Manuel Beiran
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
24
Barranca VJ, Huang H, Kawakita G. Network structure and input integration in competing firing rate models for decision-making. J Comput Neurosci 2019; 46:145-168. [PMID: 30661144 DOI: 10.1007/s10827-018-0708-6]
Abstract
Making a decision among numerous alternatives is a pervasive and central undertaking encountered by mammals in natural settings. While decision making for two-option tasks has been studied extensively both experimentally and theoretically, characterizing decision making in the face of a large set of alternatives remains challenging. We explore this issue by formulating a scalable mechanistic network model for decision making and analyzing the dynamics evoked under various potential network structures. In the case of a fully-connected network, we provide an analytical characterization of the model fixed points and their stability with respect to winner-take-all behavior for fair tasks. We compare several means of input integration, demonstrating that a more gradual sigmoidal transfer function is likely evolutionarily advantageous relative to the binary gain commonly utilized in engineered systems. We show via asymptotic analysis and numerical simulation that sigmoidal transfer functions with smaller steepness yield faster response times but reduced accuracy. However, in the presence of noise or degradation of connections, a sigmoidal transfer function produces significantly more robust and accurate decision-making dynamics. For fair tasks and sigmoidal gain, our model network also exhibits a stable parameter regime that produces high accuracy and persists across tasks with diverse numbers of alternatives and difficulties, satisfying physiological energetic constraints. In the case of sparser and more structured network topologies, including random, regular, and small-world connectivity, we show that the high-accuracy parameter regime persists for biologically realistic connection densities. Our work shows how neural system architecture is potentially optimal for making economic, reliable, and advantageous decisions across tasks.
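A minimal instance of such a competition network, a handful of firing-rate units with self-excitation, lateral inhibition, and a sigmoidal transfer function, can be simulated directly. The parameters here are illustrative choices, not values from the paper:

```python
import numpy as np

def sigmoid(x, gain=4.0, thresh=0.5):
    """Gradual sigmoidal transfer function (gain controls its steepness)."""
    return 1.0 / (1.0 + np.exp(-gain * (x - thresh)))

def decide(inputs, w_self=1.2, w_inh=0.8, dt=0.1, steps=2000):
    """Euler-integrate dr_i/dt = -r_i + f(I_i + w_self*r_i - w_inh*sum_{j!=i} r_j)."""
    r = np.zeros_like(inputs)
    for _ in range(steps):
        recurrent = w_self * r - w_inh * (r.sum() - r)  # self-excitation minus lateral inhibition
        r = r + dt * (-r + sigmoid(inputs + recurrent))
    return r

evidence = np.array([0.30, 0.35, 0.30, 0.28])  # option 1 receives slightly more evidence
rates = decide(evidence)                       # winner-take-all: unit 1 saturates,
                                               # the remaining units are suppressed
```

Mutual inhibition destabilizes the symmetric low-activity state, so a small evidence advantage is amplified into a categorical choice, the winner-take-all behavior analyzed in the abstract.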
Affiliation(s)
- Han Huang
- Swarthmore College, 500 College Avenue, Swarthmore, PA, 19081, USA
- Genji Kawakita
- Swarthmore College, 500 College Avenue, Swarthmore, PA, 19081, USA
25
The impact of spike-frequency adaptation on balanced network dynamics. Cogn Neurodyn 2018; 13:105-120. [PMID: 30728874 DOI: 10.1007/s11571-018-9504-2]
Abstract
A dynamic balance between strong excitatory and inhibitory neuronal inputs is hypothesized to play a pivotal role in information processing in the brain. While there is evidence of the existence of a balanced operating regime in several cortical areas and idealized neuronal network models, it is important for the theory of balanced networks to be reconciled with more physiological neuronal modeling assumptions. In this work, we examine the impact of spike-frequency adaptation, observed widely across neurons in the brain, on balanced dynamics. We incorporate adaptation into binary and integrate-and-fire neuronal network models, analyzing the theoretical effect of adaptation in the large network limit and performing an extensive numerical investigation of the model adaptation parameter space. Our analysis demonstrates that balance is well preserved for moderate adaptation strength even if the entire network exhibits adaptation. In the common physiological case in which only excitatory neurons undergo adaptation, we show that the balanced operating regime in fact widens relative to the non-adaptive case. We hypothesize that spike-frequency adaptation may have been selected through evolution to robustly facilitate balanced dynamics across diverse cognitive operating states.
26
Li X, Yamawaki N, Barrett JM, Körding KP, Shepherd GMG. Scaling of Optogenetically Evoked Signaling in a Higher-Order Corticocortical Pathway in the Anesthetized Mouse. Front Syst Neurosci 2018; 12:16. [PMID: 29867381 PMCID: PMC5962832 DOI: 10.3389/fnsys.2018.00016]
Abstract
Quantitative analysis of corticocortical signaling is needed to understand and model information processing in cerebral networks. However, higher-order pathways, hodologically remote from sensory input, are not amenable to spatiotemporally precise activation by sensory stimuli. Here, we combined parametric channelrhodopsin-2 (ChR2) photostimulation with multi-unit electrophysiology to study corticocortical driving in a parietofrontal pathway from retrosplenial cortex (RSC) to posterior secondary motor cortex (M2) in mice in vivo. Ketamine anesthesia was used both to eliminate complex activity associated with the awake state and to enable stable recordings of responses over a wide range of stimulus parameters. Photostimulation of ChR2-expressing neurons in RSC, the upstream area, produced local activity that decayed quickly. This activity in turn drove downstream activity in M2 that arrived rapidly (5-10 ms latencies), and scaled in amplitude across a wide range of stimulus parameters as an approximately constant fraction (~0.1) of the upstream activity. A model-based analysis could explain the corticocortically driven activity with exponentially decaying kernels (~20 ms time constant) and small delay. Reverse (antidromic) driving was similarly robust. The results show that corticocortical signaling in this pathway drives downstream activity rapidly and scalably, in a mostly linear manner. These properties, identified in anesthetized mice and represented in a simple model, suggest a robust basis for supporting complex non-linear dynamic activity in corticocortical circuits in the awake state.
Affiliation(s)
- Xiaojian Li: Department of Physiology, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States
- Naoki Yamawaki: Department of Physiology, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States
- John M. Barrett: Department of Physiology, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States
- Konrad P. Körding: Department of Physiology and Department of Physical Medicine and Rehabilitation, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States; Department of Bioengineering, School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, PA, United States
- Gordon M. G. Shepherd: Department of Physiology, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States
27

Devalle F, Roxin A, Montbrió E. Firing rate equations require a spike synchrony mechanism to correctly describe fast oscillations in inhibitory networks. PLoS Comput Biol 2017; 13:e1005881. PMID: 29287081. PMCID: PMC5764488. DOI: 10.1371/journal.pcbi.1005881.
Abstract
Recurrently coupled networks of inhibitory neurons robustly generate oscillations in the gamma band. Nonetheless, the corresponding Wilson-Cowan type firing rate equation for such an inhibitory population does not generate such oscillations without an explicit time delay. We show that this discrepancy is due to a voltage-dependent spike-synchronization mechanism inherent in networks of spiking neurons which is not captured by standard firing rate equations. Here we investigate an exact low-dimensional description for a network of heterogeneous canonical Class 1 inhibitory neurons which includes the sub-threshold dynamics crucial for generating synchronous states. In the limit of slow synaptic kinetics the spike-synchrony mechanism is suppressed and the standard Wilson-Cowan equations are formally recovered, as long as external inputs are also slow. However, even in this limit synchronous spiking can be elicited by inputs which fluctuate on the time scale of the membrane time constant of the neurons. Our mean-field equations therefore represent an extension of the standard Wilson-Cowan equations in which spike synchrony is also correctly described.

Population models describing the average activity of large neuronal ensembles are a powerful mathematical tool for investigating the principles underlying the cooperative function of large neuronal systems. However, these models do not properly describe the phenomenon of spike synchrony in networks of neurons. In particular, they fail to capture the onset of synchronous oscillations in networks of inhibitory neurons. We show that this limitation is due to a voltage-dependent synchronization mechanism which is naturally present in spiking neuron models but not captured by traditional firing rate equations. Here we investigate a novel set of macroscopic equations which incorporate both firing rate and membrane potential dynamics, and which correctly generate fast inhibition-based synchronous oscillations. In the limit of slow synaptic processing these oscillations are suppressed, and the model reduces to an equation formally equivalent to the Wilson-Cowan model.
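The exact mean-field descriptions referred to here couple a population firing rate r to a mean membrane potential v, unlike Wilson-Cowan equations, which track rate alone. Below is a minimal Euler-integration sketch of rate-plus-voltage equations of this general form for an inhibitory population with first-order synaptic kinetics; the structure follows published exact QIF mean-field models, but all parameter values here are illustrative assumptions, not the paper's.

```python
import numpy as np

def simulate_fre(T=100.0, dt=1e-3, tau=1.0, delta=1.0, eta=5.0,
                 J=15.0, tau_s=0.5):
    """Euler integration of coupled firing-rate (r), mean-voltage (v), and
    synaptic (s) variables for a recurrently coupled inhibitory population."""
    n = int(T / dt)
    r, v, s = 0.1, 0.0, 0.0
    r_trace = np.empty(n)
    for i in range(n):
        dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
        dv = (v * v + eta - (np.pi * tau * r) ** 2 - J * tau * s) / tau
        ds = (r - s) / tau_s                # first-order synaptic kinetics
        r += dt * dr
        v += dt * dv
        s += dt * ds
        r_trace[i] = r
    return r_trace

rates = simulate_fre()
```

The key difference from Wilson-Cowan is the 2rv term coupling rate to voltage: with fast synapses this family of models can express the spike-synchrony mechanism discussed above, while slowing the synaptic variable damps it out.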
Affiliation(s)
- Federico Devalle: Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain; Department of Physics, Lancaster University, Lancaster, United Kingdom
- Alex Roxin: Centre de Recerca Matemàtica, Campus de Bellaterra, Edifici C, Bellaterra, Barcelona, Spain
- Ernest Montbrió: Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
28

D’Albis T, Kempter R. A single-cell spiking model for the origin of grid-cell patterns. PLoS Comput Biol 2017; 13:e1005782. PMID: 28968386. PMCID: PMC5638623. DOI: 10.1371/journal.pcbi.1005782.
Abstract
Spatial cognition in mammals is thought to rely on the activity of grid cells in the entorhinal cortex, yet the fundamental principles underlying the origin of grid-cell firing are still debated. Grid-like patterns could emerge via Hebbian learning and neuronal adaptation, but current computational models have remained too abstract to allow direct confrontation with experimental data. Here, we propose a single-cell spiking model that generates grid firing fields via spike-rate adaptation and spike-timing-dependent plasticity. Through rigorous mathematical analysis applicable in the linear limit, we quantitatively predict the requirements for grid-pattern formation, and we establish a direct link to classical pattern-forming systems of the Turing type. Our study lays the groundwork for biophysically realistic models of grid-cell activity.
Affiliation(s)
- Tiziano D’Albis: Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Richard Kempter: Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Einstein Center for Neurosciences Berlin, Berlin, Germany
29

Teka WW, Upadhyay RK, Mondal A. Fractional-order leaky integrate-and-fire model with long-term memory and power law dynamics. Neural Netw 2017; 93:110-125. PMID: 28575735. DOI: 10.1016/j.neunet.2017.05.007.
Abstract
Pyramidal neurons produce different spiking patterns to process information, communicate with each other and transform information. These spiking patterns have complex, multiple-time-scale dynamics that have been described with the fractional-order leaky integrate-and-fire (FLIF) model. Models with fractional (non-integer) order differentiation, which generalize power law dynamics, can be used to describe complex temporal voltage dynamics. The main characteristic of the FLIF model is that it depends on all past values of the voltage, which gives rise to long-term memory. The model produces spikes with high interspike interval variability and displays several spiking properties such as upward spike-frequency adaptation and long spike latency in response to a constant stimulus. We show that the subthreshold voltage and the firing rate of the fractional-order model make transitions from exponential to power law dynamics when the fractional order α decreases from 1 to smaller values. The firing rate displays different types of spike timing adaptation caused by changes in initial values. We also show that the voltage-memory trace and fractional coefficient are the causes of these different types of spiking properties. The voltage-memory trace that represents the long-term memory has a feedback regulatory mechanism and affects spiking activity. The results suggest that fractional-order models might be appropriate for understanding multiple time scale neuronal dynamics. Overall, a neuron with fractional dynamics displays history-dependent activities that might be very useful and powerful for effective information processing.
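The dependence on all past voltage values can be made concrete with the L1 discretization of the Caputo fractional derivative, a scheme commonly used to integrate fractional-order integrate-and-fire models. The sketch below is illustrative (the parameter values are assumptions and the reset handling is simplified); for α = 1 the memory weights vanish and the classic Euler-integrated LIF is recovered.

```python
import numpy as np
from math import gamma

def fractional_lif(alpha, I=1.5, T=100.0, dt=0.1,
                   tau=10.0, v_th=1.0, v_reset=0.0):
    """Fractional-order LIF via the L1 scheme for the Caputo derivative:
    each update subtracts a memory trace over all past voltage increments.
    For alpha = 1 the memory weights vanish and Euler LIF is recovered."""
    n_steps = int(T / dt)
    v = np.zeros(n_steps + 1)
    spikes = []
    coef = dt**alpha * gamma(2.0 - alpha)
    for n in range(n_steps):
        if n > 0:
            k = np.arange(n)
            w = (n - k + 1.0) ** (1.0 - alpha) - (n - k) ** (1.0 - alpha)
            memory = np.dot(np.diff(v[: n + 1]), w)   # voltage-memory trace
        else:
            memory = 0.0
        dv = (-v[n] + I) / tau
        v[n + 1] = v[n] + coef * dv - memory
        if v[n + 1] >= v_th:
            v[n + 1] = v_reset
            spikes.append((n + 1) * dt)
    return v, spikes
```

The memory term makes each voltage step depend on the entire increment history, which is what produces the long spike latencies and history-dependent adaptation described in the abstract.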
Affiliation(s)
- Wondimu W Teka: UTSA Neurosciences Institute, The University of Texas at San Antonio, San Antonio, TX, USA
- Ranjit Kumar Upadhyay: Department of Applied Mathematics, Indian Institute of Technology (Indian School of Mines), Dhanbad-826004, Jharkhand, India
- Argha Mondal: Department of Applied Mathematics, Indian Institute of Technology (Indian School of Mines), Dhanbad-826004, Jharkhand, India
30

Braun W, Thul R, Longtin A. Evolution of moments and correlations in nonrenewal escape-time processes. Phys Rev E 2017; 95:052127. PMID: 28618562. DOI: 10.1103/physreve.95.052127.
Abstract
The theoretical description of nonrenewal stochastic systems is a challenge. Analytical results are often not available or can be obtained only under strong conditions, limiting their applicability. Also, numerical results have mostly been obtained by ad hoc Monte Carlo simulations, which are usually computationally expensive when a high degree of accuracy is needed. To gain quantitative insight into these systems under general conditions, we here introduce a numerical iterated first-passage time approach based on solving the time-dependent Fokker-Planck equation (FPE) to describe the statistics of nonrenewal stochastic systems. We illustrate the approach using spike-triggered neuronal adaptation in the leaky and perfect integrate-and-fire models. The transition to stationarity of first-passage time moments and their sequential correlations occurs on a nontrivial time scale that depends on all system parameters. Surprisingly, this is so for both single-exponential and scale-free power-law adaptation. The method works beyond the small-noise and time-scale-separation approximations. It shows excellent agreement with direct Monte Carlo simulations, which allow for the computation of transient and stationary distributions. We compare different methods to compute the evolution of the moments and serial correlation coefficients (SCCs) and discuss the challenge of reliably computing the SCCs, which we find to be very sensitive to numerical inaccuracies for both the leaky and perfect integrate-and-fire models. In conclusion, our methods provide a general picture of nonrenewal dynamics in a wide range of stochastic systems exhibiting short- and long-range correlations.
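A concrete instance of the nonrenewal escape-time processes studied here is an integrate-and-fire neuron with spike-triggered adaptation: each spike increments a slowly decaying adaptation variable, so successive interspike intervals (ISIs) become serially correlated. The paper computes such statistics with an iterated first-passage-time/Fokker-Planck approach; the sketch below instead uses the kind of direct Monte Carlo simulation the authors compare against (all parameter values are illustrative assumptions).

```python
import numpy as np

def isi_serial_correlation(n_spikes=1000, dt=0.01, mu=2.0, D=0.1,
                           tau_a=3.0, delta_a=0.3, seed=0):
    """Leaky integrate-and-fire neuron with spike-triggered adaptation;
    returns the lag-1 serial correlation coefficient of its ISIs."""
    rng = np.random.default_rng(seed)
    v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
    isis = []
    while len(isis) < n_spikes:
        v += dt * (-v + mu - a) + np.sqrt(2.0 * D * dt) * rng.standard_normal()
        a += dt * (-a / tau_a)                 # adaptation decays slowly
        t += dt
        if v >= 1.0:
            v = 0.0
            a += delta_a                       # spike-triggered increment
            isis.append(t - t_last)
            t_last = t
    isis = np.asarray(isis)
    dev = isis - isis.mean()
    return float(np.sum(dev[:-1] * dev[1:]) / np.sum(dev**2))

scc = isi_serial_correlation()
```

With adaptation slower than the mean ISI, a long interval lets the adaptation variable decay, favoring a shorter next interval, so the lag-1 serial correlation coefficient comes out negative.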
Affiliation(s)
- Wilhelm Braun: Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada; University of Ottawa Brain and Mind Research Institute, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
- Rüdiger Thul: Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, United Kingdom
- André Longtin: Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada; University of Ottawa Brain and Mind Research Institute, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
31

Dettner A, Münzberg S, Tchumatchenko T. Temporal pairwise spike correlations fully capture single-neuron information. Nat Commun 2016; 7:13805. PMID: 27976717. PMCID: PMC5171810. DOI: 10.1038/ncomms13805.
Abstract
To crack the neural code and read out the information neural spikes convey, it is essential to understand how the information is coded and how much of it is available for decoding. To this end, it is indispensable to derive from first principles a minimal set of spike features containing the complete information content of a neuron. Here we present such a complete set of coding features. We show that temporal pairwise spike correlations fully determine the information conveyed by a single spiking neuron with finite temporal memory and stationary spike statistics. We reveal that interspike interval temporal correlations, which are often neglected, can significantly change the total information. Our findings provide a conceptual link between numerous disparate observations and recommend shifting the focus of future studies from addressing firing rates to addressing pairwise spike correlation functions as the primary determinants of neural information.
Affiliation(s)
- Amadeus Dettner: Theory of Neural Dynamics Group, Max Planck Institute for Brain Research, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
- Sabrina Münzberg: Theory of Neural Dynamics Group, Max Planck Institute for Brain Research, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
- Tatjana Tchumatchenko: Theory of Neural Dynamics Group, Max Planck Institute for Brain Research, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
32

The Impact of Structural Heterogeneity on Excitation-Inhibition Balance in Cortical Networks. Neuron 2016; 92:1106-1121. PMID: 27866797. PMCID: PMC5158120. DOI: 10.1016/j.neuron.2016.10.027.
Abstract
Models of cortical dynamics often assume a homogeneous connectivity structure. However, we show that heterogeneous input connectivity can prevent the dynamic balance between excitation and inhibition, a hallmark of cortical dynamics, and yield unrealistically sparse and temporally regular firing. Anatomically based estimates of the connectivity of layer 4 (L4) rat barrel cortex and numerical simulations of this circuit indicate that the local network possesses substantial heterogeneity in input connectivity, sufficient to disrupt excitation-inhibition balance. We show that homeostatic plasticity in inhibitory synapses can align the functional connectivity to compensate for structural heterogeneity. Alternatively, spike-frequency adaptation can give rise to a novel state in which local firing rates adjust dynamically so that adaptation currents and synaptic inputs are balanced. This theory is supported by simulations of L4 barrel cortex during spontaneous and stimulus-evoked conditions. Our study shows how synaptic and cellular mechanisms yield fluctuation-driven dynamics despite structural heterogeneity in cortical circuits.

Highlights:
- Structural heterogeneity threatens the dynamic balance of excitation and inhibition
- Reconstruction of cortical networks reveals significant structural heterogeneity
- Spike-frequency adaptation can act locally to facilitate global balance
- Inhibitory homeostatic plasticity can compensate for structural imbalance
33
Abstract
Adaptation is fundamental to life. All organisms adapt over timescales that span from evolution to generations and from lifetimes to moment-by-moment interactions. The nervous system is particularly adept at rapidly adapting to change, and this may in fact be one of its fundamental principles of organization and function. Rapid forms of sensory adaptation have been well documented across all sensory modalities in a wide range of organisms, yet we do not have a comprehensive understanding of the adaptive cellular mechanisms that ultimately give rise to the corresponding percepts, due in part to the complexity of the circuitry. In this Perspective, we aim to build links between adaptation at multiple scales of neural circuitry by examining differential adaptation across brain regions, sub-regions, and specific cell types, an investigation that the explosion of modern tools has only just begun to enable. This investigation points to a set of challenges for the field in linking functional observations to the adaptive properties of the neural circuits that ultimately underlie percepts.
Affiliation(s)
- Clarissa J Whitmire: Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA 30332, USA
- Garrett B Stanley: Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA 30332, USA
34

Barranca VJ, Zhou D, Cai D. Compressive sensing reconstruction of feed-forward connectivity in pulse-coupled nonlinear networks. Phys Rev E 2016; 93:060201. PMID: 27415190. DOI: 10.1103/physreve.93.060201.
Abstract
Utilizing the sparsity ubiquitous in real-world network connectivity, we develop a theoretical framework for efficiently reconstructing sparse feed-forward connections in a pulse-coupled nonlinear network through its output activities. Using only a small ensemble of random inputs, we solve this inverse problem through compressive sensing theory, based on a hidden linear structure intrinsic to the nonlinear network dynamics. The accuracy of the reconstruction is further verified by the fact that complex inputs can be well recovered using the reconstructed connectivity. We expect this Rapid Communication to provide a new perspective for understanding the structure-function relationship, as well as the compressive sensing principle, in nonlinear network dynamics.
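Stripped of the network dynamics, the reconstruction problem has the standard compressive-sensing form y = Ax with sparse x (here standing in for feed-forward weights probed by a small ensemble of random inputs). The paper recovers x through a hidden linear structure in the nonlinear dynamics; the toy below only illustrates the sparse-recovery step on a synthetic linear system, using orthogonal matching pursuit as one standard solver (all problem sizes and values are assumptions).

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A that
    explain y, re-fitting the selected coefficients by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 100, 60, 5              # 100 weights, 60 random probes, 5 nonzero
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)
y = A @ x_true                    # noiseless "measurements"
x_hat = omp(A, y, k)
```

With far fewer probes than unknowns (60 versus 100), the sparse weight vector is still recovered, which is the essence of the compressive-sensing argument in the abstract.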
Affiliation(s)
- Victor J Barranca: Department of Mathematics and Statistics, Swarthmore College, Swarthmore, Pennsylvania 19081, USA
- Douglas Zhou: Department of Mathematics, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
- David Cai: Department of Mathematics, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China; Courant Institute of Mathematical Sciences and Center for Neural Science, New York University, New York, New York 10012, USA; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
35

Mazzucato L, Fontanini A, La Camera G. Stimuli Reduce the Dimensionality of Cortical Activity. Front Syst Neurosci 2016; 10:11. PMID: 26924968. PMCID: PMC4756130. DOI: 10.3389/fnsys.2016.00011.
Abstract
The activity of ensembles of simultaneously recorded neurons can be represented as a set of points in the space of firing rates. Even though the dimension of this space is equal to the ensemble size, neural activity can be effectively localized on smaller subspaces. The dimensionality of the neural space is an important determinant of the computational tasks supported by the neural activity. Here, we investigate the dimensionality of neural ensembles from the sensory cortex of alert rats during periods of ongoing (inter-trial) and stimulus-evoked activity. We find that dimensionality grows linearly with ensemble size, and grows significantly faster during ongoing activity compared to evoked activity. We explain these results using a spiking network model based on a clustered architecture. The model captures the difference in growth rate between ongoing and evoked activity and predicts a characteristic scaling with ensemble size that could be tested in high-density multi-electrode recordings. Moreover, we present a simple theory that predicts the existence of an upper bound on dimensionality. This upper bound is inversely proportional to the amount of pair-wise correlations and, compared to a homogeneous network without clusters, it is larger by a factor equal to the number of clusters. The empirical estimation of such bounds depends on the number and duration of trials and is well predicted by the theory. Together, these results provide a framework to analyze neural dimensionality in alert animals, its behavior under stimulus presentation, and its theoretical dependence on ensemble size, number of clusters, and correlations in spiking network models.
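A common way to quantify the dimensionality discussed here is the participation ratio of the covariance eigenvalues, d = (Σλ)² / Σλ². The sketch below shows, for the idealized case of uniform pairwise correlation c, how d saturates with ensemble size (in this toy covariance d approaches 1/c²; the paper derives its bound for clustered networks, which this sketch does not reproduce).

```python
import numpy as np

def participation_ratio(cov):
    """Dimensionality d = (sum of eigenvalues)^2 / (sum of squared eigenvalues)."""
    lam = np.linalg.eigvalsh(cov)
    return float(lam.sum() ** 2 / np.sum(lam**2))

def uniform_corr(n, c):
    """Covariance with unit variances and uniform pairwise correlation c."""
    return (1.0 - c) * np.eye(n) + c * np.ones((n, n))

# Eigenvalues are 1+(n-1)c (once) and 1-c (n-1 times), so d saturates
# near 1/c^2 as n grows instead of tracking the ensemble size n.
dims = {n: participation_ratio(uniform_corr(n, 0.2)) for n in (10, 100, 1000)}
```

For uncorrelated neurons (c = 0) the covariance is the identity and d equals the ensemble size; any shared correlation caps d at a value inversely related to its strength, in line with the upper bound described in the abstract.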
Affiliation(s)
- Luca Mazzucato: Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY, USA
- Alfredo Fontanini: Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY, USA; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY, USA
- Giancarlo La Camera: Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY, USA; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY, USA
36

Ralston BN, Flagg LQ, Faggin E, Birmingham JT. Incorporating spike-rate adaptation into a rate code in mathematical and biological neurons. J Neurophysiol 2016; 115:2501-18. PMID: 26888106. DOI: 10.1152/jn.00993.2015.
Abstract
For a slowly varying stimulus, the simplest relationship between a neuron's input and output is a rate code, in which the spike rate is a unique function of the stimulus at that instant. In the case of spike-rate adaptation, there is no unique relationship between input and output, because the spike rate at any time depends both on the instantaneous stimulus and on prior spiking (the "history"). To improve the decoding of spike trains produced by neurons that show spike-rate adaptation, we developed a simple scheme that incorporates "history" into a rate code. We utilized this rate-history code successfully to decode spike trains produced by 1) mathematical models of a neuron in which the mechanism for adaptation (I_AHP) is specified, and 2) the gastropyloric receptor (GPR2), a stretch-sensitive neuron in the stomatogastric nervous system of the crab Cancer borealis, that exhibits long-lasting adaptation of unknown origin. Moreover, when we modified the spike rate either mathematically in a model system or by applying neuromodulatory agents to the experimental system, we found that changes in the rate-history code could be related to the biophysical mechanisms responsible for altering the spiking.
Affiliation(s)
- Bridget N Ralston: Department of Physics, Santa Clara University, Santa Clara, California
- Lucas Q Flagg: Department of Physics, Santa Clara University, Santa Clara, California
- Eric Faggin: Department of Physics, Santa Clara University, Santa Clara, California
- John T Birmingham: Department of Physics, Santa Clara University, Santa Clara, California
37

French CR, Zeng Z, Williams DA, Hill-Yardin EL, O'Brien TJ. Properties of an intermediate-duration inactivation process of the voltage-gated sodium conductance in rat hippocampal CA1 neurons. J Neurophysiol 2016; 115:790-802. DOI: 10.1152/jn.01000.2014.
Abstract
Rapid transmembrane flow of sodium ions produces the depolarizing phase of action potentials (APs) in most excitable tissue through voltage-gated sodium channels (NaV). Macroscopic currents display rapid activation followed by fast inactivation (IF) within milliseconds. Slow inactivation (IS) has subsequently been observed in several preparations, including neuronal tissues. IS serves important physiological functions, but its kinetic properties are incompletely characterized, especially the operative timescales. Here we present evidence for an “intermediate inactivation” (II) process in rat hippocampal CA1 neurons with time constants of the order of 100 ms. The half-inactivation potentials (V0.5) of steady-state inactivation curves were hyperpolarized by increasing conditioning pulse duration from 50 to 500 ms and could be described by a sum of Boltzmann relations. II state transitions were observed after channel opening as well as at subthreshold potentials. Entry into II after opening was relatively insensitive to membrane potential, and recovery from II became more rapid at hyperpolarized potentials. Removal of fast inactivation with cytoplasmic papain revealed time constants of INa decay corresponding to II and IS with long depolarizations. Dynamic clamp revealed attenuation of trains of APs on the ~100 ms timescale, suggesting a functional role of II in repetitive firing accommodation. These experimental findings could be reproduced with a five-state Markov model. It is likely that II affects important aspects of hippocampal neuron responses and may provide a drug target for sodium channel modulation.
Affiliation(s)
- Christopher R. French: Department of Neurobiology, Royal Melbourne Hospital, Melbourne, Victoria, Australia; Department of Medicine, University of Melbourne, Melbourne, Victoria, Australia
- Zhen Zeng: Department of Medicine, University of Melbourne, Melbourne, Victoria, Australia
- David A. Williams: Department of Physiology, University of Melbourne, Melbourne, Victoria, Australia
- Elisa L. Hill-Yardin: Department of Physiology, University of Melbourne, Melbourne, Victoria, Australia
- Terence J. O'Brien: Department of Neurobiology, Royal Melbourne Hospital, Melbourne, Victoria, Australia; Department of Medicine, University of Melbourne, Melbourne, Victoria, Australia
38
Abstract
Single-trial analyses of ensemble activity in alert animals demonstrate that cortical circuits dynamics evolve through temporal sequences of metastable states. Metastability has been studied for its potential role in sensory coding, memory, and decision-making. Yet, very little is known about the network mechanisms responsible for its genesis. It is often assumed that the onset of state sequences is triggered by an external stimulus. Here we show that state sequences can be observed also in the absence of overt sensory stimulation. Analysis of multielectrode recordings from the gustatory cortex of alert rats revealed ongoing sequences of states, where single neurons spontaneously attain several firing rates across different states. This single-neuron multistability represents a challenge to existing spiking network models, where typically each neuron is at most bistable. We present a recurrent spiking network model that accounts for both the spontaneous generation of state sequences and the multistability in single-neuron firing rates. Each state results from the activation of neural clusters with potentiated intracluster connections, with the firing rate in each cluster depending on the number of active clusters. Simulations show that the model's ensemble activity hops among the different states, reproducing the ongoing dynamics observed in the data. When probed with external stimuli, the model predicts the quenching of single-neuron multistability into bistability and the reduction of trial-by-trial variability. Both predictions were confirmed in the data. Together, these results provide a theoretical framework that captures both ongoing and evoked network dynamics in a single mechanistic model.
39

Donnarumma F, Prevete R, Chersi F, Pezzulo G. A Programmer–Interpreter Neural Network Architecture for Prefrontal Cognitive Control. Int J Neural Syst 2015; 25:1550017. DOI: 10.1142/s0129065715500173.
Abstract
There is wide consensus that the prefrontal cortex (PFC) is able to exert cognitive control on behavior by biasing processing toward task-relevant information and by modulating response selection. This idea is typically framed in terms of top-down influences within a cortical control hierarchy, where prefrontal-basal ganglia loops gate multiple input–output channels, which in turn can activate or sequence motor primitives expressed in (pre-)motor cortices. Here we advance a new hypothesis, based on the notion of programmability and an interpreter–programmer computational scheme, on how the PFC can flexibly bias the selection of sensorimotor patterns depending on internal goal and task contexts. In this approach, multiple elementary behaviors representing motor primitives are expressed by a single multi-purpose neural network, which is seen as a reusable area of "recycled" neurons (interpreter). The PFC thus acts as a "programmer" that, without modifying the network connectivity, feeds the interpreter networks with specific input parameters encoding the programs (corresponding to network structures) to be interpreted by the (pre-)motor areas. Our architecture is validated in a standard test for executive function: the 1-2-AX task. Our results show that this computational framework provides a robust, scalable and flexible scheme that can be iterated at different hierarchical layers, supporting the realization of multiple goals. We discuss the plausibility of the "programmer–interpreter" scheme to explain the functioning of prefrontal-(pre)motor cortical hierarchies.
Affiliation(s)
- Francesco Donnarumma: Institute of Cognitive Sciences and Technologies, National Research Council of Italy, Via S. Martino della Battaglia 44, 00185 Roma, Italy
- Roberto Prevete: Università degli Studi di Napoli Federico II, Dipartimento di Ingegneria Elettrica e Tecnologie dell'Informazione (DIETI), Via Claudio 21, 80125 Napoli, Italy
- Fabian Chersi: University College London, Institute of Cognitive Neuroscience, 17 Queen Square, London WC1N 3AR, England
- Giovanni Pezzulo: Institute of Cognitive Sciences and Technologies, National Research Council of Italy, Via S. Martino della Battaglia 44, 00185 Rome, Italy
40

Lundstrom BN. Modeling multiple time scale firing rate adaptation in a neural network of local field potentials. J Comput Neurosci 2014; 38:189-202. PMID: 25319064. DOI: 10.1007/s10827-014-0536-2.
Abstract
In response to stimulus changes, the firing rates of many neurons adapt, such that stimulus change is emphasized. Previous work has shown that rate adaptation can span a wide range of time scales and produce time-scale-invariant power-law adaptation. However, neuronal rate adaptation is typically modeled using single-time-scale dynamics, and constructing a conductance-based model with arbitrary adaptation dynamics is nontrivial. Here, a modeling approach is developed in which firing rate adaptation, or spike-frequency adaptation, can be understood as a filtering of slow stimulus statistics. Adaptation dynamics are modeled by a stimulus filter and quantified by measuring the phase leads of the firing rate in response to varying input frequencies. Arbitrary adaptation dynamics are approximated by a set of weighted exponentials with parameters obtained by fitting to a desired filter. With this approach it is straightforward to assess the effect of multiple-time-scale adaptation dynamics on neural networks. To demonstrate this, single-time-scale and power-law adaptation were added to a network model of local field potentials. Rate adaptation enhanced the slow oscillations of the network and flattened the output power spectrum, dampening intrinsic network frequencies. Thus, rate adaptation may play an important role in network dynamics.
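The central modeling step, approximating a desired adaptation filter by a set of weighted exponentials, reduces to a linear least-squares fit once the time constants are chosen. A minimal sketch fitting a power-law adaptation kernel t^(-0.2); the exponent, time range, and time constants here are illustrative assumptions, not the paper's values.

```python
import numpy as np

# Target: power-law adaptation kernel t^(-0.2) over three decades
t = np.logspace(-2, 1, 400)                   # 0.01 to 10, arbitrary units
target = t ** -0.2

# Basis: exponentials with logarithmically spaced time constants
taus = np.logspace(-2.5, 1.5, 12)
basis = np.exp(-t[:, None] / taus[None, :])   # shape (400, 12)

# Weighted-exponential approximation via linear least squares
weights, *_ = np.linalg.lstsq(basis, target, rcond=None)
fit = basis @ weights
max_rel_err = float(np.max(np.abs(fit - target) / target))
```

A handful of exponentials per decade is typically enough to track a power law closely over several decades, which is what makes the weighted-exponential parameterization of multiple-time-scale adaptation practical in network simulations.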
Collapse
|
41
|
Spanne A, Geborek P, Bengtsson F, Jörntell H. Spike generation estimated from stationary spike trains in a variety of neurons in vivo. Front Cell Neurosci 2014; 8:199. [PMID: 25120429 PMCID: PMC4111083 DOI: 10.3389/fncel.2014.00199] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2014] [Accepted: 07/02/2014] [Indexed: 12/03/2022] Open
Abstract
To any model of brain function, the variability of neuronal spike firing is a problem that needs to be taken into account. Whereas the synaptic integration can be described in terms of the original Hodgkin-Huxley (H-H) formulations of conductance-based electrical signaling, the transformation of the resulting membrane potential into patterns of spike output is subjected to stochasticity that may not be captured with standard single neuron H-H models. The dynamics of the spike output is dependent on the normal background synaptic noise present in vivo, but the neuronal spike firing variability in vivo is not well studied. In the present study, we made long-term whole cell patch clamp recordings of stationary spike firing states across a range of membrane potentials from a variety of subcortical neurons in the non-anesthetized, decerebrated state in vivo. Based on the data, we formulated a simple, phenomenological model of the properties of the spike generation in each neuron that accurately captured the stationary spike firing statistics across all membrane potentials. The model consists of a parametric relationship between the mean and standard deviation of the inter-spike intervals, where the parameter is linearly related to the injected current over the membrane. This enabled it to generate accurate approximations of spike firing also under inhomogeneous conditions with input that varies over time. The parameters describing the spike firing statistics for different neuron types overlapped extensively, suggesting that the spike generation had similar properties across neurons.
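A minimal sketch of the kind of phenomenological spike-generation model described here: stationary ISIs are drawn from a distribution whose mean and standard deviation are tied together by a single parameter that is linear in the injected current. The functional forms and constants below are illustrative placeholders, not the fitted relationships from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def isi_stats(current, a=1.0, b=0.5):
    """Map injected current to (mean, sd) of the stationary ISI distribution.

    The single parameter k is linear in the current, echoing the abstract;
    the particular formulas are illustrative placeholders.
    """
    k = a + b * current              # one parameter, linear in current
    mean_isi = 1.0 / k               # stronger drive -> shorter intervals
    sd_isi = mean_isi / np.sqrt(k)   # SD tied to the mean through k
    return mean_isi, sd_isi

def sample_isis(current, n=10000):
    """Draw stationary ISIs from a gamma distribution with those moments."""
    mean_isi, sd_isi = isi_stats(current)
    shape = (mean_isi / sd_isi) ** 2     # gamma shape from mean/sd
    scale = sd_isi ** 2 / mean_isi       # gamma scale from mean/sd
    return rng.gamma(shape, scale, size=n)

isis = sample_isis(current=3.0)          # k = 2.5, so mean ISI = 0.4
```

Because the whole spike statistic is controlled by one current-dependent parameter, such a model can also be driven by a time-varying current to approximate inhomogeneous firing, as the abstract notes.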
Collapse
Affiliation(s)
- Anton Spanne
- Neural Basis of Sensorimotor Control, Department of Experimental Medical Science, Lund University, Lund, Sweden
| | - Pontus Geborek
- Neural Basis of Sensorimotor Control, Department of Experimental Medical Science, Lund University, Lund, Sweden
| | - Fredrik Bengtsson
- Neural Basis of Sensorimotor Control, Department of Experimental Medical Science, Lund University, Lund, Sweden
| | - Henrik Jörntell
- Neural Basis of Sensorimotor Control, Department of Experimental Medical Science, Lund University, Lund, Sweden
| |
Collapse
|
42
|
Gershman SJ. The penumbra of learning: a statistical theory of synaptic tagging and capture. Network (Bristol, England) 2014; 25:97-115. [PMID: 24679103 DOI: 10.3109/0954898x.2013.862749] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Learning in humans and animals is accompanied by a penumbra: Learning one task benefits from learning an unrelated task shortly before or after. At the cellular level, the penumbra of learning appears when weak potentiation of one synapse is amplified by strong potentiation of another synapse on the same neuron during a critical time window. Weak potentiation sets a molecular tag that enables the synapse to capture plasticity-related proteins synthesized in response to strong potentiation at another synapse. This paper describes a computational model which formalizes synaptic tagging and capture in terms of statistical learning mechanisms. According to this model, synaptic strength encodes a probabilistic inference about the dynamically changing association between pre- and post-synaptic firing rates. The rate of change is itself inferred, coupling together different synapses on the same neuron. When the inputs to one synapse change rapidly, the inferred rate of change increases, amplifying learning at other synapses.
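The statistical view sketched in this abstract — synaptic strength as a probabilistic inference about a drifting association — can be illustrated with a scalar Kalman filter tracking a random-walk weight. The noise parameters are illustrative, and the full model's coupling of the inferred rate of change across synapses is omitted:

```python
import numpy as np

def kalman_track(observations, q=0.1, r=1.0):
    """Scalar Kalman filter for a random-walk latent weight.

    q: assumed variance of the weight's drift per step (illustrative)
    r: assumed observation noise variance (illustrative)
    """
    mu, var = 0.0, 1.0                 # prior mean and variance
    estimates = []
    for y in observations:
        var += q                       # predict: uncertainty grows by the drift
        gain = var / (var + r)         # Kalman gain
        mu += gain * (y - mu)          # correct toward the observation
        var *= 1.0 - gain
        estimates.append(mu)
    return np.array(estimates)

rng = np.random.default_rng(1)
true_w = 2.0 + np.cumsum(rng.normal(0.0, 0.1, 200))   # slowly drifting weight
obs = true_w + rng.normal(0.0, 1.0, 200)              # noisy observations
est = kalman_track(obs)
```

In the paper's fuller model, a rapid change at one synapse raises the inferred drift rate shared by neighboring synapses, which amplifies their learning — the statistical analogue of tagging and capture.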
Collapse
Affiliation(s)
- Samuel J Gershman
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
| |
Collapse
|
43
|
Teka W, Marinov TM, Santamaria F. Neuronal spike timing adaptation described with a fractional leaky integrate-and-fire model. PLoS Comput Biol 2014; 10:e1003526. [PMID: 24675903 PMCID: PMC3967934 DOI: 10.1371/journal.pcbi.1003526] [Citation(s) in RCA: 40] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2013] [Accepted: 01/20/2014] [Indexed: 11/22/2022] Open
Abstract
The voltage trace of neuronal activities can follow multiple timescale dynamics that arise from correlated membrane conductances. Such processes can result in power-law behavior in which the membrane voltage cannot be characterized with a single time constant. The emergent effect of these membrane correlations is a non-Markovian process that can be modeled with a fractional derivative. A fractional derivative is a non-local process in which the value of the variable is determined by integrating a temporal weighted voltage trace, also called the memory trace. Here we developed and analyzed a fractional leaky integrate-and-fire model in which the exponent of the fractional derivative can vary from 0 to 1, with 1 representing the normal derivative. As the exponent of the fractional derivative decreases, the weights of the voltage trace increase. Thus, the value of the voltage is increasingly correlated with the trajectory of the voltage in the past. By varying only the fractional exponent, our model can reproduce upward and downward spike adaptations found experimentally in neocortical pyramidal cells and tectal neurons in vitro. The model also produces spikes with longer first-spike latency and high inter-spike variability with power-law distribution. We further analyze spike adaptation and the responses to noisy and oscillatory input. The fractional model generates reliable spike patterns in response to noisy input. Overall, the spiking activity of the fractional leaky integrate-and-fire model deviates from the spiking activity of the Markovian model and reflects the temporal accumulated intrinsic membrane dynamics that affect the response of the neuron to external stimulation.
Spike adaptation is a property of most neurons. When spike time adaptation occurs over multiple time scales, the dynamics can be described by a power-law. We study the computational properties of a leaky integrate-and-fire model with power-law adaptation. Instead of explicitly modeling the adaptation process by the contribution of slowly changing conductances, we use a fractional temporal derivative framework. The exponent of the fractional derivative represents the degree of adaptation of the membrane voltage, where 1 is the normal leaky integrator while values less than 1 produce increasing correlations in the voltage trace. The temporal correlation is interpreted as a memory trace that depends on the value of the fractional derivative. We identify the memory trace in the fractional model as the sum of the instantaneous differentiation weighted by a function that depends on the fractional exponent, and it provides non-local information to the incoming stimulus. The spiking dynamics of the fractional leaky integrate-and-fire model show memory dependence that can result in downward or upward spike adaptation. Our model provides a framework for understanding how long-range membrane voltage correlations affect spiking dynamics and information integration in neurons.
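A minimal sketch of a fractional leaky integrate-and-fire neuron using a Grünwald-Letnikov discretization of the fractional derivative, in the spirit of the model described here. All parameter values are illustrative, and alpha=1 recovers the ordinary (Markovian) leaky integrator:

```python
import numpy as np

def fractional_lif(alpha=0.8, T=200.0, dt=0.1, I=1.5, tau=10.0,
                   v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Fractional LIF via Gruenwald-Letnikov weights (illustrative values)."""
    n = int(round(T / dt))
    # GL weights: w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k).
    # For alpha = 1 these reduce to (1, -1, 0, ...), i.e. forward Euler.
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)

    v = np.zeros(n)
    spikes = []
    for i in range(1, n):
        f = (-(v[i - 1] - v_rest) + I) / tau          # leaky dynamics
        # Memory trace: GL-weighted history of past voltages.
        memory = np.dot(w[1:i + 1], v[i - 1::-1])
        v[i] = dt ** alpha * f - memory
        if v[i] >= v_thresh:
            spikes.append(i * dt)
            v[i] = v_reset
    return v, spikes

v1, spikes1 = fractional_lif(alpha=1.0)    # ordinary LIF limit
v08, spikes08 = fractional_lif(alpha=0.8)  # long voltage memory
```

Lowering alpha makes the memory trace extend further into the past, which is the mechanism the abstract links to upward and downward spike-timing adaptation.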
Collapse
Affiliation(s)
- Wondimu Teka
- UTSA Neurosciences Institute, The University of Texas at San Antonio, San Antonio, Texas, United States of America
| | - Toma M. Marinov
- UTSA Neurosciences Institute, The University of Texas at San Antonio, San Antonio, Texas, United States of America
| | - Fidel Santamaria
- UTSA Neurosciences Institute, The University of Texas at San Antonio, San Antonio, Texas, United States of America
| |
Collapse
|
44
|
Processing of hedonic and chemosensory features of taste in medial prefrontal and insular networks. J Neurosci 2013; 33:18966-78. [PMID: 24285901 DOI: 10.1523/jneurosci.2974-13.2013] [Citation(s) in RCA: 81] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Most of the research on cortical processing of taste has focused on either the primary gustatory cortex (GC) or the orbitofrontal cortex (OFC). However, these are not the only areas involved in taste processing. Gustatory information can also reach another frontal region, the medial prefrontal cortex (mPFC), via direct projections from GC. mPFC has been studied extensively in relation to its role in controlling goal-directed action and reward-guided behaviors, yet very little is known about its involvement in taste coding. The experiments presented here address this important point and test whether neurons in mPFC can significantly process the physiochemical and hedonic dimensions of taste. Spiking responses to intraorally delivered tastants were recorded from rats implanted with bundles of electrodes in mPFC and GC. Analysis of single-neuron and ensemble activity revealed similarities and differences between the two areas. Neurons in mPFC can encode the chemosensory identity of gustatory stimuli. However, responses in mPFC are sparser, more narrowly tuned, and have a later onset than in GC. Although taste quality is more robustly represented in GC, taste palatability is coded equally well in the two areas. Additional analysis of responses in neurons processing the hedonic value of taste revealed differences between the two areas in temporal dynamics and sensitivities to palatability. These results add mPFC to the network of areas involved in processing gustatory stimuli and demonstrate significant differences in taste-coding between GC and mPFC.
Collapse
|
46
|
Ladenbauer J, Augustin M, Obermayer K. How adaptation currents change threshold, gain, and variability of neuronal spiking. J Neurophysiol 2013; 111:939-53. [PMID: 24174646 DOI: 10.1152/jn.00586.2013] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Many types of neurons exhibit spike rate adaptation, mediated by intrinsic slow K(+) currents, which effectively inhibit neuronal responses. How these adaptation currents change the relationship between in vivo-like fluctuating synaptic input, spike rate output, and the spike train statistics, however, is not well understood. In this computational study we show that an adaptation current that primarily depends on the subthreshold membrane voltage changes the neuronal input-output relationship (I-O curve) subtractively, thereby increasing the response threshold, and decreases its slope (response gain) for low spike rates. A spike-dependent adaptation current alters the I-O curve divisively, thus reducing the response gain. Both types of adaptation current naturally increase the mean interspike interval (ISI), but they can affect ISI variability in opposite ways. A subthreshold current always causes an increase of variability while a spike-triggered current decreases high variability caused by fluctuation-dominated inputs and increases low variability when the average input is large. The effects on I-O curves match those caused by synaptic inhibition in networks with asynchronous irregular activity, for which we find subtractive and divisive changes caused by external and recurrent inhibition, respectively. Synaptic inhibition, however, always increases the ISI variability. We analytically derive expressions for the I-O curve and ISI variability, which demonstrate the robustness of our results. Furthermore, we show how the biophysical parameters of slow K(+) conductances contribute to the two different types of adaptation current and find that Ca(2+)-activated K(+) currents are effectively captured by a simple spike-dependent description, while muscarine-sensitive or Na(+)-activated K(+) currents show a dominant subthreshold component.
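The two adaptation types contrasted in this abstract can be illustrated with an adaptive leaky integrate-and-fire neuron: parameter `a` gives a subthreshold (voltage-dependent) current and `b` a spike-triggered increment. All parameter values are illustrative, not the study's:

```python
import numpy as np

def adaptive_lif(I=2.0, a=0.0, b=0.0, T=1000.0, dt=0.1,
                 tau_m=10.0, tau_w=100.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Adaptive LIF with subthreshold (a) and spike-triggered (b) adaptation.

    Returns the spike count over the simulation (illustrative parameters).
    """
    n = int(round(T / dt))
    v, w = v_rest, 0.0
    spikes = 0
    for _ in range(n):
        dv = (-(v - v_rest) + I - w) / tau_m      # membrane dynamics
        dw = (a * (v - v_rest) - w) / tau_w       # subthreshold adaptation
        v += dt * dv
        w += dt * dw
        if v >= v_thresh:
            spikes += 1
            v = v_reset
            w += b                                # spike-triggered jump
    return spikes

rate_plain = adaptive_lif()            # no adaptation
rate_sub = adaptive_lif(a=1.0)         # subthreshold adaptation current
rate_trig = adaptive_lif(b=0.5)        # spike-triggered adaptation current
```

Both variants lengthen the mean ISI relative to the non-adapting neuron, matching the abstract's shared effect; the subtractive vs. divisive distinction concerns how the full I-O curve is reshaped, which would require sweeping the input current.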
Collapse
Affiliation(s)
- Josef Ladenbauer
- Neural Information Processing Group, Technische Universität Berlin, Berlin, Germany
| | | | | |
Collapse
|
47
|
Termsaithong T, Aihara K. Dynamical correlation patterns and corresponding community structure in neural spontaneous activity at criticality. Cogn Neurodyn 2013; 7:381-93. [PMID: 24427213 PMCID: PMC3773324 DOI: 10.1007/s11571-013-9251-3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2012] [Revised: 03/18/2013] [Accepted: 03/28/2013] [Indexed: 11/27/2022] Open
Abstract
It has been considered that the state in the vicinity of a critical point, which is the point between ordered and disordered states, can underlie and facilitate information processing of the brain in various aspects. In this research, we numerically study the influence of criticality on one aspect of brain information processing, i.e., the community structure, which is an important characteristic of complex networks. We examine community structure of the functional connectivity in simulated brain spontaneous activity, which is based on dynamical correlations between neural activity patterns at different positions. The brain spontaneous activity is simulated by a neural field model whose parameter covers subcritical, critical, and supercritical regions. Then, the corresponding dynamical correlation patterns and community structure are compared. In the critical region, we found some distinctive properties, namely high correlation and correlation switching, high modularity and a low number of modules, high stability of the dynamical functional connectivity, and moderate flexibility of the community structure across temporal scales. We also discuss how these characteristics might improve information processing of the brain.
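The modularity measure underlying the community-structure analysis described here can be illustrated on a toy functional-connectivity graph; the adjacency matrix and partition below are made up for the example:

```python
import numpy as np

# Toy graph: two triangles joined by a single bridge edge (illustrative).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
labels = np.array([0, 0, 0, 1, 1, 1])      # candidate two-module partition

def modularity(A, labels):
    """Newman modularity Q = (1/2m) * sum_ij [A_ij - k_i k_j / 2m] * delta(c_i, c_j)."""
    k = A.sum(axis=1)                      # node degrees
    two_m = A.sum()                        # twice the edge count
    same = labels[:, None] == labels[None, :]
    return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

q = modularity(A, labels)                  # high Q: the split matches the triangles
```

Putting all nodes in one module gives Q = 0 by construction, so positive Q indicates more within-module connectivity than expected by chance — the quantity the abstract reports as high in the critical region.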
Collapse
Affiliation(s)
- T. Termsaithong
- Institute of Industrial Science, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505, Japan
| | - K. Aihara
- Institute of Industrial Science, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505, Japan
| |
Collapse
|
48
|
Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci 2013; 16:942-8. [PMID: 23749146 DOI: 10.1038/nn.3431] [Citation(s) in RCA: 115] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2012] [Accepted: 05/08/2013] [Indexed: 11/08/2022]
Abstract
Spike-frequency adaptation (SFA) is widespread in the CNS, but its function remains unclear. In neocortical pyramidal neurons, adaptation manifests itself by an increase in the firing threshold and by adaptation currents triggered after each spike. Combining electrophysiological recordings in mice with modeling, we found that these adaptation processes lasted for more than 20 s and decayed over multiple timescales according to a power law. The power-law decay associated with adaptation mirrored and canceled the temporal correlations of input current received in vivo at the somata of layer 2/3 somatosensory pyramidal neurons. These findings suggest that, in the cortex, SFA causes temporal decorrelation of output spikes (temporal whitening), an energy-efficient coding procedure that, at high signal-to-noise ratio, improves the information transfer.
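The whitening argument in this abstract can be illustrated directly in the frequency domain: if the input current has a 1/f^beta power spectrum and power-law adaptation acts as a fractional differentiator, the output spectrum is flat. The value of beta and the synthesis procedure are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

beta = 1.0
n = 2 ** 14
freqs = np.fft.rfftfreq(n, d=1.0)[1:]        # positive frequencies, skip DC

# Synthesize a 1/f^beta input spectrum: white noise shaped by f^(-beta/2).
white = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
input_spec = white * freqs ** (-beta / 2)

# Power-law adaptation acts like a fractional high-pass filter ~ f^(beta/2),
# exactly canceling the input's 1/f^beta correlations.
output_spec = input_spec * freqs ** (beta / 2)

input_power = np.abs(input_spec) ** 2
output_power = np.abs(output_spec) ** 2

# Log-log spectral slopes: input ~ -beta, whitened output ~ 0.
in_slope = np.polyfit(np.log(freqs), np.log(input_power), 1)[0]
out_slope = np.polyfit(np.log(freqs), np.log(output_power), 1)[0]
```

The cancellation of the input's temporal correlations by a matched power-law filter is the frequency-domain version of the temporal decorrelation (whitening) reported in the paper.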
Collapse
|
49
|
Augustin M, Ladenbauer J, Obermayer K. How adaptation shapes spike rate oscillations in recurrent neuronal networks. Front Comput Neurosci 2013; 7:9. [PMID: 23450654 PMCID: PMC3583173 DOI: 10.3389/fncom.2013.00009] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2012] [Accepted: 02/08/2013] [Indexed: 12/31/2022] Open
Abstract
Neural mass signals from in-vivo recordings often show oscillations with frequencies ranging from <1 to 100 Hz. Fast rhythmic activity in the beta and gamma range can be generated by network-based mechanisms such as recurrent synaptic excitation-inhibition loops. Slower oscillations might instead depend on neuronal adaptation currents whose timescales range from tens of milliseconds to seconds. Here we investigate how the dynamics of such adaptation currents contribute to spike rate oscillations and resonance properties in recurrent networks of excitatory and inhibitory neurons. Based on a network of sparsely coupled spiking model neurons with two types of adaptation current and conductance-based synapses with heterogeneous strengths and delays we use a mean-field approach to analyze oscillatory network activity. For constant external input, we find that spike-triggered adaptation currents provide a mechanism to generate slow oscillations over a wide range of adaptation timescales as long as recurrent synaptic excitation is sufficiently strong. Faster rhythms occur when recurrent inhibition is slower than excitation and oscillation frequency increases with the strength of inhibition. Adaptation facilitates such network-based oscillations for fast synaptic inhibition and leads to decreased frequencies. For oscillatory external input, adaptation currents amplify a narrow band of frequencies and cause phase advances for low frequencies in addition to phase delays at higher frequencies. Our results therefore identify the different key roles of neuronal adaptation dynamics for rhythmogenesis and selective signal propagation in recurrent networks.
Collapse
Affiliation(s)
- Moritz Augustin
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | | | | |
Collapse
|
50
|
Naud R, Gerstner W. Coding and decoding with adapting neurons: a population approach to the peri-stimulus time histogram. PLoS Comput Biol 2012; 8:e1002711. [PMID: 23055914 PMCID: PMC3464223 DOI: 10.1371/journal.pcbi.1002711] [Citation(s) in RCA: 34] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2012] [Accepted: 08/03/2012] [Indexed: 11/18/2022] Open
Abstract
The response of a neuron to a time-dependent stimulus, as measured in a Peri-Stimulus-Time-Histogram (PSTH), exhibits an intricate temporal structure that reflects potential temporal coding principles. Here we analyze the encoding and decoding of PSTHs for spiking neurons with arbitrary refractoriness and adaptation. As a modeling framework, we use the spike response model, also known as the generalized linear neuron model. Because of refractoriness, the effect of the most recent spike on the spiking probability a few milliseconds later is very strong. The influence of the last spike needs therefore to be described with high precision, while the rest of the neuronal spiking history merely introduces an average self-inhibition or adaptation that depends on the expected number of past spikes but not on the exact spike timings. Based on these insights, we derive a ‘quasi-renewal equation’ which is shown to yield an excellent description of the firing rate of adapting neurons. We explore the domain of validity of the quasi-renewal equation and compare it with other rate equations for populations of spiking neurons. The problem of decoding the stimulus from the population response (or PSTH) is addressed analogously. We find that for small levels of activity and weak adaptation, a simple accumulator of the past activity is sufficient to decode the original input, but when refractory effects become large decoding becomes a non-linear function of the past activity. The results presented here can be applied to the mean-field analysis of coupled neuron networks, but also to arbitrary point processes with negative self-interaction.
How can information be encoded and decoded in populations of adapting neurons? A quantitative answer to this question requires a mathematical expression relating neuronal activity to the external stimulus, and, conversely, stimulus to neuronal activity. Although widely used equations and models exist for the special problem of relating external stimulus to the action potentials of a single neuron, the analogous problem of relating the external stimulus to the activity of a population has proven more difficult. There is a bothersome gap between the dynamics of single adapting neurons and the dynamics of populations. Moreover, if we ignore the single neurons and describe directly the population dynamics, we are faced with the ambiguity of the adapting neural code. The neural code of adapting populations is ambiguous because it is possible to observe a range of population activities in response to a given instantaneous input. Somehow the ambiguity is resolved by the knowledge of the population history, but how precisely? In this article we use approximation methods to provide mathematical expressions that describe the encoding and decoding of external stimuli in adapting populations. The theory presented here helps to bridge the gap between the dynamics of single neurons and that of populations.
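A minimal sketch of the encoding setting analyzed here: a population PSTH from spike response model (GLM) neurons whose hazard depends strongly on the most recent spike (refractoriness) and weakly on older spikes (adaptation). All kernels and constants are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(4)

dt = 0.001                                   # 1 ms bins
n_neurons, n_steps = 400, 500
t = np.arange(n_steps) * dt
drive = np.log(20.0) + np.sin(2 * np.pi * 5.0 * t)   # log of base rate (Hz)

# Spike-history kernel: strong brief refractoriness + weak slow adaptation.
k_len = 100
lags = np.arange(1, k_len + 1) * dt
eta = -6.0 * np.exp(-lags / 0.003) - 0.3 * np.exp(-lags / 0.05)

spikes = np.zeros((n_neurons, n_steps))
for i in range(n_steps):
    m = min(i, k_len)
    # History term: recent spikes weighted by the kernel, most recent first.
    hist = spikes[:, i - m:i] @ eta[:m][::-1] if m else np.zeros(n_neurons)
    rate = np.exp(drive[i] + hist)           # conditional intensity (Hz)
    p = 1.0 - np.exp(-rate * dt)             # spike probability in this bin
    spikes[:, i] = rng.random(n_neurons) < p

psth = spikes.mean(axis=0) / dt              # population rate estimate (Hz)
```

The sharp, large first term of the kernel is what makes the last spike's timing matter precisely, while the shallow slow term acts only through the expected number of past spikes — the separation that motivates the quasi-renewal approximation.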
Collapse
Affiliation(s)
| | - Wulfram Gerstner
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
| |
Collapse
|