1. Tang D, Zylberberg J, Jia X, Choi H. Stimulus type shapes the topology of cellular functional networks in mouse visual cortex. Nat Commun 2024; 15:5753. PMID: 38982078; PMCID: PMC11233648; DOI: 10.1038/s41467-024-49704-0. Received 07/03/2023; accepted 06/13/2024.
Abstract
On the timescale of sensory processing, neuronal networks have relatively fixed anatomical connectivity, while functional interactions between neurons can vary depending on the ongoing activity of the neurons within the network. We thus hypothesized that different types of stimuli could lead those networks to display stimulus-dependent functional connectivity patterns. To test this hypothesis, we analyzed single-cell resolution electrophysiological data from the Allen Institute, with simultaneous recordings of stimulus-evoked activity from neurons across 6 different regions of mouse visual cortex. Comparing the functional connectivity patterns during different stimulus types, we made several nontrivial observations: (1) while the frequencies of different functional motifs were preserved across stimuli, the identities of the neurons within those motifs changed; (2) the degree to which functional modules are contained within a single brain region increases with stimulus complexity. Altogether, our work reveals unexpected stimulus-dependence to the way groups of neurons interact to process incoming sensory information.
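To make the motif analysis concrete, here is a minimal, hypothetical sketch of one common pipeline: estimate functional links from pairwise correlations of spike counts, then count closed-triangle motifs in the resulting graph. The surrogate data, the correlation threshold, and the triangle census are illustrative assumptions, not the paper's actual connectivity estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate data: spike counts for 20 "neurons" over 500 time bins.
counts = rng.poisson(3.0, size=(20, 500)).astype(float)
# A shared drive makes the first 5 neurons functionally coupled.
counts[:5] += rng.poisson(2.0, size=500)

# Functional links: pairwise Pearson correlation above a threshold.
corr = np.corrcoef(counts)
adj = (corr > 0.2) & ~np.eye(20, dtype=bool)

# Motif count: closed triangles in the (symmetric) functional graph.
# trace(A^3) counts each triangle 6 times (3 vertices x 2 orientations).
a = adj.astype(int)
n_triangles = int(np.trace(a @ a @ a)) // 6
```

The shared drive creates a 5-neuron clique, so at least C(5,3) = 10 triangles appear; with a different stimulus-dependent drive, the same motif census would pick out different neuron identities, which is the effect the abstract describes.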
Affiliation(s)
- Disheng Tang
- School of Life Sciences, Tsinghua University, Beijing, 100084, PR China.
- Quantitative Biosciences Program, Georgia Institute of Technology, Atlanta, GA 30332, USA.
- IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, 100084, PR China.
- Joel Zylberberg
- Department of Physics and Astronomy, and Centre for Vision Research, York University, Toronto, ON M3J 1P3, Canada.
- Learning in Machines and Brains Program, CIFAR, Toronto, ON M5G 1M1, Canada.
- Xiaoxuan Jia
- School of Life Sciences, Tsinghua University, Beijing, 100084, PR China.
- IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, 100084, PR China.
- Tsinghua-Peking Center for Life Sciences, Tsinghua University, Beijing, 100084, PR China.
- Hannah Choi
- Quantitative Biosciences Program, Georgia Institute of Technology, Atlanta, GA 30332, USA.
- School of Mathematics, Georgia Institute of Technology, Atlanta, GA 30332, USA.
2. Kromer JA, Tass PA. Simulated dataset on coordinated reset stimulation of homogeneous and inhomogeneous networks of excitatory leaky integrate-and-fire neurons with spike-timing-dependent plasticity. Data Brief 2024; 54:110345. PMID: 38586130; PMCID: PMC10998034; DOI: 10.1016/j.dib.2024.110345. Received 02/08/2024; accepted 03/12/2024.
Abstract
We present simulated data on coordinated reset stimulation (CRS) of plastic neuronal networks. The neuronal network consists of excitatory leaky integrate-and-fire neurons and plasticity is implemented as spike-timing-dependent plasticity (STDP). A synchronized state with strong synaptic connectivity and a desynchronized state with weak synaptic connectivity coexist. CRS may drive the network from the synchronized state into a desynchronized state inducing long-lasting desynchronization effects that persist after cessation of stimulation. This is used to model brain stimulation-induced transitions between a pathological state, with abnormally strong neuronal synchrony, and a physiological state, e.g., in Parkinson's disease. During CRS, a sequence of stimuli is delivered to multiple stimulation sites - called CR sequence. We present simulated data for the analysis of long-lasting desynchronization effects of CRS with shuffled CR sequences versus non-shuffled CR sequences in which the order of stimulus deliveries to the sites remains unchanged throughout the entire stimulation period. Such data are presented for networks with homogeneous synaptic connectivity and networks with inhomogeneous synaptic connectivity. Homogeneous synaptic connectivity refers to a network in which the probability of a synaptic connection does not depend on the pre- and postsynaptic neurons' locations. In contrast, inhomogeneous synaptic connectivity refers to a network in which the probability of a synaptic connection depends on the neurons' locations. The presented neuronal network model was used to analyse the impact of the CR sequences and their shuffling on the long-lasting effects of CRS [1].
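The distinction between shuffled and non-shuffled CR sequences can be sketched with a small generator (illustrative site count, cycle count, and seed, not the dataset's actual parameters):

```python
import random

def cr_sequences(n_sites, n_cycles, shuffle, seed=1):
    """Per-cycle orders of stimulus delivery to the stimulation sites.

    shuffle=False reuses the same order every cycle (non-shuffled CR);
    shuffle=True draws a fresh random order each cycle (shuffled CR).
    """
    rng = random.Random(seed)
    base = list(range(n_sites))
    sequences = []
    for _ in range(n_cycles):
        order = base[:]
        if shuffle:
            rng.shuffle(order)
        sequences.append(order)
    return sequences

fixed = cr_sequences(4, 5, shuffle=False)    # same order in every cycle
shuffled = cr_sequences(4, 5, shuffle=True)  # order re-drawn every cycle
```

Each cycle still delivers exactly one stimulus to every site; only the within-cycle ordering differs between the two conditions.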
Affiliation(s)
- Justus A Kromer
- Department of Neurosurgery, Stanford University, Stanford, CA, United States of America
- Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, CA, United States of America
3. Duchet B, Bick C, Byrne Á. Mean-Field Approximations With Adaptive Coupling for Networks With Spike-Timing-Dependent Plasticity. Neural Comput 2023; 35:1481-1528. PMID: 37437202; PMCID: PMC10422128; DOI: 10.1162/neco_a_01601. Received 07/17/2022; accepted 04/26/2023.
Abstract
Understanding the effect of spike-timing-dependent plasticity (STDP) is key to elucidating how neural networks change over long timescales and to designing interventions aimed at modulating such networks in neurological disorders. However, progress is restricted by the significant computational cost of simulating neural network models with STDP and by the lack of low-dimensional descriptions that could provide analytical insight. Phase-difference-dependent plasticity (PDDP) rules, which prescribe synaptic changes based on the phase differences of neuron pairs rather than differences in spike timing, approximate STDP in phase oscillator networks. Here we construct mean-field approximations for phase oscillator networks with STDP to describe part of the phase space for this very high-dimensional system. We first show that single-harmonic PDDP rules can approximate a simple form of symmetric STDP, while multiharmonic rules are required to accurately approximate causal STDP. We then derive exact expressions for the evolution of the average PDDP coupling weight in terms of network synchrony. For adaptive networks of Kuramoto oscillators that form clusters, we formulate a family of low-dimensional descriptions based on the mean-field dynamics of each cluster and the average coupling weights between and within clusters. Finally, we show that such a two-cluster mean-field model can be fitted to synthetic data to provide a low-dimensional approximation of a full adaptive network with symmetric STDP. Our framework represents a step toward a low-dimensional description of adaptive networks with STDP, and could, for example, inform the development of new therapies aimed at maximizing the long-lasting effects of brain stimulation.
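A single-harmonic PDDP rule of the kind discussed here can be sketched in a few lines; the relaxation form dk_ij/dt = eps * (cos(theta_i - theta_j) - k_ij) and all parameter values below are illustrative assumptions, not the paper's exact rule:

```python
import numpy as np

rng = np.random.default_rng(2)
n, dt, steps, eps = 10, 0.01, 3000, 0.5
omega = rng.normal(0.0, 0.1, n)             # natural frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, n)    # oscillator phases
k = np.zeros((n, n))                        # adaptive coupling weights

for _ in range(steps):
    diff = theta[None, :] - theta[:, None]  # diff[i, j] = theta_j - theta_i
    # Kuramoto phase dynamics with the current adaptive weights.
    theta = theta + dt * (omega + (k * np.sin(diff)).mean(axis=1))
    # Single-harmonic PDDP: weights relax toward cos(phase difference),
    # strengthening links between near-in-phase oscillator pairs.
    k = k + dt * eps * (np.cos(diff) - k)

# Kuramoto order parameter: 1 = full synchrony, 0 = incoherence.
R = abs(np.exp(1j * theta).mean())
```

Because the cosine target is symmetric in the pair, the weight matrix stays symmetric and bounded in [-1, 1]; the order parameter R summarizes the network synchrony that the mean-field description tracks.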
Affiliation(s)
- Benoit Duchet
- Nuffield Department of Clinical Neuroscience, University of Oxford, Oxford OX3 9DU, U.K.
- MRC Brain Network Dynamics Unit, University of Oxford, Oxford OX1 3TH, U.K.
- Christian Bick
- Department of Mathematics, Vrije Universiteit Amsterdam, Amsterdam 1081 HV, the Netherlands
- Amsterdam Neuroscience-Systems and Network Neuroscience, Amsterdam 1081 HV, the Netherlands
- Mathematical Institute, University of Oxford, Oxford OX2 6GG, U.K.
- Áine Byrne
- School of Mathematics and Statistics, University College Dublin, Dublin D04 V1W8, Ireland
4. Tang D, Zylberberg J, Jia X, Choi H. Stimulus-dependent functional network topology in mouse visual cortex. bioRxiv [Preprint] 2023:2023.07.03.547364. PMID: 37461471; PMCID: PMC10349950; DOI: 10.1101/2023.07.03.547364.
Abstract
Information is processed by networks of neurons in the brain. On the timescale of sensory processing, those neuronal networks have relatively fixed anatomical connectivity, while functional connectivity, which defines the interactions between neurons, can vary depending on the ongoing activity of the neurons within the network. We thus hypothesized that different types of stimuli, which drive different neuronal activities in the network, could lead those networks to display stimulus-dependent functional connectivity patterns. To test this hypothesis, we analyzed electrophysiological data from the Allen Brain Observatory, which utilized Neuropixels probes to simultaneously record stimulus-evoked activity from hundreds of neurons across 6 different regions of mouse visual cortex. The recordings had single-cell resolution and high temporal fidelity, enabling us to determine fine-scale functional connectivity. Comparing the functional connectivity patterns observed when different stimuli were presented to the mice, we made several nontrivial observations. First, while the frequencies of different connectivity motifs (i.e., the patterns of connectivity between triplets of neurons) were preserved across stimuli, the identities of the neurons within those motifs changed. This means that functional connectivity dynamically changes along with the input stimulus, but does so in a way that preserves the motif frequencies. Secondly, we found that the degree to which functional modules are contained within a single brain region (as opposed to being distributed between regions) increases with increasing stimulus complexity. This suggests a mechanism for how the brain could dynamically alter its computations based on its inputs. Altogether, our work reveals unexpected stimulus-dependence to the way groups of neurons interact to process incoming sensory information.
Affiliation(s)
- Disheng Tang
- School of Life Sciences, Tsinghua University
- Quantitative Biosciences Program, Georgia Institute of Technology
- IDG/McGovern Institute for Brain Research, Tsinghua University
- Joel Zylberberg
- Department of Physics and Astronomy, and Centre for Vision Research, York University
- Learning in Machines and Brains Program, CIFAR
- Xiaoxuan Jia
- School of Life Sciences, Tsinghua University
- IDG/McGovern Institute for Brain Research, Tsinghua University
- Tsinghua–Peking Center for Life Sciences
- Allen Institute for Brain Science
- Hannah Choi
- Quantitative Biosciences Program, Georgia Institute of Technology
- School of Mathematics, Georgia Institute of Technology
- Joel Zylberberg, Xiaoxuan Jia, and Hannah Choi jointly supervised this work.
5. Kromer JA, Tass PA. Synaptic reshaping of plastic neuronal networks by periodic multichannel stimulation with single-pulse and burst stimuli. PLoS Comput Biol 2022; 18:e1010568. PMID: 36327232; PMCID: PMC9632832; DOI: 10.1371/journal.pcbi.1010568. Received 07/04/2022; accepted 09/14/2022.
Abstract
Synaptic dysfunction is associated with several brain disorders, including Alzheimer's disease, Parkinson's disease (PD) and obsessive compulsive disorder (OCD). Utilizing synaptic plasticity, brain stimulation is capable of reshaping synaptic connectivity. This may pave the way for novel therapies that specifically counteract pathological synaptic connectivity. For instance, in PD, novel multichannel coordinated reset stimulation (CRS) was designed to counteract neuronal synchrony and down-regulate pathological synaptic connectivity. CRS was shown to entail long-lasting therapeutic aftereffects in PD patients and related animal models. This is in marked contrast to conventional deep brain stimulation (DBS) therapy, where PD symptoms return shortly after stimulation ceases. In the present paper, we study synaptic reshaping by periodic multichannel stimulation (PMCS) in networks of leaky integrate-and-fire (LIF) neurons with spike-timing-dependent plasticity (STDP). During PMCS, phase-shifted periodic stimulus trains are delivered to segregated neuronal subpopulations. Harnessing STDP, PMCS leads to changes of the synaptic network structure. We found that the PMCS-induced changes of the network structure depend on both the phase lags between stimuli and the shape of individual stimuli. Single-pulse stimuli and burst stimuli with low intraburst frequency down-regulate synapses between neurons receiving stimuli simultaneously. In contrast, burst stimuli with high intraburst frequency up-regulate these synapses. We derive theoretical approximations of the stimulation-induced network structure. This enables us to formulate stimulation strategies for inducing a variety of network structures. Our results provide testable hypotheses for future pre-clinical and clinical studies and suggest that periodic multichannel stimulation may be suitable for reshaping plastic neuronal networks to counteract pathological synaptic connectivity. Furthermore, we provide novel insight into how the stimulus type may affect the long-lasting outcome of conventional DBS. This may strongly impact parameter-adjustment procedures for clinical DBS, which have so far focused primarily on acute effects of stimulation.
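The pair-based STDP rule that this kind of reshaping harnesses can be sketched as follows (illustrative amplitudes and time constants; the paper's network model has many more ingredients):

```python
import numpy as np

def stdp_dw(pre_spikes, post_spikes, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Total pair-based STDP change for one synapse (spike times in ms).

    Pre-before-post pairs potentiate and post-before-pre pairs depress,
    each with exponentially decaying efficacy in the spike-time lag.
    """
    dw = 0.0
    for t_post in post_spikes:
        for t_pre in pre_spikes:
            lag = t_post - t_pre
            if lag > 0:
                dw += a_plus * np.exp(-lag / tau_plus)    # potentiation
            elif lag < 0:
                dw -= a_minus * np.exp(lag / tau_minus)   # depression
    return dw

# Causal pairing (pre leads post by 5 ms) strengthens the synapse,
# while the reversed ordering weakens it.
dw_causal = stdp_dw([0.0, 50.0, 100.0], [5.0, 55.0, 105.0])
dw_acausal = stdp_dw([5.0, 55.0, 105.0], [0.0, 50.0, 100.0])
```

Stimulation shapes the relative spike timing of neuron pairs, and through a rule of this form the timing pattern is converted into up- or down-regulation of the corresponding synapses.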
Affiliation(s)
- Justus A Kromer
- Department of Neurosurgery, Stanford University, Stanford, California, United States of America
- Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, California, United States of America
6. Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. PMID: 36068723; DOI: 10.1113/jp282750. Received 05/20/2022; accepted 08/22/2022.
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Compatible with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has vastly contributed to the understanding of various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain.
Abstract figure legend: Assembly formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
7. Chauhan K, Khaledi-Nasab A, Neiman AB, Tass PA. Dynamics of phase oscillator networks with synaptic weight and structural plasticity. Sci Rep 2022; 12:15003. PMID: 36056151; PMCID: PMC9440105; DOI: 10.1038/s41598-022-19417-9. Received 06/02/2022; accepted 08/29/2022.
Abstract
We study the dynamics of Kuramoto oscillator networks with two distinct adaptation processes, one varying the coupling strengths and the other altering the network structure. Such systems model certain networks of oscillatory neurons where the neuronal dynamics, synaptic weights, and network structure interact with and shape each other. We model synaptic weight adaptation with spike-timing-dependent plasticity (STDP) that runs on a longer time scale than neuronal spiking. Structural changes that include addition and elimination of contacts occur at yet a longer time scale than the weight adaptations. First, we study the steady-state dynamics of Kuramoto networks that are bistable and can settle in synchronized or desynchronized states. To compare the impact of adding structural plasticity, we contrast the network with only STDP to one with a combination of STDP and structural plasticity. We show that the inclusion of structural plasticity optimizes the synchronized state of a network by allowing for synchronization with fewer links than a network with STDP alone. With non-identical units in the network, the addition of structural plasticity leads to the emergence of correlations between the oscillators' natural frequencies and node degrees. In the desynchronized regime, the structural plasticity decreases the number of contacts, leading to a sparse network. In this way, adding structural plasticity strengthens both synchronized and desynchronized states of a network. Second, we use desynchronizing coordinated reset stimulation and synchronizing periodic stimulation to induce desynchronized and synchronized states, respectively. Our findings indicate that a network with a combination of STDP and structural plasticity may require stronger and longer stimulation to switch between the states than a network with STDP only.
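The slow structural-plasticity step layered on top of weight adaptation can be sketched with an illustrative prune-below-threshold / add-at-random rule (the threshold w_min, addition probability p_add, and network statistics are assumptions, not the paper's parameters):

```python
import numpy as np

def structural_step(w, adj, rng, w_min=0.05, p_add=0.02):
    """One structural-plasticity update (illustrative rule).

    Existing contacts whose weight decayed below w_min are eliminated;
    absent contacts are added with a small probability p_add and a weak
    initial weight. This runs on a slower timescale than the weight
    (STDP-like) adaptation itself.
    """
    n = w.shape[0]
    off_diagonal = ~np.eye(n, dtype=bool)
    prune = adj & (w < w_min)                             # eliminate weak links
    add = (~adj) & off_diagonal & (rng.random((n, n)) < p_add)
    new_adj = (adj & ~prune) | add
    new_w = np.where(add, w_min, w)                       # new contacts start weak
    new_w = np.where(new_adj, new_w, 0.0)                 # no contact, no weight
    return new_w, new_adj

rng = np.random.default_rng(3)
n = 30
adj = rng.random((n, n)) < 0.3        # random initial contact structure
np.fill_diagonal(adj, False)
w = np.where(adj, rng.random((n, n)), 0.0)
w2, adj2 = structural_step(w, adj, rng)
```

In a desynchronized regime, weight adaptation drags many weights below the threshold, so repeated application of a step like this sparsifies the network, as the abstract describes.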
Affiliation(s)
- Kanishk Chauhan
- Department of Physics and Astronomy, Ohio University, Athens, OH, 45701, USA.
- Neuroscience Program, Ohio University, Athens, OH, 45701, USA.
- Ali Khaledi-Nasab
- Department of Neurosurgery, Stanford University, Stanford, CA, 94305, USA
- Alexander B Neiman
- Department of Physics and Astronomy, Ohio University, Athens, OH, 45701, USA
- Neuroscience Program, Ohio University, Athens, OH, 45701, USA
- Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, CA, 94305, USA
8. Wei J, Li L, Song H, Du Z, Yang J, Zhang M, Liu X. Response of a neuronal network computational model to infrared neural stimulation. Front Comput Neurosci 2022; 16:933818. PMID: 36045903; PMCID: PMC9423709; DOI: 10.3389/fncom.2022.933818. Received 05/01/2022; accepted 07/25/2022.
Abstract
Infrared neural stimulation (INS), as a novel form of neuromodulation, allows modulating the activity of nerve cells through thermally induced capacitive currents and thermally sensitive ion channels. However, fundamental questions remain about the exact mechanism of INS and how the photothermal effect influences the neural response. Computational neural modeling can provide a powerful methodology for understanding the mode of action of INS. We developed a temperature-dependent model of ion channels and membrane capacitance based on the photothermal effect to quantify the effect of INS on the direct response of individual neurons and neuronal networks. The neurons were connected through excitatory and inhibitory synapses and constituted a complex neuronal network model. Our results showed that a slight increase in temperature promoted neuronal spiking and enhanced network activity, whereas excessive temperatures inhibited neuronal activity. This biophysically based simulation illustrated the optical dose-dependent biphasic cell response, with the capacitive current as the core mediating variable. The computational model provides new insight to elucidate mechanisms and inform parameter selection for INS.
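Temperature dependence in models of this kind is often parameterized by a Q10 factor on channel gating rates plus a capacitive current from the time-varying membrane capacitance. The sketch below assumes Q10 = 3 and a generic d(Cm·V)/dt term; both are illustrative conventions, not the paper's fitted model:

```python
import numpy as np

def q10_factor(temp_c, base_c=37.0, q10=3.0):
    """Q10 temperature scaling of ion-channel gating rates (assumed Q10 = 3).

    Rates are multiplied by q10 for every 10 degree C rise above base_c."""
    return q10 ** ((temp_c - base_c) / 10.0)

def capacitive_current(cm_t, v, dt):
    """Thermally induced capacitive current I = d(Cm * V)/dt for a
    time-varying membrane capacitance cm_t (array) at fixed voltage v."""
    return np.gradient(cm_t * v, dt)

# A mild 2 degree heating pulse speeds gating only modestly (~25%),
# consistent with a small temperature rise promoting spiking.
faster = q10_factor(39.0)
```

A laser-driven capacitance ramp at fixed voltage yields a depolarizing (or hyperpolarizing, depending on sign conventions) current proportional to the heating rate, which is the biphasic mechanism the abstract points to.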
Affiliation(s)
- Jinzhao Wei
- Key Laboratory of Digital Medical Engineering of Hebei, Hebei University, Baoding, China
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Licong Li
- Key Laboratory of Digital Medical Engineering of Hebei, Hebei University, Baoding, China
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Hao Song
- Key Laboratory of Digital Medical Engineering of Hebei, Hebei University, Baoding, China
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Zhaoning Du
- Key Laboratory of Digital Medical Engineering of Hebei, Hebei University, Baoding, China
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Jianli Yang
- Key Laboratory of Digital Medical Engineering of Hebei, Hebei University, Baoding, China
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Mingsha Zhang
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- IDG/McGovern Institute for Brain Research at BNU, Beijing Normal University, Beijing, China
- Division of Psychology, Beijing Normal University, Beijing, China
- Xiuling Liu
- Key Laboratory of Digital Medical Engineering of Hebei, Hebei University, Baoding, China
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Correspondence: Xiuling Liu
9. Wang B, Aljadeff J. Multiplicative Shot-Noise: A New Route to Stability of Plastic Networks. Phys Rev Lett 2022; 129:068101. PMID: 36018633; DOI: 10.1103/physrevlett.129.068101. Received 01/21/2022; accepted 06/30/2022.
Abstract
Fluctuations of synaptic weights, among many other physical, biological, and ecological quantities, are driven by coincident events of two "parent" processes. We propose a multiplicative shot-noise model that can capture the behaviors of a broad range of such natural phenomena, and analytically derive an approximation that accurately predicts its statistics. We apply our results to study the effects of a multiplicative synaptic plasticity rule that was recently extracted from measurements in physiological conditions. Using mean-field theory analysis and network simulations, we investigate how this rule shapes the connectivity and dynamics of recurrent spiking neural networks. The multiplicative plasticity rule is shown to support efficient learning of input stimuli, and it gives a stable, unimodal synaptic-weight distribution with a large fraction of strong synapses. The strong synapses remain stable over long times but do not "run away." Our results suggest that the multiplicative shot-noise offers a new route to understand the tradeoff between flexibility and stability in neural circuits and other dynamic networks.
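The core of a multiplicative shot-noise process can be sketched as coincidences of two discretized Poisson "parent" trains driving multiplicative jumps of a weight (the rates, jump size, and decay below are illustrative assumptions, not the paper's extracted rule):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two "parent" Poisson processes (e.g., pre- and postsynaptic spike
# trains) in small bins; a coincidence in a bin triggers a multiplicative
# jump of the weight, w -> w * (1 + a); otherwise the weight slowly decays.
dt, T = 0.001, 50.0                 # bin size (s), duration (s)
rate_pre, rate_post = 20.0, 20.0    # parent rates (Hz)
a, decay = 0.05, 0.01               # jump size, slow multiplicative decay

n_bins = int(T / dt)
pre = rng.random(n_bins) < rate_pre * dt
post = rng.random(n_bins) < rate_post * dt
coinc = pre & post                  # coincident "shots"

w = np.empty(n_bins)
w_cur = 1.0
for i in range(n_bins):
    w_cur *= (1.0 + a) if coinc[i] else (1.0 - decay * dt)
    w[i] = w_cur
```

Because every update is multiplicative, the weight stays strictly positive by construction; this built-in positivity, without a hard bound, is the kind of stability property the abstract is about.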
Affiliation(s)
- Bin Wang
- Department of Physics, University of California San Diego, La Jolla, California 92093, USA
- Johnatan Aljadeff
- Department of Neurobiology, University of California San Diego, La Jolla, California 92093, USA
10. Brinkman BAW, Yan H, Maffei A, Park IM, Fontanini A, Wang J, La Camera G. Metastable dynamics of neural circuits and networks. Appl Phys Rev 2022; 9:011313. PMID: 35284030; PMCID: PMC8900181; DOI: 10.1063/5.0062603. Received 07/06/2021; accepted 01/31/2022.
Abstract
Cortical neurons emit seemingly erratic trains of action potentials or "spikes," and neural network dynamics emerge from the coordinated spiking activity within neural circuits. These rich dynamics manifest themselves in a variety of patterns, which emerge spontaneously or in response to incoming activity produced by sensory inputs. In this Review, we focus on neural dynamics that are best understood as a sequence of repeated activations of a number of discrete hidden states. These transiently occupied states are termed "metastable" and have been linked to important sensory and cognitive functions. In the rodent gustatory cortex, for instance, metastable dynamics have been associated with stimulus coding, with states of expectation, and with decision making. In frontal, parietal, and motor areas of macaques, metastable activity has been related to behavioral performance, choice behavior, task difficulty, and attention. In this article, we review the experimental evidence for neural metastable dynamics together with theoretical approaches to the study of metastable activity in neural circuits. These approaches include (i) a theoretical framework based on non-equilibrium statistical physics for network dynamics; (ii) statistical approaches to extract information about metastable states from a variety of neural signals; and (iii) recent neural network approaches, informed by experimental results, to model the emergence of metastable dynamics. By discussing these topics, we aim to provide a cohesive view of how transitions between different states of activity may provide the neural underpinnings for essential functions such as perception, memory, expectation, or decision making, and more generally, how the study of metastable neural activity may advance our understanding of neural circuit function in health and disease.
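The picture of metastable dynamics as rare transitions among discrete hidden states can be sketched with a Markov chain whose self-transition probability is close to one (the transition matrix and state-dependent firing rates are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)

# Three hidden states; high self-transition probability makes each state
# persist for many steps (mean dwell time ~ 1 / (1 - 0.99) = 100 steps).
P = np.array([[0.99, 0.005, 0.005],
              [0.005, 0.99, 0.005],
              [0.005, 0.005, 0.99]])
rates = np.array([5.0, 20.0, 40.0])   # Hz: one population rate per state

n_steps, dt = 5000, 0.01
states = np.empty(n_steps, dtype=int)
s = 0
for t in range(n_steps):
    states[t] = s
    s = rng.choice(3, p=P[s])

# Observable signal: binned Poisson spike counts at the state's rate.
spikes = rng.poisson(rates[states] * dt)

# Indices where the hidden state switches (rare, hence "metastable").
changes = np.flatnonzero(np.diff(states))
```

Statistical methods of the kind reviewed here (e.g., hidden Markov models) work in the opposite direction: given only `spikes`, they infer the underlying state sequence and its transition times.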
Affiliation(s)
- H. Yan
- State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun, Jilin 130022, People's Republic of China
- J. Wang
- Author to whom correspondence should be addressed
- G. La Camera
- Author to whom correspondence should be addressed
11. Darbin O, Indriani DW, Ardalan A, Eghbalnia HR, Assadi A, Nambu A, Montgomery E. Spectrum dependency to Rate and Spike Timing in neuronal spike trains. J Neurosci Methods 2022; 372:109532. PMID: 35182602; DOI: 10.1016/j.jneumeth.2022.109532. Received 10/15/2021; revised 02/11/2022; accepted 02/14/2022.
Abstract
BACKGROUND: A spike train is a series of interspike intervals in a specific order; it can be characterized by the probability distribution of the intervals and by their order in time, which correspond to the concepts of rate and spike-timing features, respectively. Periodic structure in the spike train is reflected in oscillatory activities, so there is a direct link between oscillatory activities and the spike train. The proposed methods investigate the dependency of emerging oscillatory activities on the rate and spike-timing features.
METHOD: First, circular statistics methods were compared to the Fast Fourier Transform (FFT) for the best estimation of spectra. Second, two statistical tests were introduced to help decide whether a spectrum, or an individual frequency, depends on rate and spike timing. Third, the methodology was applied to in-vivo recordings of basal ganglia neurons in mouse, primate, and human. Finally, this novel framework is shown to allow the investigation of subsets of spikes contributing to individual oscillators.
RESULTS: Use of circular statistical methods, in comparison to the FFT, minimizes spectral leakage. Using virtual spike trains, the Rate versus Timing Dependency Spectrum Test (RTDs-Test) permits distinguishing spectral spike trains solely dependent on the rate feature from those that also depend on the spike-timing feature. Similarly, the Rate versus Timing Dependency Frequency Test (RTDf-Test) identifies individual oscillators with partial dependency on spike timing. Dependency on spike timing was found in all in-vivo recordings, but only at a few frequencies. The mapping of these dependencies in frequency and time revealed a dynamical process that may organize basal ganglia function.
CONCLUSIONS: The methodology may improve our understanding of the emergence of oscillatory activities and, possibly, the relation between oscillatory activities and circuitry functions.
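The circular-statistics alternative to the FFT can be sketched by computing the mean resultant length (vector strength) of spike phases over a grid of test frequencies; the 8 Hz surrogate train, the jitter, and the frequency grid are illustrative assumptions, not the paper's tests:

```python
import numpy as np

rng = np.random.default_rng(6)

def vector_strength(spike_times, freq):
    """Circular mean resultant length of spike phases at a test frequency.

    Each spike time is wrapped to a phase of the test oscillation; a
    resultant near 1 means the spikes lock to that frequency, near 0
    means no locking. Used here in place of an FFT-based periodogram."""
    phases = 2.0 * np.pi * freq * np.asarray(spike_times)
    return abs(np.mean(np.exp(1j * phases)))

# Surrogate spike train locked to 8 Hz: one spike per cycle, small jitter.
cycles = np.arange(0, 400)
spikes = cycles / 8.0 + rng.normal(0.0, 0.004, cycles.size)

freqs = np.arange(2.0, 20.5, 0.5)
spectrum = np.array([vector_strength(spikes, f) for f in freqs])
peak_freq = freqs[np.argmax(spectrum)]
```

Because the statistic looks only at spike phases, leaving out a chosen subset of spikes and recomputing it directly shows which spikes contribute to a given oscillator, which is the subset analysis the abstract mentions.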
Affiliation(s)
- Olivier Darbin
- Department of Neurology, University South Alabama, 307 University Blvd, Mobile, AL 36688, USA; Division of System Neurophysiology, National Institute for Physiological Sciences, 38 Nishigonaka, Myodaiji, Okazaki, 444-8585, Japan.
- Dwi Wahyu Indriani
- Division of System Neurophysiology, National Institute for Physiological Sciences, 38 Nishigonaka, Myodaiji, Okazaki, 444-8585, Japan; Department of Physiological Sciences, SOKENDAI, 38 Nishigonaka, Myodaiji, Okazaki, 444-8585, Japan
- Adel Ardalan
- Zuckerman Mind Brain Behavior Institute, Columbia University, 3227 Broadway, New York, NY, 10027, USA
- Hamid R Eghbalnia
- Department of Molecular Biology and Biophysics, UConn Health, 263 Farmington Avenue, Farmington, CT 06030, USA
- Amir Assadi
- Department of Mathematics, University of Wisconsin Madison, 480 Lincoln Drive, 213 Van Vleck Hall, Madison, WI 53706, USA
- Atsushi Nambu
- Division of System Neurophysiology, National Institute for Physiological Sciences, 38 Nishigonaka, Myodaiji, Okazaki, 444-8585, Japan; Department of Physiological Sciences, SOKENDAI, 38 Nishigonaka, Myodaiji, Okazaki, 444-8585, Japan
- Erwin Montgomery
- Department of Medicine (Neurology), Health Sciences, McMaster University, Hamilton, ON L8L 2X2, Canada
12. Khaledi-Nasab A, Kromer JA, Tass PA. Long-Lasting Desynchronization of Plastic Neuronal Networks by Double-Random Coordinated Reset Stimulation. Front Netw Physiol 2022; 2:864859. PMID: 36926109; PMCID: PMC10013062; DOI: 10.3389/fnetp.2022.864859. Received 01/28/2022; accepted 03/18/2022.
Abstract
Hypersynchrony of neuronal activity is associated with several neurological disorders, including essential tremor and Parkinson's disease (PD). Chronic high-frequency deep brain stimulation (HF DBS) is the standard of care for medically refractory PD. Symptoms may effectively be suppressed by HF DBS, but return shortly after cessation of stimulation. Coordinated reset (CR) stimulation is a theory-based stimulation technique that was designed to specifically counteract neuronal synchrony by desynchronization. During CR, phase-shifted stimuli are delivered to multiple neuronal subpopulations. Computational studies on CR stimulation of plastic neuronal networks revealed long-lasting desynchronization effects obtained by down-regulating abnormal synaptic connectivity. This way, networks are moved into attractors of stable desynchronized states such that stimulation-induced desynchronization persists after cessation of stimulation. Preclinical and clinical studies confirmed corresponding long-lasting therapeutic and desynchronizing effects in PD. As PD symptoms are associated with different pathological synchronous rhythms, stimulation-induced long-lasting desynchronization effects should favorably be robust to variations of the stimulation frequency. Recent computational studies suggested that this robustness can be improved by randomizing the timings of stimulus deliveries. We study the long-lasting effects of CR stimulation with randomized stimulus amplitudes and/or randomized stimulus timing in networks of leaky integrate-and-fire (LIF) neurons with spike-timing-dependent plasticity. Performing computer simulations and analytical calculations, we study long-lasting desynchronization effects of CR with and without randomization of stimulus amplitudes alone, randomization of stimulus times alone as well as the combination of both. 
Varying the CR stimulation frequency (with respect to the frequency of the abnormal target rhythm) and the number of separately stimulated neuronal subpopulations, we reveal parameter regions and related mechanisms where the two qualitatively different randomization mechanisms improve the robustness of long-lasting desynchronization effects of CR. In particular, for clinically relevant parameter ranges, double-random CR stimulation, i.e., CR stimulation with the specific combination of stimulus amplitude randomization and stimulus time randomization, may outperform regular CR stimulation with respect to long-lasting desynchronization. In addition, our results provide the first evidence that an effective reduction of the overall stimulation current by stimulus amplitude randomization may improve the frequency robustness of long-lasting therapeutic effects of brain stimulation.
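The protocol described above can be made concrete with a small scheduling sketch (a minimal illustration under stated assumptions, not the authors' simulation code; the function and parameter names are hypothetical): each CR cycle delivers one stimulus per subpopulation in a shuffled order (standard CR), and on top of that each delivery gets a randomized amplitude and a jittered time within its slot ("double-random" CR).

```python
import random

def double_random_cr_schedule(n_subpops, n_cycles, cr_period,
                              amp_choices=(0.5, 1.0, 1.5),
                              jitter_frac=0.1, seed=0):
    """Return a list of (time, subpopulation, amplitude) stimulus events.

    Each CR cycle of length `cr_period` is split into `n_subpops` slots.
    The subpopulation order is shuffled per cycle (phase-shifted CR);
    amplitude is drawn per delivery and the delivery time is jittered
    within its slot -- the two randomizations of double-random CR."""
    rng = random.Random(seed)
    slot = cr_period / n_subpops
    events = []
    for c in range(n_cycles):
        order = rng.sample(range(n_subpops), n_subpops)  # shuffled targets
        for k, pop in enumerate(order):
            jitter = rng.uniform(-jitter_frac, jitter_frac) * slot
            events.append((c * cr_period + k * slot + jitter,
                           pop, rng.choice(amp_choices)))
    return events

events = double_random_cr_schedule(n_subpops=4, n_cycles=3, cr_period=0.1)
```

With a jitter fraction well below one slot, events stay ordered in time while every subpopulation is still stimulated exactly once per cycle.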
Collapse
Affiliation(s)
- Ali Khaledi-Nasab
- Department of Neurosurgery, Stanford University, Stanford, CA, United States
| | - Justus A Kromer
- Department of Neurosurgery, Stanford University, Stanford, CA, United States
| | - Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, CA, United States
| |
Collapse
|
13
|
Tukker JJ, Beed P, Brecht M, Kempter R, Moser EI, Schmitz D. Microcircuits for spatial coding in the medial entorhinal cortex. Physiol Rev 2021; 102:653-688. [PMID: 34254836 PMCID: PMC8759973 DOI: 10.1152/physrev.00042.2020] [Citation(s) in RCA: 20] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/31/2022] Open
Abstract
The hippocampal formation is critically involved in learning and memory and contains a large proportion of neurons encoding aspects of the organism’s spatial surroundings. In the medial entorhinal cortex (MEC), this includes grid cells with their distinctive hexagonal firing fields as well as a host of other functionally defined cell types including head direction cells, speed cells, border cells, and object-vector cells. Such spatial coding emerges from the processing of external inputs by local microcircuits. However, it remains unclear exactly how local microcircuits and their dynamics within the MEC contribute to spatial discharge patterns. In this review we focus on recent investigations of intrinsic MEC connectivity, which have started to describe and quantify both excitatory and inhibitory wiring in the superficial layers of the MEC. Although the picture is far from complete, it appears that these layers contain robust recurrent connectivity that could sustain the attractor dynamics posited to underlie grid pattern formation. These findings pave the way to a deeper understanding of the mechanisms underlying spatial navigation and memory.
Collapse
Affiliation(s)
- John J Tukker
- Network Dysfunction, German Center for Neurodegenerative Diseases, Berlin, Germany
| | - Prateep Beed
- Neuroscience Research Center, Charité Universitätsmedizin Berlin, Berlin, Germany
| | - Michael Brecht
- Systems Neuroscience, Humboldt University of Berlin, Berlin, Germany
| | - Richard Kempter
- Department of Biology, Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Edvard I Moser
- Kavli Institute for Systems Neuroscience and Centre for the Biology of Memory, Norwegian University of Science and Technology, Trondheim, Norway
| | - Dietmar Schmitz
- Neuroscience Research Center, Charité Universitätsmedizin Berlin, Berlin, Germany
| |
Collapse
|
14
|
Aljadeff J, Gillett M, Pereira Obilinovic U, Brunel N. From synapse to network: models of information storage and retrieval in neural circuits. Curr Opin Neurobiol 2021; 70:24-33. [PMID: 34175521 DOI: 10.1016/j.conb.2021.05.005] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2021] [Revised: 05/06/2021] [Accepted: 05/25/2021] [Indexed: 10/21/2022]
Abstract
The mechanisms of information storage and retrieval in brain circuits are still the subject of debate. It is widely believed that information is stored at least in part through changes in synaptic connectivity in networks that encode this information and that these changes lead in turn to modifications of network dynamics, such that the stored information can be retrieved at a later time. Here, we review recent progress in deriving synaptic plasticity rules from experimental data and in understanding how plasticity rules affect the dynamics of recurrent networks. We show that the dynamics generated by such networks exhibit a large degree of diversity, depending on parameters, similar to experimental observations in vivo during delayed response tasks.
Collapse
Affiliation(s)
- Johnatan Aljadeff
- Neurobiology Section, Division of Biological Sciences, UC San Diego, USA
| | | | | | - Nicolas Brunel
- Department of Neurobiology, Duke University, USA; Department of Physics, Duke University, USA.
| |
Collapse
|
15
|
Hasani R, Ferrari G, Yamamoto H, Tanii T, Prati E. Role of Noise in Spontaneous Activity of Networks of Neurons on Patterned Silicon Emulated by Noise–activated CMOS Neural Nanoelectronic Circuits. NANO EXPRESS 2021. [DOI: 10.1088/2632-959x/abf2ae] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Background noise in biological cortical microcircuits constitutes a powerful resource to support their computational tasks, including, for instance, the synchronization of spiking activity, the enhancement of the speed of information transmission, and the minimization of the corruption of signals. We explore the correlation of spontaneous firing activity of ≈ 100 biological neurons adhering to engineered scaffolds by controlling the number of functionalized patterned connection pathways among groups of neurons. We then emulate the biological system by a series of noise-activated silicon neural network simulations. We show that by suitably tuning both the amplitude of noise and the number of synapses between the silicon neurons, the same controlled correlation of the biological population is achieved. Our results extend to a realistic silicon nanoelectronics neuron design using noise injection to be exploited in artificial spiking neural networks such as liquid state machines and recurrent neural networks for stochastic computation.
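The role of the noise amplitude can be sketched with a single noise-activated leaky integrate-and-fire neuron (a toy stand-in for the CMOS neurons above; all parameter values are illustrative): the DC drive is subthreshold, so the neuron is silent without noise, and its spontaneous rate grows as the noise amplitude increases.

```python
import math
import random

def lif_firing_rate(noise_sigma, i_dc=0.8, v_th=1.0, tau=0.02,
                    dt=1e-4, t_sim=20.0, seed=0):
    """Spontaneous firing rate (Hz) of a leaky integrate-and-fire neuron.

    The DC drive is subthreshold (i_dc < v_th), so with noise_sigma == 0
    the neuron never fires; noise activates spiking. Integration uses a
    simple Euler-Maruyama scheme."""
    rng = random.Random(seed)
    v, n_spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        v += dt * (i_dc - v) / tau + noise_sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if v >= v_th:
            n_spikes += 1
            v = 0.0  # reset after a spike
    return n_spikes / t_sim

# Zero noise: silent; more noise: higher spontaneous rate.
rates = [lif_firing_rate(s) for s in (0.0, 1.0, 3.0)]
```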
Collapse
|
16
|
Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. [PMID: 33979336 PMCID: PMC8143429 DOI: 10.1371/journal.pcbi.1008958] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2020] [Revised: 05/24/2021] [Accepted: 04/12/2021] [Indexed: 11/28/2022] Open
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input. Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity.
We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
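The pair-based STDP rule on which such theories build can be stated compactly (a minimal sketch with illustrative constants; the paper embeds a rule of this kind in a recurrent balanced network, whereas this snippet treats a single synapse): pre-before-post spike pairs potentiate the weight, post-before-pre pairs depress it, each with an exponential window.

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012,
            tau_plus=0.02, tau_minus=0.02):
    """Weight change for one pre/post spike pair.

    delta_t = t_post - t_pre (seconds): positive (pre leads post)
    potentiates, negative depresses -- the classic exponential
    pair-based STDP window."""
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)

def apply_stdp(w, pre_spikes, post_spikes, w_min=0.0, w_max=1.0):
    """Accumulate all-pair STDP updates and clip the weight to bounds."""
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            w += stdp_dw(t_post - t_pre)
    return min(max(w, w_min), w_max)

# Pre leads post by 5 ms: the synapse is strengthened.
w = apply_stdp(0.5, pre_spikes=[0.010], post_spikes=[0.015])
```

With a_minus slightly larger than a_plus, as here, the rule is depression-dominated on average, a common choice when balance must be preserved.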
Collapse
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
| | - Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
| | - Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
| |
Collapse
|
17
|
Gamma Oscillations Facilitate Effective Learning in Excitatory-Inhibitory Balanced Neural Circuits. Neural Plast 2021; 2021:6668175. [PMID: 33542728 PMCID: PMC7840255 DOI: 10.1155/2021/6668175] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/08/2020] [Revised: 12/19/2020] [Accepted: 01/07/2021] [Indexed: 12/26/2022] Open
Abstract
Gamma oscillation in neural circuits is believed to be associated with effective learning in the brain, but the underlying mechanism is unclear. This paper studies how spike-timing-dependent plasticity (STDP), a typical learning mechanism, interacts with gamma oscillation in neural circuits to shape the network's dynamical properties and structure formation. We study an excitatory-inhibitory (E-I) integrate-and-fire neuronal network with triplet STDP, heterosynaptic plasticity, and transmitter-induced plasticity. Our results show that the performance of plasticity varies across synchronization levels. We find that gamma oscillation is beneficial to synaptic potentiation among stimulated neurons: it forms a special network structure in which the sum of excitatory input synaptic strengths is correlated with the sum of inhibitory input synaptic strengths. The circuit can maintain E-I balanced input on average, whereas the balance is temporarily broken during learning-induced oscillations. Our study reveals a potential mechanism for the benefits of gamma oscillation for learning in biological neural circuits.
Collapse
|
18
|
Ingrosso A. Optimal learning with excitatory and inhibitory synapses. PLoS Comput Biol 2020; 16:e1008536. [PMID: 33370266 PMCID: PMC7793294 DOI: 10.1371/journal.pcbi.1008536] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2020] [Revised: 01/08/2021] [Accepted: 11/13/2020] [Indexed: 11/22/2022] Open
Abstract
Characterizing the relation between weight structure and input/output statistics is fundamental for understanding the computational capabilities of neural circuits. In this work, I study the problem of storing associations between analog signals in the presence of correlations, using methods from statistical mechanics. I characterize the typical learning performance in terms of the power spectrum of random input and output processes. I show that optimal synaptic weight configurations reach a capacity of 0.5 for any fraction of excitatory to inhibitory weights and have a peculiar synaptic distribution with a finite fraction of silent synapses. I further provide a link between typical learning performance and principal components analysis in single cases. These results may shed light on the synaptic profile of brain circuits, such as cerebellar structures, that are thought to engage in processing time-dependent signals and performing on-line prediction.
Collapse
Affiliation(s)
- Alessandro Ingrosso
- Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America
| |
Collapse
|
19
|
Laing CR, Bläsche C. The effects of within-neuron degree correlations in networks of spiking neurons. BIOLOGICAL CYBERNETICS 2020; 114:337-347. [PMID: 32124039 DOI: 10.1007/s00422-020-00822-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/13/2019] [Accepted: 02/15/2020] [Indexed: 05/20/2023]
Abstract
We consider the effects of correlations between the in- and out-degrees of individual neurons on the dynamics of a network of neurons. By using theta neurons, we can derive a set of coupled differential equations for the expected dynamics of neurons with the same in-degree. A Gaussian copula is used to introduce correlations between a neuron's in- and out-degree, and numerical bifurcation analysis is used to determine the effects of these correlations on the network's dynamics. For excitatory coupling, we find that inducing positive correlations has a similar effect to increasing the coupling strength between neurons, while for inhibitory coupling it has the opposite effect. We also determine the propensity of various two- and three-neuron motifs to occur as correlations are varied and give a plausible explanation for the observed changes in dynamics.
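The copula construction can be sketched as follows (an illustrative sketch, with Poisson marginals standing in for whatever degree distribution a given network model uses): draw correlated standard normals, map them to uniforms through the normal CDF, then push the uniforms through the inverse CDF of the target degree distribution, so the within-neuron correlation is tunable while the marginals are fixed.

```python
import math
import numpy as np

def poisson_ppf(u, lam, kmax=200):
    """Invert the Poisson CDF: smallest k with CDF(k) >= u."""
    pmf = np.exp(-lam) * np.cumprod(np.r_[1.0, lam / np.arange(1, kmax)])
    return np.searchsorted(np.cumsum(pmf), u)

def correlated_degrees(n, rho, mean_degree=10.0, seed=0):
    """Draw n (in_degree, out_degree) pairs whose dependence is set by a
    Gaussian copula with correlation rho; marginals are Poisson(mean_degree),
    an illustrative choice."""
    rng = np.random.default_rng(seed)
    # Correlated standard normals via Cholesky of [[1, rho], [rho, 1]].
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
    # Normal CDF maps each normal to a uniform; inverse CDF sets the marginal.
    u1 = 0.5 * (1 + np.array([math.erf(z / math.sqrt(2)) for z in z1]))
    u2 = 0.5 * (1 + np.array([math.erf(z / math.sqrt(2)) for z in z2]))
    return poisson_ppf(u1, mean_degree), poisson_ppf(u2, mean_degree)

k_in, k_out = correlated_degrees(5000, rho=0.8)
rho_hat = float(np.corrcoef(k_in, k_out)[0, 1])
```

The realized degree correlation is close to (slightly below) the copula's rho because of the discretization of the marginals.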
Collapse
Affiliation(s)
- Carlo R Laing
- School of Natural and Computational Sciences, Massey University, NSMC, Private Bag 102-904, Auckland, New Zealand.
| | - Christian Bläsche
- School of Natural and Computational Sciences, Massey University, NSMC, Private Bag 102-904, Auckland, New Zealand
| |
Collapse
|
20
|
Montangie L, Miehl C, Gjorgjieva J. Autonomous emergence of connectivity assemblies via spike triplet interactions. PLoS Comput Biol 2020; 16:e1007835. [PMID: 32384081 PMCID: PMC7239496 DOI: 10.1371/journal.pcbi.1007835] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2019] [Revised: 05/20/2020] [Accepted: 03/31/2020] [Indexed: 01/08/2023] Open
Abstract
Non-random connectivity can emerge without structured external input driven by activity-dependent mechanisms of synaptic plasticity based on precise spiking patterns. Here we analyze the emergence of global structures in recurrent networks based on a triplet model of spike timing dependent plasticity (STDP), which depends on the interactions of three precisely-timed spikes, and can describe plasticity experiments with varying spike frequency better than the classical pair-based STDP rule. We derive synaptic changes arising from correlations up to third-order and describe them as the sum of structural motifs, which determine how any spike in the network influences a given synaptic connection through possible connectivity paths. This motif expansion framework reveals novel structural motifs under the triplet STDP rule, which support the formation of bidirectional connections and ultimately the spontaneous emergence of global network structure in the form of self-connected groups of neurons, or assemblies. We propose that under triplet STDP assembly structure can emerge without the need for externally patterned inputs or assuming a symmetric pair-based STDP rule common in previous studies. The emergence of non-random network structure under triplet STDP occurs through internally-generated higher-order correlations, which are ubiquitous in natural stimuli and neuronal spiking activity, and important for coding. We further demonstrate how neuromodulatory mechanisms that modulate the shape of the triplet STDP rule or the synaptic transmission function differentially promote structural motifs underlying the emergence of assemblies, and quantify the differences using graph theoretic measures. Emergent non-random connectivity structures in different brain regions are tightly related to specific patterns of neural activity and support diverse brain functions. 
For instance, self-connected groups of neurons, known as assemblies, have been proposed to represent functional units in brain circuits and can emerge even without patterned external instruction. Here we investigate the emergence of non-random connectivity in recurrent networks using a particular plasticity rule, triplet STDP, which relies on the interaction of spike triplets and can capture higher-order statistical dependencies in neural activity. We derive the evolution of the synaptic strengths in the network and explore the conditions for the self-organization of connectivity into assemblies. We demonstrate key differences of the triplet STDP rule compared to the classical pair-based rule in terms of how assemblies are formed, including the realistic asymmetric shape and influence of novel connectivity motifs on network plasticity driven by higher-order correlations. Assembly formation depends on the specific shape of the STDP window and synaptic transmission function, pointing towards an important role of neuromodulatory signals on formation of intrinsically generated assemblies.
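A minimal version of the triplet rule can be sketched with spike traces (constants loosely follow published "minimal triplet" fits but should be treated as illustrative; the motif-expansion analysis above, not this snippet, is the authors' contribution): each synapse keeps a fast presynaptic trace and fast/slow postsynaptic traces; depression is a pair term, while potentiation is a triplet term gated by the slow postsynaptic trace, which gives the rule its frequency dependence.

```python
import math

def triplet_stdp(pre_spikes, post_spikes, w0=0.5,
                 a2_minus=7.0e-3, a3_plus=6.5e-3,
                 tau_plus=16.8e-3, tau_minus=33.7e-3, tau_y=114e-3):
    """Minimal triplet STDP sketch: depression from post-pre pairs,
    potentiation from (post, pre, post) triplets via a slow post trace."""
    events = sorted([(t, 0) for t in pre_spikes] + [(t, 1) for t in post_spikes])
    r1 = o1 = o2 = 0.0          # pre trace, fast post trace, slow post trace
    w, t_last = w0, events[0][0]
    for t, is_post in events:
        dt = t - t_last
        r1 *= math.exp(-dt / tau_plus)
        o1 *= math.exp(-dt / tau_minus)
        o2 *= math.exp(-dt / tau_y)
        t_last = t
        if is_post:
            w += a3_plus * r1 * o2  # triplet potentiation (o2 from earlier posts)
            o2 += 1.0
        else:
            w -= a2_minus * o1      # pair depression
            r1 += 1.0
    return w

# Identical pre-before-post pairing (+10 ms) at 40 Hz vs 1 Hz: the triplet
# term makes high-frequency pairing potentiate far more.
pairs_hi = [k * 0.025 for k in range(10)]
pairs_lo = [k * 1.000 for k in range(10)]
w_hi = triplet_stdp(pairs_hi, [t + 0.01 for t in pairs_hi])
w_lo = triplet_stdp(pairs_lo, [t + 0.01 for t in pairs_lo])
```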
Collapse
Affiliation(s)
- Lisandro Montangie
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
| | - Christoph Miehl
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
| | - Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Technical University of Munich, School of Life Sciences, Freising, Germany
| |
Collapse
|
21
|
Swanson RA, Levenstein D, McClain K, Tingley D, Buzsáki G. Variable specificity of memory trace reactivation during hippocampal sharp wave ripples. Curr Opin Behav Sci 2020; 32:126-135. [DOI: 10.1016/j.cobeha.2020.02.008] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/14/2023]
|
22
|
Shamir M. Theories of rhythmogenesis. Curr Opin Neurobiol 2019; 58:70-77. [PMID: 31408837 DOI: 10.1016/j.conb.2019.07.005] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/01/2019] [Accepted: 07/14/2019] [Indexed: 12/31/2022]
Abstract
Rhythmogenesis is the process that develops the capacity for rhythmic activity in a non-rhythmic system. Theoretical works suggested a wide array of possible mechanisms for rhythmogenesis ranging from the regulation of cellular properties to top-down control. Here we discuss theories of rhythmogenesis with an emphasis on spike timing-dependent plasticity. We argue that even though the specifics of different mechanisms vary greatly they all share certain key features. Namely, rhythmogenesis can be described as a flow on the phase diagram leading the system into a rhythmic region and stabilizing it on a specific manifold characterized by the desired rhythmic activity. Functionality is retained despite biological diversity by forcing the system into a specific manifold, but allowing fluctuations within that manifold.
Collapse
Affiliation(s)
- Maoz Shamir
- Department of Physiology and Cell Biology, Faculty of Health Sciences, Department of Physics, Faculty of Natural Sciences, Zlotowski Center for Neuroscience, Ben-Gurion University of the Negev, Be'er-Sheva, Israel; The Kavli Institute for Theoretical Physics, University of California, Santa Barbara, USA.
| |
Collapse
|
23
|
Curto C, Morrison K. Relating network connectivity to dynamics: opportunities and challenges for theoretical neuroscience. Curr Opin Neurobiol 2019; 58:11-20. [PMID: 31319287 DOI: 10.1016/j.conb.2019.06.003] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/24/2019] [Accepted: 06/22/2019] [Indexed: 11/29/2022]
Abstract
We review recent work relating network connectivity to the dynamics of neural activity. While concepts stemming from network science provide a valuable starting point, the interpretation of graph-theoretic structures and measures can be highly dependent on the dynamics associated to the network. Properties that are quite meaningful for linear dynamics, such as random walk and network flow models, may be of limited relevance in the neuroscience setting. Theoretical and computational neuroscience are playing a vital role in understanding the relationship between network connectivity and the nonlinear dynamics associated to neural networks.
Collapse
Affiliation(s)
- Carina Curto
- The Pennsylvania State University, PA 16802, United States.
| | - Katherine Morrison
- School of Mathematical Sciences, University of Northern Colorado, Greeley, CO 80639, USA
| |
Collapse
|
24
|
La Camera G, Fontanini A, Mazzucato L. Cortical computations via metastable activity. Curr Opin Neurobiol 2019; 58:37-45. [PMID: 31326722 DOI: 10.1016/j.conb.2019.06.007] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/27/2018] [Accepted: 06/22/2019] [Indexed: 12/27/2022]
Abstract
Metastable brain dynamics are characterized by abrupt, jump-like modulations so that the neural activity in single trials appears to unfold as a sequence of discrete, quasi-stationary 'states'. Evidence that cortical neural activity unfolds as a sequence of metastable states is accumulating at a fast pace. Metastable activity occurs both in response to an external stimulus and during ongoing, self-generated activity. These spontaneous metastable states are increasingly found to subserve internal representations that are not locked to external triggers, including states of deliberation, attention, and expectation. Moreover, decoding stimuli or decisions via metastable states can be carried out trial-by-trial. Focusing on metastability will allow us to shift our perspective on neural coding from traditional concepts based on trial-averaging to models based on dynamic ensemble representations. Recent theoretical work has started to characterize the mechanistic origin and potential roles of metastable representations. In this article we review recent findings on metastable activity, how it may arise in biologically realistic models, and its potential role for representing internal states as well as relevant task variables.
Collapse
Affiliation(s)
- Giancarlo La Camera
- Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY 11794, United States; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY 11794, United States.
| | - Alfredo Fontanini
- Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY 11794, United States; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY 11794, United States
| | - Luca Mazzucato
- Departments of Biology and Mathematics and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, United States
| |
Collapse
|
25
|
Recanatesi S, Ocker GK, Buice MA, Shea-Brown E. Dimensionality in recurrent spiking networks: Global trends in activity and local origins in connectivity. PLoS Comput Biol 2019; 15:e1006446. [PMID: 31299044 PMCID: PMC6655892 DOI: 10.1371/journal.pcbi.1006446] [Citation(s) in RCA: 31] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2018] [Revised: 07/24/2019] [Accepted: 04/03/2019] [Indexed: 11/25/2022] Open
Abstract
The dimensionality of a network's collective activity is of increasing interest in neuroscience. This is because dimensionality provides a compact measure of how coordinated network-wide activity is, in terms of the number of modes (or degrees of freedom) that it can independently explore. A low number of modes suggests a compressed low dimensional neural code and reveals interpretable dynamics [1], while findings of high dimension may suggest flexible computations [2, 3]. Here, we address the fundamental question of how dimensionality is related to connectivity, in both autonomous and stimulus-driven networks. Working with a simple spiking network model, we derive three main findings. First, the dimensionality of global activity patterns can be strongly, and systematically, regulated by local connectivity structures. Second, the dimensionality is a better indicator than average correlations in determining how constrained neural activity is. Third, stimulus evoked neural activity interacts systematically with neural connectivity patterns, leading to network responses of either greater or lesser dimensionality than the stimulus.
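One standard compact measure of such dimensionality is the participation ratio of the covariance eigenvalues, PR = (Σᵢλᵢ)² / Σᵢλᵢ², which is ~1 when one mode dominates and ~N when all N modes contribute equally (a sketch of the general measure, not necessarily the paper's exact estimator):

```python
import numpy as np

def participation_ratio(X):
    """Participation ratio of the covariance of X (samples x neurons):
    PR = (sum of eigenvalues)^2 / (sum of squared eigenvalues)."""
    eig = np.linalg.eigvalsh(np.cov(X, rowvar=False))
    return float(eig.sum() ** 2 / (eig ** 2).sum())

rng = np.random.default_rng(1)
n = 50
# Independent activity: dimensionality close to the number of neurons.
x_ind = rng.standard_normal((2000, n))
# One shared global fluctuation: a single mode dominates.
shared = rng.standard_normal((2000, 1))
x_shared = shared @ np.ones((1, n)) + 0.1 * rng.standard_normal((2000, n))
pr_ind = participation_ratio(x_ind)
pr_shared = participation_ratio(x_shared)
```

This makes concrete why dimensionality can be more informative than average correlation: the shared-fluctuation population has PR near 1 regardless of how the residual noise is distributed across neurons.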
Collapse
Affiliation(s)
- Stefano Recanatesi
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
| | - Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
| | - Michael A. Buice
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
| | - Eric Shea-Brown
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
| |
Collapse
|
26
|
Lappalainen J, Herpich J, Tetzlaff C. A Theoretical Framework to Derive Simple, Firing-Rate-Dependent Mathematical Models of Synaptic Plasticity. Front Comput Neurosci 2019; 13:26. [PMID: 31133837 PMCID: PMC6517541 DOI: 10.3389/fncom.2019.00026] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2018] [Accepted: 04/10/2019] [Indexed: 11/13/2022] Open
Abstract
Synaptic plasticity serves as an essential mechanism underlying cognitive processes such as learning and memory. For a better understanding, detailed theoretical models combine the experimental underpinnings of synaptic plasticity and match experimental results. However, these models are mathematically complex, impeding comprehensive investigation of their link to cognitive processes, which are generally executed at the neuronal network level. Here, we derive a mathematical framework enabling the simplification of such detailed models of synaptic plasticity, facilitating further mathematical analyses. With this framework we obtain a compact, firing-rate-dependent mathematical formulation that retains the essential dynamics of the detailed model and, thus, the experimentally verified properties of synaptic plasticity. Among other results, by using our framework to abstract the dynamics of two well-established calcium-dependent synaptic plasticity models, we derive that the synaptic changes depend on the square of the presynaptic firing rate, in contrast to previous assumptions. The here-presented framework thus enables the derivation of biologically plausible but simple mathematical models of synaptic plasticity, allowing one to analyze how synaptic dynamics depend on neuronal properties such as the firing rate and to investigate their implications in complex neuronal networks.
Collapse
Affiliation(s)
- Janne Lappalainen
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany
| | - Juliane Herpich
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany; Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
| | - Christian Tetzlaff
- Department of Computational Neuroscience, Third Institute of Physics-Biophysics, Georg-August-University, Göttingen, Germany; Bernstein Center for Computational Neuroscience, Georg-August-University, Göttingen, Germany
| |
Collapse
|
27
|
Ocker GK, Doiron B. Training and Spontaneous Reinforcement of Neuronal Assemblies by Spike Timing Plasticity. Cereb Cortex 2019; 29:937-951. [PMID: 29415191 PMCID: PMC7963120 DOI: 10.1093/cercor/bhy001] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2016] [Revised: 01/01/2018] [Accepted: 01/05/2018] [Indexed: 12/15/2022] Open
Abstract
The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
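The pair-based STDP rule underlying such mean-field theories can be sketched as an exponential window; the parameter values here are generic textbook choices, not those of the paper.

```python
import math

def stdp_dw(dt_ms, a_plus=0.005, a_minus=0.0055, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window, dt_ms = t_post - t_pre (ms).
    Pre-before-post (dt_ms > 0) potentiates; post-before-pre depresses.
    A slight depression bias (a_minus > a_plus) keeps weights bounded.
    All parameter values are illustrative."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    return -a_minus * math.exp(dt_ms / tau_minus)
```

In a recurrent network, fast spike-time correlations repeatedly sample this window, which is how trained assemblies can be reinforced by spontaneous correlated activity.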
Collapse
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, PA, USA
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Allen Institute for Brain Science, Seattle, WA, USA
| | - Brent Doiron
- Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, PA, USA
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA
| |
Collapse
|
28
|
Interplay of multiple pathways and activity-dependent rules in STDP. PLoS Comput Biol 2018; 14:e1006184. [PMID: 30106953 PMCID: PMC6112684 DOI: 10.1371/journal.pcbi.1006184] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/15/2018] [Revised: 08/28/2018] [Accepted: 05/09/2018] [Indexed: 12/13/2022] Open
Abstract
Hebbian plasticity describes a basic mechanism for synaptic plasticity whereby synaptic weights evolve depending on the relative timing of paired activity of the pre- and postsynaptic neurons. Spike-timing-dependent plasticity (STDP) constitutes a central experimental and theoretical synaptic Hebbian learning rule. Various mechanisms, mostly calcium-based, account for the induction and maintenance of STDP. Classically, STDP is assumed to emerge gradually and monotonically as the number of pairings increases. However, non-monotonic STDP accounting for fast associative learning led us to challenge this monotonicity hypothesis and explore how the existence of multiple plasticity pathways affects the dynamical establishment of plasticity. To account for distinct forms of STDP emerging from increasing numbers of pairings and the variety of signaling pathways involved, we developed a general class of simple mathematical models of plasticity based on calcium transients and accommodating various calcium-based plasticity mechanisms. These mechanisms can either compete or cooperate for the establishment of long-term potentiation (LTP) and depression (LTD), which emerge depending on past calcium activity. Our model accurately reproduces the striatal STDP that involves endocannabinoid and NMDAR signaling pathways. Moreover, we predict how stimulus frequency alters plasticity, and how triplet rules are affected by the number of pairings. We further investigate the general model with an arbitrary number of pathways and show that, depending on those pathways and their properties, a variety of plasticities may emerge upon variation of the number and/or the frequency of pairings, even when the outcome after large numbers of pairings is identical. These findings, built upon a biologically realistic example and generalized to other applications, argue that in order to fully describe synaptic plasticity it is not sufficient to record STDP curves at fixed pairing numbers and frequencies.
In fact, considering the whole spectrum of activity-dependent parameters could have a great impact on the description of plasticity and on our understanding of the engram. The brain's capacity to process information, learn, and store memories relies on synaptic connectivity patterns, which are altered through synaptic plasticity mechanisms. Experimentally, such plasticities were evidenced through protocols involving numerous repetitive stimulations of a given synapse and were shown to be supported by multiple pathways. Using a simple, biologically grounded mathematical model, we show how the activation timescales and inactivation levels of each pathway interact and alter plasticity in an intricate manner as stimuli are presented. Building upon data from the synapse between cortex and striatum, we show that synaptic changes may revert or re-emerge as stimuli are presented, and we predict specific responses to changes in stimulus frequency or to distinct stimulation patterns. Our general model shows that a given plasticity profile emerging in response to a repetitive stimulation protocol can unfold into various scenarios upon variations of the number of stimulus presentations or patterns, which depends tightly on the underlying activated pathways. Altogether, these results argue that, in order to better understand learning and memory, single plasticity responses obtained through intensive stimulation do not reveal the complexity of the responses to smaller numbers of presentations, which may have a strong impact on fast learning of stimuli presented only a few times.
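A minimal two-threshold calcium rule of the general class these models belong to (competing LTD and LTP pathways recruited at different calcium levels) might look as follows; the thresholds and rates are invented for illustration and are not the paper's fitted striatal model.

```python
def calcium_step(w, ca, theta_d=1.0, theta_p=1.3,
                 gamma_d=0.01, gamma_p=0.05, dt=1.0):
    """One update of a toy two-threshold calcium rule: calcium above
    theta_d recruits a depression pathway, and calcium above the higher
    threshold theta_p additionally recruits potentiation, so the two
    pathways compete for the outcome. All numbers are illustrative."""
    dw = 0.0
    if ca > theta_p:
        dw += gamma_p * (1.0 - w) * dt   # soft-bounded LTP pathway
    if ca > theta_d:
        dw -= gamma_d * w * dt           # soft-bounded LTD pathway
    return w + dw

# Sub-threshold calcium leaves the weight unchanged; intermediate
# transients depress; large transients yield net potentiation.
```

Because the two pathways have different activation levels (and, in richer versions, different timescales), the net plasticity depends on the whole history of calcium transients, not only on the final pairing count.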
Collapse
|
29
|
Development of Microplatforms to Mimic the In Vivo Architecture of CNS and PNS Physiology and Their Diseases. Genes (Basel) 2018; 9:genes9060285. [PMID: 29882823 PMCID: PMC6027402 DOI: 10.3390/genes9060285] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2018] [Revised: 05/28/2018] [Accepted: 05/31/2018] [Indexed: 12/16/2022] Open
Abstract
Understanding the mechanisms that govern nervous tissue function remains a challenge. In vitro two-dimensional (2D) cell culture systems provide a simplistic platform for systematic investigations but often yield unreliable responses that cannot be translated to pathophysiological settings. Recently, microplatforms have emerged to provide a better approximation of the in vivo scenario, with better control over the microenvironment, stimuli, and structure. Advances in biomaterials enable the construction of three-dimensional (3D) scaffolds, which, combined with microfabrication, allow enhanced biomimicry through precise control of the architecture, cell positioning, fluid flows, and electrochemical stimuli. This manuscript reviews, compares, and contrasts advances in nervous-tissue-on-a-chip models and their applications in neural physiology and disease. Microplatforms used for neuro-glia interactions, neuromuscular junctions (NMJs), the blood-brain barrier (BBB), and studies on brain cancer, metastasis, and neurodegenerative diseases are addressed. Finally, we highlight challenges that can be addressed with interdisciplinary efforts to achieve a higher degree of biomimicry. Nervous tissue microplatforms provide a powerful tool that is destined to provide a better understanding of neural health and disease.
Collapse
|
30
|
Rost T, Deger M, Nawrot MP. Winnerless competition in clustered balanced networks: inhibitory assemblies do the trick. BIOLOGICAL CYBERNETICS 2018; 112:81-98. [PMID: 29075845 PMCID: PMC5908874 DOI: 10.1007/s00422-017-0737-7] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/21/2017] [Accepted: 10/11/2017] [Indexed: 06/07/2023]
Abstract
Balanced networks are a frequently employed basic model for neuronal networks in the mammalian neocortex. Large numbers of excitatory and inhibitory neurons are recurrently connected so that the numerous positive and negative inputs that each neuron receives cancel out on average. Neuronal firing is therefore driven by fluctuations in the input and resembles the irregular and asynchronous activity observed in cortical in vivo data. Recently, the balanced network model has been extended to accommodate clusters of strongly interconnected excitatory neurons in order to explain persistent activity in working memory-related tasks. This clustered topology introduces multistability and winnerless competition between attractors and can capture the high trial-to-trial variability and its reduction during stimulation that has been found experimentally. In this prospect article, we review the mean field description of balanced networks of binary neurons and apply the theory to clustered networks. We show that the stable fixed points of networks with clustered excitatory connectivity tend quickly towards firing rate saturation, which is generally inconsistent with experimental data. To remedy this shortcoming, we then present a novel perspective on networks with locally balanced clusters of both excitatory and inhibitory neuron populations. This approach allows for true multistability and moderate firing rates in activated clusters over a wide range of parameters. Our findings are supported by mean field theory and numerical network simulations. Finally, we discuss possible applications of the concept of joint excitatory and inhibitory clustering in future cortical network modelling studies.
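The multistability that clustered connectivity introduces can be illustrated with a one-population mean-field sketch: a recurrently excited cluster with a sigmoidal gain has coexisting low- and high-activity fixed points. The gain, couplings, and damping below are invented for illustration and do not reproduce the paper's binary-network theory.

```python
import math

def cluster_rate(m, j_rec=1.2, inhib=0.6, beta=10.0):
    """Activation of a self-exciting cluster: net input j_rec*m - inhib
    passed through a sigmoidal gain of slope beta (values illustrative)."""
    return 1.0 / (1.0 + math.exp(-beta * (j_rec * m - inhib)))

def fixed_point(m0, iters=500, damping=0.1):
    """Damped fixed-point iteration m <- (1-d)*m + d*phi(m)."""
    m = m0
    for _ in range(iters):
        m = (1.0 - damping) * m + damping * cluster_rate(m)
    return m

# Low and high initial activity converge to different stable states,
# a toy analogue of winnerless competition between cluster attractors.
low = fixed_point(0.05)
high = fixed_point(0.95)
```

Note that in this toy version the "on" state saturates near full activation, which is exactly the shortcoming the abstract describes; joint excitatory-inhibitory clustering is the paper's proposed remedy for obtaining moderate activated rates.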
Collapse
Affiliation(s)
- Thomas Rost
- Computational Systems Neuroscience, Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany
| | - Moritz Deger
- Computational Systems Neuroscience, Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany
| | - Martin P Nawrot
- Computational Systems Neuroscience, Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany.
| |
Collapse
|
31
|
Min B, Zhou D, Cai D. Effects of Firing Variability on Network Structures with Spike-Timing-Dependent Plasticity. Front Comput Neurosci 2018; 12:1. [PMID: 29410621 PMCID: PMC5787127 DOI: 10.3389/fncom.2018.00001] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/17/2017] [Accepted: 01/03/2018] [Indexed: 11/17/2022] Open
Abstract
Synaptic plasticity is believed to be the biological substrate underlying learning and memory. One of the most widespread forms of synaptic plasticity, spike-timing-dependent plasticity (STDP), uses the spike timing information of presynaptic and postsynaptic neurons to induce synaptic potentiation or depression. An open question is how STDP organizes the connectivity patterns in neuronal circuits. Previous studies have placed much emphasis on the role of firing rate in shaping connectivity patterns. Here, we go beyond the firing rate description to develop a self-consistent linear response theory that incorporates the information of both firing rate and firing variability. By decomposing the pairwise spike correlation into one component associated with local direct connections and the other associated with indirect connections, we identify two distinct regimes regarding the network structures learned through STDP. In one regime, the contribution of the direct-connection correlations dominates over that of the indirect-connection correlations in the learning dynamics; this gives rise to a network structure consistent with the firing rate description. In the other regime, the contribution of the indirect-connection correlations dominates in the learning dynamics, leading to a network structure different from the firing rate description. We demonstrate that the heterogeneity of firing variability across neuronal populations induces a temporally asymmetric structure of indirect-connection correlations. This temporally asymmetric structure underlies the emergence of the second regime. Our study provides a new perspective that emphasizes the role of high-order statistics of spiking activity in the spike-correlation-sensitive learning dynamics.
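The split into direct- and indirect-connection contributions can be sketched in the standard linear-response form C = (I-K)^(-1) C0 (I-K)^(-T); this is a generic textbook construction, and the interaction matrix K and intrinsic variances below are toy values rather than the paper's self-consistent theory.

```python
import numpy as np

def correlation_decomposition(K, c0_diag):
    """Total pairwise correlations from an interaction matrix K and
    intrinsic (private) variances c0_diag, split into the contribution
    of direct connections (terms linear in K) and everything mediated
    by indirect paths (all higher-order terms)."""
    n = K.shape[0]
    eye = np.eye(n)
    prop = np.linalg.inv(eye - K)          # propagator summing all paths
    C0 = np.diag(c0_diag)
    C = prop @ C0 @ prop.T                 # total correlation matrix
    C_direct = C0 + K @ C0 + C0 @ K.T      # direct-connection part
    return C, C_direct, C - C_direct

# A single feedforward link 2 -> 1 of strength 0.1:
K = np.array([[0.0, 0.1],
              [0.0, 0.0]])
C, C_dir, C_ind = correlation_decomposition(K, np.array([1.0, 1.0]))
```

Which of the two components dominates the learning dynamics is what distinguishes the two regimes described in the abstract.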
Collapse
Affiliation(s)
- Bin Min
- Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
| | - Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
| | - David Cai
- Center for Neural Science, Courant Institute of Mathematical Sciences, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates; School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
| |
Collapse
|
32
|
Blackwell JM, Geffen MN. Progress and challenges for understanding the function of cortical microcircuits in auditory processing. Nat Commun 2017; 8:2165. [PMID: 29255268 PMCID: PMC5735136 DOI: 10.1038/s41467-017-01755-2] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2017] [Accepted: 10/12/2017] [Indexed: 12/21/2022] Open
Abstract
An important outstanding question in auditory neuroscience is to identify the mechanisms by which specific motifs within inter-connected neural circuits affect auditory processing and, ultimately, behavior. In the auditory cortex, a combination of large-scale electrophysiological recordings and concurrent optogenetic manipulations are improving our understanding of the role of inhibitory–excitatory interactions. At the same time, computational approaches have grown to incorporate diverse neuronal types and connectivity patterns. However, we are still far from understanding how cortical microcircuits encode and transmit information about complex acoustic scenes. In this review, we focus on recent results identifying the special function of different cortical neurons in the auditory cortex and discuss a computational framework for future work that incorporates ideas from network science and network dynamics toward the coding of complex auditory scenes. Advances in multi-neuron recordings and optogenetic manipulation have resulted in an interrogation of the function of specific cortical cell types in auditory cortex during sound processing. Here, the authors review this literature and discuss the merits of integrating computational approaches from dynamic network science.
Collapse
Affiliation(s)
- Jennifer M Blackwell
- Department of Otorhinolaryngology: HNS, Department of Neuroscience, Neuroscience Graduate Group, Computational Neuroscience Initiative, University of Pennsylvania, Philadelphia, PA, 19104, USA
| | - Maria N Geffen
- Department of Otorhinolaryngology: HNS, Department of Neuroscience, Neuroscience Graduate Group, Computational Neuroscience Initiative, University of Pennsylvania, Philadelphia, PA, 19104, USA.
| |
Collapse
|
33
|
Neuronal Intrinsic Physiology Changes During Development of a Learned Behavior. eNeuro 2017; 4:eN-NWR-0297-17. [PMID: 29062887 PMCID: PMC5649544 DOI: 10.1523/eneuro.0297-17.2017] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/23/2017] [Accepted: 09/07/2017] [Indexed: 01/14/2023] Open
Abstract
Juvenile male zebra finches learn their songs over distinct auditory and sensorimotor stages, the former requiring exposure to an adult tutor song pattern. The cortical premotor nucleus HVC (acronym is name) plays a necessary role in both learning stages, as well as the production of adult song. Consistent with neural network models where synaptic plasticity mediates developmental forms of learning, exposure to tutor song drives changes in the turnover, density, and morphology of HVC synapses during vocal development. A network's output, however, is also influenced by the intrinsic properties (e.g., ion channels) of the component neurons, which could change over development. Here, we use patch clamp recordings to show cell-type-specific changes in the intrinsic physiology of HVC projection neurons as a function of vocal development. Developmental changes in HVC neurons that project to the basal ganglia include an increased voltage sag response to hyperpolarizing currents and an increased rebound depolarization following hyperpolarization. Developmental changes in HVC neurons that project to vocal-motor cortex include a decreased resting membrane potential and an increased spike amplitude. HVC interneurons, however, show a relatively stable range of intrinsic features across vocal development. We used mathematical models to deduce possible changes in ionic currents that underlie the physiological changes and to show that the magnitude of the observed changes could alter HVC circuit function. The results demonstrate developmental plasticity in the intrinsic physiology of HVC projection neurons and suggest that intrinsic plasticity may have a role in the process of song learning.
Collapse
|
34
|
Cortical inhibitory interneurons control sensory processing. Curr Opin Neurobiol 2017; 46:200-207. [PMID: 28938181 DOI: 10.1016/j.conb.2017.08.018] [Citation(s) in RCA: 45] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/17/2017] [Accepted: 08/30/2017] [Indexed: 01/17/2023]
Abstract
Inhibitory and excitatory neurons form intricate interconnected circuits in the mammalian sensory cortex. Whereas the function of excitatory neurons is largely to integrate and transmit information within and between brain areas, inhibitory neurons are thought to shape the way excitatory neurons integrate information, and they exhibit context-specific and behavior-specific responses. Over the last few years, work across sensory modalities has begun unraveling the function of distinct types of cortical inhibitory neurons in sensory processing, identifying their contribution to controlling stimulus selectivity of excitatory neurons and modulating information processing based on the behavioral state of the subject. Here, we review results from recent studies and discuss the implications for the contribution of inhibition to cortical circuit activity and information processing.
Collapse
|
35
|
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Collapse
Affiliation(s)
| | - Yu Hu
- Center for Brain Science, Harvard University, United States
| | - Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
| | - Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
| | - Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
| | - Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
| | - Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
| |
Collapse
|
36
|
Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks’ spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities—including those of different cell types—combine with connectivity to shape population activity and function. Neuronal networks, like many biological systems, exhibit variable activity. This activity is shaped by both the underlying biology of the component neurons and the structure of their interactions. How can we combine knowledge of these two things—that is, models of individual neurons and of their interactions—to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state. In the face of neural nonlinearities, however, these linear methods can fail to predict spiking statistics and even fail to correctly predict whether activity is stable or pathological. 
Here, we show how to calculate any spike train cumulant in a broad class of models while systematically accounting for nonlinear effects. We then study a fundamental effect of nonlinear input-rate transfer, the coupling between different orders of spiking statistics, and how this depends on single-neuron and network properties.
Collapse
Affiliation(s)
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
| | - Krešimir Josić
- Department of Mathematics and Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Department of BioSciences, Rice University, Houston, Texas, United States of America
| | - Eric Shea-Brown
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Department of Physiology and Biophysics, and UW Institute of Neuroengineering, University of Washington, Seattle, Washington, United States of America
| | - Michael A. Buice
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
| |
Collapse
|
37
|
Sprekeler H. Functional consequences of inhibitory plasticity: homeostasis, the excitation-inhibition balance and beyond. Curr Opin Neurobiol 2017; 43:198-203. [PMID: 28500933 DOI: 10.1016/j.conb.2017.03.014] [Citation(s) in RCA: 45] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2016] [Revised: 03/12/2017] [Accepted: 03/22/2017] [Indexed: 11/18/2022]
Abstract
Computational neuroscience has a long-standing tradition of investigating the consequences of excitatory synaptic plasticity. In contrast, the functions of inhibitory plasticity are still largely nebulous, particularly given the bewildering diversity of interneurons in the brain. Here, we review recent computational advances that provide first suggestions for the functional roles of inhibitory plasticity, such as a maintenance of the excitation-inhibition balance, a stabilization of recurrent network dynamics and a decorrelation of sensory responses. The field is still in its infancy, but given the existing body of theory for excitatory plasticity, it is likely to mature quickly and deliver important insights into the self-organization of inhibitory circuits in the brain.
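One concrete rule from this literature (the symmetric inhibitory STDP rule of Vogels and colleagues, sketched here with generic parameter values) potentiates inhibition for near-coincident pre/post spikes and depresses it otherwise, steering the postsynaptic neuron toward a target rate and hence toward an excitation-inhibition balance.

```python
import math

def inh_stdp_dw(dt_ms, eta=0.01, tau=20.0, rho0=0.005):
    """Net change of an inhibitory weight for one pre/post spike pair
    with timing difference dt_ms: a symmetric potentiation window minus
    a constant depression term whose size alpha = 2*rho0*tau sets the
    target postsynaptic rate rho0 (spikes/ms). Values are illustrative."""
    alpha = 2.0 * rho0 * tau
    return eta * (math.exp(-abs(dt_ms) / tau) - alpha)
```

When the postsynaptic neuron fires above its target rate, coincidences dominate and inhibition grows; below target, the constant depression dominates and inhibition weakens, which is the homeostatic behavior the review highlights.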
Collapse
Affiliation(s)
- Henning Sprekeler
- Department for Electrical Engineering and Computer Science, Berlin Institute of Technology, and Bernstein Center for Computational Neuroscience, Marchstr. 23, 10587 Berlin, Germany.
| |
Collapse
|
38
|
Lajoie G, Krouchev NI, Kalaska JF, Fairhall AL, Fetz EE. Correlation-based model of artificially induced plasticity in motor cortex by a bidirectional brain-computer interface. PLoS Comput Biol 2017; 13:e1005343. [PMID: 28151957 PMCID: PMC5313237 DOI: 10.1371/journal.pcbi.1005343] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2016] [Revised: 02/16/2017] [Accepted: 01/03/2017] [Indexed: 12/19/2022] Open
Abstract
Experiments show that spike-triggered stimulation performed with Bidirectional Brain-Computer-Interfaces (BBCI) can artificially strengthen connections between separate neural sites in motor cortex (MC). When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between sites eventually strengthen. It was also found that effective spike-stimulus delays are consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules, suggesting that STDP is key to drive these changes. However, the impact of STDP at the level of circuits, and the mechanisms governing its modification with neural implants remain poorly understood. The present work describes a recurrent neural network model with probabilistic spiking mechanisms and plastic synapses capable of capturing both neural and synaptic activity statistics relevant to BBCI conditioning protocols. Our model successfully reproduces key experimental results, both established and new, and offers mechanistic insights into spike-triggered conditioning. Using analytical calculations and numerical simulations, we derive optimal operational regimes for BBCIs, and formulate predictions concerning the efficacy of spike-triggered conditioning in different regimes of cortical activity.
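The core intuition, that spike-triggered stimulation strengthens the recorded-to-target connection when the stimulation delay lands in the potentiating lobe of the STDP window, can be sketched as follows; the window shape and parameters are generic, not the paper's fitted recurrent-network model.

```python
import math

def stdp_window(dt_ms, a_plus=1.0, a_minus=1.1, tau=20.0):
    """Pair-based STDP window, dt_ms = t_post - t_pre (generic values)."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau)
    return -a_minus * math.exp(dt_ms / tau)

def conditioning_drift(delay_ms):
    """Expected drift of the recorded-site -> target-site connection when
    each recorded spike evokes a stimulus (hence a target-site spike)
    delay_ms later: every pairing samples the window at +delay_ms.
    A sketch of the reasoning only, not the full network calculation."""
    return stdp_window(delay_ms)

# Short positive delays strengthen the connection; the effect decays
# as the delay grows beyond the STDP time constant.
```

The effective spike-stimulus delays reported experimentally correspond to delays for which this drift is large and positive.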
Collapse
Affiliation(s)
- Guillaume Lajoie
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
| | | | - John F. Kalaska
- Groupe de recherche sur le système nerveux central, Département de neurosciences, Université de Montreal, Montreal, QC, Canada
| | - Adrienne L. Fairhall
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- Dept. of Physiology and Biophysics, University of Washington, Seattle, WA, USA
- Dept. of Physics, University of Washington, Seattle, WA, USA
| | - Eberhard E. Fetz
- University of Washington Institute for Neuroengineering, University of Washington, Seattle, WA, USA
- Dept. of Physiology and Biophysics, University of Washington, Seattle, WA, USA
| |
Collapse
|
39
|
Pedrosa V, Clopath C. The Role of Neuromodulators in Cortical Plasticity. A Computational Perspective. Front Synaptic Neurosci 2017; 8:38. [PMID: 28119596 PMCID: PMC5222801 DOI: 10.3389/fnsyn.2016.00038] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2016] [Accepted: 12/12/2016] [Indexed: 11/13/2022] Open
Abstract
Neuromodulators play a ubiquitous role across the brain in regulating plasticity. With recent advances in experimental techniques, it is possible to study the effects of diverse neuromodulatory states in specific brain regions. Neuromodulators are thought to impact plasticity predominantly through two mechanisms: the gating of plasticity and the upregulation of neuronal activity. However, the consequences of these mechanisms are poorly understood and there is a need for both experimental and theoretical exploration. Here we illustrate how neuromodulatory state affects cortical plasticity through these two mechanisms. First, we explore the ability of neuromodulators to gate plasticity by reshaping the learning window for spike-timing-dependent plasticity. Using a simple computational model, we implement four different learning rules and demonstrate their effects on receptive field plasticity. We then compare the neuromodulatory effects of upregulating learning rate versus the effects of upregulating neuronal activity. We find that these seemingly similar mechanisms do not yield the same outcome: upregulating neuronal activity can lead to either a broadening or a sharpening of receptive field tuning, whereas upregulating learning rate only intensifies the sharpening of receptive field tuning. This simple model demonstrates the need for further exploration of the rich landscape of neuromodulator-mediated plasticity. Future experiments, coupled with biologically detailed computational models, will elucidate the diversity of mechanisms by which neuromodulatory state regulates cortical plasticity.
Collapse
Affiliation(s)
- Victor Pedrosa
- Department of Bioengineering, Imperial College London, London, UK; CAPES Foundation, Ministry of Education of Brazil, Brasilia, Brazil
| | - Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK
| |
Collapse
|
40
|
Ravid Tannenbaum N, Burak Y. Shaping Neural Circuits by High Order Synaptic Interactions. PLoS Comput Biol 2016; 12:e1005056. [PMID: 27517461 PMCID: PMC4982676 DOI: 10.1371/journal.pcbi.1005056] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2015] [Accepted: 06/30/2016] [Indexed: 11/19/2022] Open
Abstract
Spike timing dependent plasticity (STDP) is believed to play an important role in shaping the structure of neural circuits. Here we show that STDP generates effective interactions between synapses of different neurons, which were neglected in previous theoretical treatments, and can be described as a sum over contributions from structural motifs. These interactions can have a pivotal influence on the connectivity patterns that emerge under the influence of STDP. In particular, we consider two highly ordered forms of structure: wide synfire chains, in which groups of neurons project to each other sequentially, and self-connected assemblies. We show that high-order synaptic interactions can enable the formation of both structures, depending on the form of the STDP function and the time course of synaptic currents. Furthermore, within a certain regime of biophysical parameters, emergence of the ordered connectivity occurs robustly and autonomously in a stochastic network of spiking neurons, without a need to expose the neural network to structured inputs during learning. Plasticity of neural connections plays a key role in our ability to process and store information. One of the fundamental questions about plasticity is the extent to which local processes, affecting individual synapses, are responsible for large-scale structures of neural connectivity. Here we focus on two types of structure: synfire chains and self-connected assemblies. These structures are often proposed as forms of neural connectivity that can support brain functions such as memory and the generation of motor activity. We show that an important plasticity mechanism, spike timing dependent plasticity, can lead to the autonomous emergence of these large-scale structures in the brain: in contrast to previous theoretical proposals, we show that this emergence can occur autonomously even if instructive signals are not fed into the neural network while its form is shaped by synaptic plasticity.
Collapse
Affiliation(s)
- Neta Ravid Tannenbaum
- Edmond and Lily Safra Center for Brain Sciences, Hebrew University, Jerusalem, Israel
| | - Yoram Burak
- Edmond and Lily Safra Center for Brain Sciences, Hebrew University, Jerusalem, Israel
- Racah Institute of Physics, Hebrew University, Jerusalem, Israel
| |
Collapse
|
41
|
Bi Z, Zhou C. Spike Pattern Structure Influences Synaptic Efficacy Variability under STDP and Synaptic Homeostasis. II: Spike Shuffling Methods on LIF Networks. Front Comput Neurosci 2016; 10:83. [PMID: 27555816 PMCID: PMC4977343 DOI: 10.3389/fncom.2016.00083] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/19/2016] [Accepted: 07/25/2016] [Indexed: 12/12/2022] Open
Abstract
Synapses may undergo variable changes during plasticity because of the variability of spike patterns, such as temporal stochasticity and spatial randomness. Here, we refer to the variability of synaptic weight changes during plasticity as efficacy variability. In this paper, we investigate how four aspects of spike pattern statistics (i.e., synchronous firing, burstiness/regularity, heterogeneity of rates and heterogeneity of cross-correlations) influence the efficacy variability under pair-wise additive spike-timing-dependent plasticity (STDP) and synaptic homeostasis (the mean strength of plastic synapses into a neuron is bounded), by applying spike shuffling methods to spike patterns self-organized by a network of excitatory and inhibitory leaky integrate-and-fire (LIF) neurons. As the decay time scale of the inhibitory synaptic currents increases, the LIF network transitions from an asynchronous state to a weakly synchronous state and then to a synchronous bursting state. We first shuffle these spike patterns using a variety of methods, each designed to selectively alter a specific pattern statistic, and then investigate how the efficacy variability of the synapses changes under STDP and synaptic homeostasis when the neurons in the network fire according to the spike patterns before and after shuffling. In this way, we can understand how changes in pattern statistics cause changes in efficacy variability. Our results are consistent with those of our previous study, which implemented spike-generating models on converging motifs. We also find that burstiness/regularity is the main determinant of efficacy variability in asynchronous states, while heterogeneity of cross-correlations is the main cause of efficacy variability when the network moves into synchronous bursting states (the states observed in epilepsy).
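One illustrative spike-shuffling method of the kind described above can be sketched as follows. This is a generic example, not necessarily one of the paper's specific shuffles: it pools all spike times and redistributes them randomly among neurons, preserving each neuron's spike count (and thus its firing rate and the population rate profile) while destroying neuron-specific cross-correlations.

```python
import numpy as np

rng = np.random.default_rng(0)

def shuffle_across_neurons(spike_trains):
    """Pool all spike times and redistribute them randomly among neurons.

    spike_trains: list of 1-D arrays of spike times, one per neuron.
    Each neuron keeps its spike count, so individual firing rates and the
    population rate profile are preserved, but which neuron emitted which
    spike is randomized, destroying pairwise cross-correlations.
    """
    counts = [len(s) for s in spike_trains]
    pooled = np.concatenate(spike_trains)
    rng.shuffle(pooled)                      # in-place random permutation
    shuffled, i = [], 0
    for c in counts:
        shuffled.append(np.sort(pooled[i:i + c]))
        i += c
    return shuffled
```

Comparing efficacy variability before and after such a shuffle isolates the contribution of the statistic the shuffle destroys.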
Collapse
Affiliation(s)
- Zedong Bi
- State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, Beijing, China; Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
| | - Changsong Zhou
- Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Beijing Computational Science Research Center, Beijing, China; Research Centre, Hong Kong Baptist University Institute of Research and Continuing Education, Shenzhen, China
| |
Collapse
|
42
|
Bi Z, Zhou C. Spike Pattern Structure Influences Synaptic Efficacy Variability under STDP and Synaptic Homeostasis. I: Spike Generating Models on Converging Motifs. Front Comput Neurosci 2016; 10:14. [PMID: 26941634 PMCID: PMC4763167 DOI: 10.3389/fncom.2016.00014] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2015] [Accepted: 02/01/2016] [Indexed: 11/26/2022] Open
Abstract
In neural systems, synaptic plasticity is usually driven by spike trains. Due to the inherent noise of neurons and synapses as well as the randomness of connection details, spike trains typically exhibit variability such as spatial randomness and temporal stochasticity, resulting in variability of synaptic changes under plasticity, which we call efficacy variability. How the variability of spike trains influences the efficacy variability of synapses remains unclear. In this paper, we try to understand this influence under pair-wise additive spike-timing-dependent plasticity (STDP) when the mean strength of plastic synapses into a neuron is bounded (synaptic homeostasis). Specifically, we systematically study, analytically and numerically, how four aspects of statistical features, i.e., synchronous firing, burstiness/regularity, heterogeneity of rates and heterogeneity of cross-correlations, as well as their interactions, influence the efficacy variability in converging motifs (simple networks in which one neuron receives from many other neurons). Neurons (including the post-synaptic neuron) in a converging motif generate spikes according to statistical models with tunable parameters. In this way, we can explicitly control the statistics of the spike patterns and investigate their influence on the efficacy variability, without worrying about the feedback from synaptic changes onto the dynamics of the post-synaptic neuron. We separate efficacy variability into two parts: the drift part (DriftV), induced by the heterogeneity of change rates of different synapses, and the diffusion part (DiffV), induced by weight diffusion caused by the stochasticity of spike trains. Our main findings are: (1) synchronous firing and burstiness tend to increase DiffV, (2) heterogeneity of rates induces DriftV when potentiation and depression in STDP are not balanced, and (3) heterogeneity of cross-correlations induces DriftV together with heterogeneity of rates. We anticipate that our work will be important for understanding functional processes of neuronal networks (such as memory) and neural development.
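The DriftV/DiffV decomposition and the bounded-mean homeostasis constraint described above can be sketched as follows. This is an illustrative reading, not the paper's exact definitions: the function names are hypothetical, homeostasis is modeled here as simple subtractive normalization, and the decomposition treats per-synapse mean changes as drift and trial-to-trial variance as diffusion.

```python
import numpy as np

def apply_homeostasis(w, target_mean):
    """Subtractive normalization keeping the mean incoming weight fixed.

    One common way to model the bounded-mean constraint (synaptic
    homeostasis); the paper's exact implementation may differ.
    """
    return w - (w.mean() - target_mean)

def efficacy_variability(dw_trials):
    """Split efficacy variability into drift and diffusion parts.

    dw_trials: array of shape (n_trials, n_synapses) of weight changes.
    DriftV: variance across synapses of the per-synapse mean change
            (heterogeneous drift rates).
    DiffV:  mean across synapses of the trial-to-trial variance
            (diffusion from spike-train stochasticity).
    """
    mean_dw = dw_trials.mean(axis=0)        # per-synapse mean change
    drift_v = mean_dw.var()                 # spread of drift rates
    diff_v = dw_trials.var(axis=0).mean()   # average diffusion
    return drift_v, diff_v
```

Under this split, synchrony and burstiness would show up in DiffV, while rate and cross-correlation heterogeneity would show up in DriftV, matching findings (1)-(3) above.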
Collapse
Affiliation(s)
- Zedong Bi
- State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Chinese Academy of Sciences, Beijing, China; Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
| | - Changsong Zhou
- Department of Physics, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong; Beijing Computational Science Research Center, Beijing, China; Research Centre, HKBU Institute of Research and Continuing Education, Shenzhen, China
| |
Collapse
|