1
Andrade-Talavera Y, Fisahn A, Rodríguez-Moreno A. Timing to be precise? An overview of spike timing-dependent plasticity, brain rhythmicity, and glial cells interplay within neuronal circuits. Mol Psychiatry 2023; 28:2177-2188. PMID: 36991134; PMCID: PMC10611582; DOI: 10.1038/s41380-023-02027-w.
Abstract
In the mammalian brain, information processing and storage rely on the complex coding and decoding events performed by neuronal networks. These actions are based on the computational ability of neurons and their functional engagement in neuronal assemblies, where precise timing of action potential firing is crucial. Neuronal circuits manage a myriad of spatially and temporally overlapping inputs to compute specific outputs that are proposed to underlie the formation of memory traces, sensory perception, and cognitive behaviors. Spike-timing-dependent plasticity (STDP) and electrical brain rhythms are suggested to underlie such functions, although physiological evidence for assembly structures and for the mechanisms driving both processes remains scarce. Here, we review foundational and current evidence on timing precision and cooperative neuronal electrical activity driving STDP and brain rhythms, their interactions, and the emerging role of glial cells in such processes. We also provide an overview of their cognitive correlates and discuss current limitations and controversies, future perspectives on experimental approaches, and their application in humans.
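The pair-based STDP window at the heart of this literature can be made concrete with a short sketch; the amplitudes and time constants below are illustrative choices, not values taken from the review:

```python
import numpy as np

def stdp_weight_change(delta_t, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: weight change as a function of
    delta_t = t_post - t_pre (ms). Pre-before-post (delta_t >= 0)
    potentiates; post-before-pre (delta_t < 0) depresses."""
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t >= 0,
                    a_plus * np.exp(-delta_t / tau_plus),
                    -a_minus * np.exp(delta_t / tau_minus))

# Example: a presynaptic spike 10 ms before a postsynaptic spike
dw = stdp_weight_change(10.0)
```

The exponential decay on both sides captures the "timing to be precise" point: only spike pairs within a few tens of milliseconds of each other change the synapse appreciably.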
Affiliation(s)
- Yuniesky Andrade-Talavera
- Laboratory of Cellular Neuroscience and Plasticity, Department of Physiology, Anatomy and Cell Biology, Universidad Pablo de Olavide, ES-41013, Seville, Spain
- André Fisahn
- Department of Biosciences and Nutrition and Department of Women's and Children's Health, Karolinska Institutet, 171 77, Stockholm, Sweden
- Antonio Rodríguez-Moreno
- Laboratory of Cellular Neuroscience and Plasticity, Department of Physiology, Anatomy and Cell Biology, Universidad Pablo de Olavide, ES-41013, Seville, Spain
2
Wu YK, Miehl C, Gjorgjieva J. Regulation of circuit organization and function through inhibitory synaptic plasticity. Trends Neurosci 2022; 45:884-898. PMID: 36404455; DOI: 10.1016/j.tins.2022.10.006.
Abstract
Diverse inhibitory neurons in the mammalian brain shape circuit connectivity and dynamics through mechanisms of synaptic plasticity. Inhibitory plasticity can establish excitation/inhibition (E/I) balance, control neuronal firing, and affect local calcium concentration, hence regulating neuronal activity at the network, single-neuron, and dendritic levels. By identifying phenomenological learning rules amenable to mathematical analysis, computational models can synthesize multiple experimental results and provide insight into how inhibitory plasticity controls circuit dynamics and sculpts connectivity. We highlight recent studies on the role of inhibitory plasticity in modulating excitatory plasticity, forming structured networks underlying memory formation and recall, and implementing adaptive phenomena and novelty detection. We conclude with experimental and modeling progress on the role of interneuron-specific plasticity in circuit computation and context-dependent learning.
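The phenomenological learning rules mentioned above can be illustrated with a rate-based toy model of homeostatic inhibitory plasticity (in the spirit of Vogels-style rules); the neuron model, target rate, and learning rate here are illustrative assumptions:

```python
def simulate_ei_balance(r_target=5.0, eta=0.05, steps=2000,
                        excitation=12.0):
    """Rate-based sketch of homeostatic inhibitory plasticity:
    the inhibitory weight w_i grows when the postsynaptic rate
    exceeds the target and shrinks when it falls below, driving
    the neuron toward r_target and establishing E/I balance."""
    w_i = 0.0
    rate = excitation
    for _ in range(steps):
        rate = max(excitation - w_i, 0.0)   # threshold-linear neuron
        w_i += eta * (rate - r_target)      # inhibitory plasticity step
        w_i = max(w_i, 0.0)                 # weights stay non-negative
    return rate, w_i
```

At the fixed point the inhibitory weight exactly cancels the excess excitatory drive, which is the E/I-balance outcome the abstract describes.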
Affiliation(s)
- Yue Kris Wu
- School of Life Sciences, Technical University of Munich, Freising, Germany; Max Planck Institute for Brain Research, Frankfurt, Germany
- Christoph Miehl
- School of Life Sciences, Technical University of Munich, Freising, Germany; Max Planck Institute for Brain Research, Frankfurt, Germany
- Julijana Gjorgjieva
- School of Life Sciences, Technical University of Munich, Freising, Germany; Max Planck Institute for Brain Research, Frankfurt, Germany
3
Miehl C, Onasch S, Festa D, Gjorgjieva J. Formation and computational implications of assemblies in neural circuits. J Physiol 2022. PMID: 36068723; DOI: 10.1113/jp282750.
Abstract
In the brain, patterns of neural activity represent sensory information and store it in non-random synaptic connectivity. A prominent theoretical hypothesis states that assemblies, groups of neurons that are strongly connected to each other, are the key computational units underlying perception and memory formation. Compatible with these hypothesised assemblies, experiments have revealed groups of neurons that display synchronous activity, either spontaneously or upon stimulus presentation, and exhibit behavioural relevance. While it remains unclear how assemblies form in the brain, theoretical work has vastly contributed to the understanding of various interacting mechanisms in this process. Here, we review the recent theoretical literature on assembly formation by categorising the involved mechanisms into four components: synaptic plasticity, symmetry breaking, competition and stability. We highlight different approaches and assumptions behind assembly formation and discuss recent ideas of assemblies as the key computational unit in the brain. Abstract figure legend: Assembly formation. Assemblies are groups of strongly connected neurons formed by the interaction of multiple mechanisms and with vast computational implications. Four interacting components are thought to drive assembly formation: synaptic plasticity, symmetry breaking, competition and stability.
Affiliation(s)
- Christoph Miehl
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Sebastian Onasch
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Dylan Festa
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits, Max Planck Institute for Brain Research, 60438, Frankfurt, Germany; School of Life Sciences, Technical University of Munich, 85354, Freising, Germany
4
Gallinaro JV, Gašparović N, Rotter S. Homeostatic control of synaptic rewiring in recurrent networks induces the formation of stable memory engrams. PLoS Comput Biol 2022; 18:e1009836. PMID: 35143489; PMCID: PMC8865699; DOI: 10.1371/journal.pcbi.1009836.
Abstract
Brain networks store new memories using functional and structural synaptic plasticity. Memory formation is generally attributed to Hebbian plasticity, while homeostatic plasticity is thought to have an ancillary role in stabilizing network dynamics. Here we report that homeostatic plasticity alone can also lead to the formation of stable memories. We analyze this phenomenon using a new theory of network remodeling, combined with numerical simulations of recurrent spiking neural networks that exhibit structural plasticity based on firing rate homeostasis. These networks are able to store repeatedly presented patterns and recall them upon the presentation of incomplete cues. Storage is fast, governed by the homeostatic drift. In contrast, forgetting is slow, driven by a diffusion process. Joint stimulation of neurons induces the growth of associative connections between them, leading to the formation of memory engrams. These memories are stored in a distributed fashion throughout the connectivity matrix, and individual synaptic connections have only a small influence. Although memory-specific connections increase in number, the total number of inputs and outputs of each neuron undergoes only small changes during stimulation. We find that homeostatic structural plasticity induces a specific type of "silent memories", different from conventional attractor states.
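The firing-rate-homeostasis rule behind this kind of structural plasticity can be sketched schematically; the update below is a simplified caricature of such models (each neuron grows synaptic elements while below its set point and retracts them while above it), with illustrative parameter values, and it omits the combinatorial rewiring step that pairs up free elements:

```python
import numpy as np

def grow_or_prune(rates, elements, r_target=8.0, eta=0.1):
    """Structural-plasticity sketch: each neuron grows synaptic
    elements when firing below its homeostatic set point and
    retracts them when firing above it. Neurons that accumulate
    free elements at the same time then tend to wire together,
    which is how joint stimulation builds associative engrams."""
    elements = elements + eta * (r_target - np.asarray(rates, float))
    return np.maximum(elements, 0.0)   # element counts are non-negative

# Two neurons held below target accumulate free elements;
# two held above target do not:
elements = np.zeros(4)
for _ in range(50):
    elements = grow_or_prune([2.0, 2.0, 10.0, 10.0], elements)
```
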
Affiliation(s)
- Júlia V. Gallinaro
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
- Bioengineering Department, Imperial College London, London, United Kingdom
- Nebojša Gašparović
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
- Stefan Rotter
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Freiburg im Breisgau, Germany
5
Drifting assemblies for persistent memory: Neuron transitions and unsupervised compensation. Proc Natl Acad Sci U S A 2021; 118:e2023832118. PMID: 34772802; DOI: 10.1073/pnas.2023832118.
Abstract
Change is ubiquitous in living beings. In particular, the connectome and neural representations can change. Nevertheless, behaviors and memories often persist over long times. In a standard model, associative memories are represented by assemblies of strongly interconnected neurons. For faithful storage these assemblies are assumed to consist of the same neurons over time. Here we propose a contrasting memory model with complete temporal remodeling of assemblies, based on experimentally observed changes of synapses and neural representations. The assemblies drift freely as noisy autonomous network activity and spontaneous synaptic turnover induce neuron exchange. The gradual exchange allows activity-dependent and homeostatic plasticity to conserve the representational structure and keep inputs, outputs, and assemblies consistent. This leads to persistent memory. Our findings explain recent experimental results on temporal evolution of fear memory representations and suggest that memory systems need to be understood in their completeness as individual parts may constantly change.
6
Gallinaro JV, Clopath C. Memories in a network with excitatory and inhibitory plasticity are encoded in the spiking irregularity. PLoS Comput Biol 2021; 17:e1009593. PMID: 34762644; PMCID: PMC8610285; DOI: 10.1371/journal.pcbi.1009593.
Abstract
Cell assemblies are thought to be the substrate of memory in the brain. Theoretical studies have previously shown that assemblies can form in networks with multiple types of plasticity, but how exactly they form and how they encode information is yet to be fully understood. One possibility is that memories are stored in silent assemblies. Here we used a computational model to study the formation of silent assemblies in a network of spiking neurons with excitatory and inhibitory plasticity. We found that even though the formed assemblies were silent in terms of mean firing rate, they had an increased coefficient of variation of inter-spike intervals. We also found that this spiking irregularity could be read out with the support of short-term plasticity, and that it could contribute to the longevity of memories.
Affiliation(s)
- Júlia V. Gallinaro
- Bioengineering Department, Imperial College London, London, United Kingdom
- Claudia Clopath
- Bioengineering Department, Imperial College London, London, United Kingdom
7
Huang C. Modulation of the dynamical state in cortical network models. Curr Opin Neurobiol 2021; 70:43-50. PMID: 34403890; PMCID: PMC8688204; DOI: 10.1016/j.conb.2021.07.004.
Abstract
Cortical neural responses can be modulated by various factors, such as stimulus inputs and the behavioral state of the animal. Understanding the circuit mechanisms underlying these modulations of network dynamics is important for explaining the flexibility of circuit computations. Identifying the dynamical state of a network is an important first step in predicting network responses to external stimuli and top-down modulatory inputs. Models in stable or unstable dynamical regimes require different analytic tools to estimate the network responses to inputs and the structure of neural variability. In this article, I review recent cortical models of state-dependent responses and their predictions about the underlying modulatory mechanisms.
Affiliation(s)
- Chengcheng Huang
- Departments of Neuroscience and Mathematics, University of Pittsburgh, Pittsburgh, PA, USA; Center for the Neural Basis of Cognition, Pittsburgh, PA, USA
8
Wu YK, Hengen KB, Turrigiano GG, Gjorgjieva J. Homeostatic mechanisms regulate distinct aspects of cortical circuit dynamics. Proc Natl Acad Sci U S A 2020; 117:24514-24525. PMID: 32917810; PMCID: PMC7533694; DOI: 10.1073/pnas.1918368117.
Abstract
Homeostasis is indispensable to counteract the destabilizing effects of Hebbian plasticity. Although it is commonly assumed that homeostasis modulates synaptic strength, membrane excitability, and firing rates, its role at the neural circuit and network level is unknown. Here, we identify changes in higher-order network properties of freely behaving rodents during prolonged visual deprivation. Strikingly, our data reveal that functional pairwise correlations and their structure are subject to homeostatic regulation. Using a computational model, we demonstrate that the interplay of different plasticity and homeostatic mechanisms can capture the initial drop and delayed recovery of firing rates and correlations observed experimentally. Moreover, our model indicates that synaptic scaling is crucial for the recovery of correlations and network structure, while intrinsic plasticity is essential for the rebound of firing rates, suggesting that synaptic scaling and intrinsic plasticity can serve distinct functions in homeostatically regulating network dynamics.
Affiliation(s)
- Yue Kris Wu
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- Keith B Hengen
- Department of Biology, Brandeis University, Waltham, MA 02454
- Department of Biology, Washington University in St. Louis, St. Louis, MO 63130
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
9
Iyer R, Hu B, Mihalas S. Contextual Integration in Cortical and Convolutional Neural Networks. Front Comput Neurosci 2020; 14:31. PMID: 32390818; PMCID: PMC7192314; DOI: 10.3389/fncom.2020.00031.
Abstract
It has been suggested that neurons can represent sensory input using probability distributions and neural circuits can perform probabilistic inference. Lateral connections between neurons have been shown to have non-random connectivity and modulate responses to stimuli within the classical receptive field. Large-scale efforts mapping local cortical connectivity describe cell type specific connections from inhibitory neurons and like-to-like connectivity between excitatory neurons. To relate the observed connectivity to computations, we propose a neuronal network model that approximates Bayesian inference of the probability of different features being present at different image locations. We show that the lateral connections between excitatory neurons in a circuit implementing contextual integration should depend on correlations between unit activities, minus a global inhibitory drive. The model naturally suggests the need for two types of inhibitory gates (normalization, surround inhibition). First, using natural scene statistics and classical receptive fields corresponding to simple cells parameterized with data from mouse primary visual cortex, we show that the predicted connectivity qualitatively matches that measured in mouse cortex: neurons with similar orientation tuning have stronger connectivity, and both excitatory and inhibitory connectivity have a modest spatial extent, comparable to that observed in mouse visual cortex. We incorporate lateral connections learned using this model into convolutional neural networks. Features are defined by supervised learning on the task, and the lateral connections provide an unsupervised learning of feature context in multiple layers.
Since the lateral connections provide contextual information when the feedforward input is locally corrupted, we show that incorporating such lateral connections into convolutional neural networks makes them more robust to noise and leads to better performance on noisy versions of the MNIST dataset. Decomposing the predicted lateral connectivity matrices into low-rank and sparse components introduces additional cell types into these networks. We explore effects of cell-type specific perturbations on network computation. Our framework can potentially be applied to networks trained on other tasks, with the learned lateral connections aiding computations implemented by feedforward connections when the input is unreliable, demonstrating the potential usefulness of combining supervised and unsupervised learning techniques in real-world vision tasks.
Affiliation(s)
- Ramakrishnan Iyer
- Modeling and Theory, Allen Institute for Brain Science, Seattle, WA, United States
- Brian Hu
- Modeling and Theory, Allen Institute for Brain Science, Seattle, WA, United States
- Stefan Mihalas
- Modeling and Theory, Allen Institute for Brain Science, Seattle, WA, United States
10
Montangie L, Miehl C, Gjorgjieva J. Autonomous emergence of connectivity assemblies via spike triplet interactions. PLoS Comput Biol 2020; 16:e1007835. PMID: 32384081; PMCID: PMC7239496; DOI: 10.1371/journal.pcbi.1007835.
Abstract
Non-random connectivity can emerge without structured external input driven by activity-dependent mechanisms of synaptic plasticity based on precise spiking patterns. Here we analyze the emergence of global structures in recurrent networks based on a triplet model of spike timing dependent plasticity (STDP), which depends on the interactions of three precisely-timed spikes, and can describe plasticity experiments with varying spike frequency better than the classical pair-based STDP rule. We derive synaptic changes arising from correlations up to third-order and describe them as the sum of structural motifs, which determine how any spike in the network influences a given synaptic connection through possible connectivity paths. This motif expansion framework reveals novel structural motifs under the triplet STDP rule, which support the formation of bidirectional connections and ultimately the spontaneous emergence of global network structure in the form of self-connected groups of neurons, or assemblies. We propose that under triplet STDP assembly structure can emerge without the need for externally patterned inputs or assuming a symmetric pair-based STDP rule common in previous studies. The emergence of non-random network structure under triplet STDP occurs through internally-generated higher-order correlations, which are ubiquitous in natural stimuli and neuronal spiking activity, and important for coding. We further demonstrate how neuromodulatory mechanisms that modulate the shape of the triplet STDP rule or the synaptic transmission function differentially promote structural motifs underlying the emergence of assemblies, and quantify the differences using graph theoretic measures. Emergent non-random connectivity structures in different brain regions are tightly related to specific patterns of neural activity and support diverse brain functions. 
For instance, self-connected groups of neurons, known as assemblies, have been proposed to represent functional units in brain circuits and can emerge even without patterned external instruction. Here we investigate the emergence of non-random connectivity in recurrent networks using a particular plasticity rule, triplet STDP, which relies on the interaction of spike triplets and can capture higher-order statistical dependencies in neural activity. We derive the evolution of the synaptic strengths in the network and explore the conditions for the self-organization of connectivity into assemblies. We demonstrate key differences of the triplet STDP rule compared to the classical pair-based rule in terms of how assemblies are formed, including the realistic asymmetric shape and influence of novel connectivity motifs on network plasticity driven by higher-order correlations. Assembly formation depends on the specific shape of the STDP window and synaptic transmission function, pointing towards an important role of neuromodulatory signals on formation of intrinsically generated assemblies.
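A triplet STDP rule of the kind analyzed here can be sketched in event-driven form (a Pfister & Gerstner-style formulation with three synaptic traces); the amplitudes and time constants below are illustrative, not the paper's fitted values:

```python
import numpy as np

def triplet_stdp(pre_times, post_times, a2p=0.005, a2m=0.007, a3p=0.006,
                 tau_plus=16.8, tau_minus=33.7, tau_y=114.0):
    """Event-driven triplet STDP sketch. A post spike potentiates
    via the presynaptic trace r1, boosted by the slow postsynaptic
    trace o2 (the triplet term); a pre spike depresses via the
    fast postsynaptic trace o1 (the pair term)."""
    events = sorted([(t, 'pre') for t in pre_times] +
                    [(t, 'post') for t in post_times])
    r1 = o1 = o2 = 0.0      # synaptic traces
    t_last, w = 0.0, 0.0
    for t, kind in events:
        dt = t - t_last
        r1 *= np.exp(-dt / tau_plus)
        o1 *= np.exp(-dt / tau_minus)
        o2 *= np.exp(-dt / tau_y)
        if kind == 'pre':
            w -= a2m * o1                    # pair depression
            r1 += 1.0
        else:
            w += r1 * (a2p + a3p * o2)       # pair + triplet potentiation
            o2 += 1.0
            o1 += 1.0
        t_last = t
    return w
```

Because the triplet term reads the slow trace o2 left by earlier post spikes, potentiation grows with postsynaptic firing frequency, which is the frequency dependence that the pair-based rule misses.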
Affiliation(s)
- Lisandro Montangie
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- Christoph Miehl
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, Freising, Germany
11
Hasselmo ME, Alexander AS, Hoyland A, Robinson JC, Bezaire MJ, Chapman GW, Saudargiene A, Carstensen LC, Dannenberg H. The Unexplored Territory of Neural Models: Potential Guides for Exploring the Function of Metabotropic Neuromodulation. Neuroscience 2020; 456:143-158. PMID: 32278058; DOI: 10.1016/j.neuroscience.2020.03.048.
Abstract
The space of possible neural models is enormous and under-explored. Single cell computational neuroscience models account for a range of dynamical properties of membrane potential, but typically do not address network function. In contrast, most models focused on network function address the dimensions of excitatory weight matrices and firing thresholds without addressing the complexities of metabotropic receptor effects on intrinsic properties. There are many under-explored dimensions of neural parameter space, and the field needs a framework for representing what has been explored and what has not. Possible frameworks include maps of parameter spaces, or efforts to categorize the fundamental elements and molecules of neural circuit function. Here we review dimensions that are under-explored in network models that include the metabotropic modulation of synaptic plasticity and presynaptic inhibition, spike frequency adaptation due to calcium-dependent potassium currents, and afterdepolarization due to calcium-sensitive non-specific cation currents and hyperpolarization activated cation currents. Neuroscience research should more effectively explore possible functional models incorporating under-explored dimensions of neural function.
Affiliation(s)
- Michael E Hasselmo, Andrew S Alexander, Alec Hoyland, Jennifer C Robinson, Marianne J Bezaire, G William Chapman, Ausra Saudargiene, Lucas C Carstensen, Holger Dannenberg
- Center for Systems Neuroscience, Department of Psychological and Brain Sciences, Boston University, 610 Commonwealth Ave., Boston, MA 02215, United States (all authors)
12
Shamir M. Theories of rhythmogenesis. Curr Opin Neurobiol 2019; 58:70-77. PMID: 31408837; DOI: 10.1016/j.conb.2019.07.005.
Abstract
Rhythmogenesis is the process that develops the capacity for rhythmic activity in a non-rhythmic system. Theoretical work has suggested a wide array of possible mechanisms for rhythmogenesis, ranging from the regulation of cellular properties to top-down control. Here we discuss theories of rhythmogenesis with an emphasis on spike timing-dependent plasticity. We argue that even though the specifics of different mechanisms vary greatly, they all share certain key features. Namely, rhythmogenesis can be described as a flow on the phase diagram that leads the system into a rhythmic region and stabilizes it on a specific manifold characterized by the desired rhythmic activity. Functionality is retained despite biological diversity by forcing the system onto a specific manifold while allowing fluctuations within that manifold.
Affiliation(s)
- Maoz Shamir
- Department of Physiology and Cell Biology, Faculty of Health Sciences, Department of Physics, Faculty of Natural Sciences, Zlotowski Center for Neuroscience, Ben-Gurion University of the Negev, Be'er-Sheva, Israel; The Kavli Institute for Theoretical Physics, University of California, Santa Barbara, USA
13
Curto C, Morrison K. Relating network connectivity to dynamics: opportunities and challenges for theoretical neuroscience. Curr Opin Neurobiol 2019; 58:11-20. PMID: 31319287; DOI: 10.1016/j.conb.2019.06.003.
Abstract
We review recent work relating network connectivity to the dynamics of neural activity. While concepts stemming from network science provide a valuable starting point, the interpretation of graph-theoretic structures and measures can be highly dependent on the dynamics associated with the network. Properties that are quite meaningful for linear dynamics, such as random walk and network flow models, may be of limited relevance in the neuroscience setting. Theoretical and computational neuroscience are playing a vital role in understanding the relationship between network connectivity and the nonlinear dynamics associated with neural networks.
Affiliation(s)
- Carina Curto
- The Pennsylvania State University, PA 16802, United States
- Katherine Morrison
- School of Mathematical Sciences, University of Northern Colorado, Greeley, CO 80639, United States
14
Recanatesi S, Ocker GK, Buice MA, Shea-Brown E. Dimensionality in recurrent spiking networks: Global trends in activity and local origins in connectivity. PLoS Comput Biol 2019; 15:e1006446. PMID: 31299044; PMCID: PMC6655892; DOI: 10.1371/journal.pcbi.1006446.
Abstract
The dimensionality of a network's collective activity is of increasing interest in neuroscience. This is because dimensionality provides a compact measure of how coordinated network-wide activity is, in terms of the number of modes (or degrees of freedom) that it can independently explore. A low number of modes suggests a compressed low dimensional neural code and reveals interpretable dynamics [1], while findings of high dimension may suggest flexible computations [2, 3]. Here, we address the fundamental question of how dimensionality is related to connectivity, in both autonomous and stimulus-driven networks. Working with a simple spiking network model, we derive three main findings. First, the dimensionality of global activity patterns can be strongly, and systematically, regulated by local connectivity structures. Second, the dimensionality is a better indicator than average correlations in determining how constrained neural activity is. Third, stimulus evoked neural activity interacts systematically with neural connectivity patterns, leading to network responses of either greater or lesser dimensionality than the stimulus.
Collapse
Affiliation(s)
- Stefano Recanatesi
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Michael A. Buice
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Eric Shea-Brown
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
15
Triplett MA, Avitan L, Goodhill GJ. Emergence of spontaneous assembly activity in developing neural networks without afferent input. PLoS Comput Biol 2018; 14:e1006421. [PMID: 30265665 PMCID: PMC6161857 DOI: 10.1371/journal.pcbi.1006421] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2018] [Accepted: 08/07/2018] [Indexed: 02/04/2023] Open
Abstract
Spontaneous activity is a fundamental characteristic of the developing nervous system. Intriguingly, it often takes the form of multiple structured assemblies of neurons. Such assemblies can form even in the absence of afferent input, for instance in the zebrafish optic tectum after bilateral enucleation early in life. While the development of neural assemblies based on structured afferent input has been well studied theoretically, it is less clear how they could arise in systems without afferent input. Here we show that a recurrent network of binary threshold neurons with initially random weights can form neural assemblies based on a simple Hebbian learning rule. Over development, the network becomes increasingly modular while being driven by initially unstructured spontaneous activity, leading to the emergence of neural assemblies. Surprisingly, the set of neurons making up each assembly then continues to evolve, despite the number of assemblies remaining roughly constant. In the mature network, assembly activity builds over several timesteps before the activation of the full assembly, as recently observed in calcium-imaging experiments. Our results show that Hebbian learning is sufficient to explain the emergence of highly structured patterns of neural activity in the absence of structured input.
Affiliation(s)
- Marcus A. Triplett
- Queensland Brain Institute, University of Queensland, St Lucia, Queensland, Australia
- School of Mathematics and Physics, University of Queensland, St Lucia, Queensland, Australia
- Lilach Avitan
- Queensland Brain Institute, University of Queensland, St Lucia, Queensland, Australia
- Geoffrey J. Goodhill
- Queensland Brain Institute, University of Queensland, St Lucia, Queensland, Australia
- School of Mathematics and Physics, University of Queensland, St Lucia, Queensland, Australia
16
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We conclude by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
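The link from connectivity statistics to spiking statistics is often made through a linear-response (Hawkes-style) approximation, in which integrated spike-count covariances take the form C = (I − W)⁻¹ Y (I − Wᵀ)⁻¹ with Y a diagonal matrix of baseline rates; expanding (I − W)⁻¹ = I + W + W² + … exposes the contributions of connectivity motifs of increasing path length. A minimal sketch under that linearization, with an arbitrary random connectivity (all parameters illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 30
# Hypothetical sparse random connectivity, weakly weighted so the
# spectral radius stays below 1 and the linear-response series converges.
W = (rng.random((n, n)) < 0.2) * 0.05
np.fill_diagonal(W, 0.0)

rates = np.full(n, 5.0)                 # baseline rates (spikes/s), illustrative
Y = np.diag(rates)

# Full linear-response prediction of integrated spike-count covariances:
# C = (I - W)^-1 Y (I - W^T)^-1
B = np.linalg.inv(np.eye(n) - W)
C_full = B @ Y @ B.T

# Truncating the series at first order keeps only direct connections;
# the difference is the contribution of longer motifs (chains, etc.).
B1 = np.eye(n) + W
C_first_order = B1 @ Y @ B1.T

print(np.abs(C_full - C_first_order).max())  # higher-order motif contribution
```

Truncating the series at successive orders is one way to make precise the review's first principle: low-order local motif statistics already capture much of the global covariance structure.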
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.