1
Pagkalos M, Makarov R, Poirazi P. Leveraging dendritic properties to advance machine learning and neuro-inspired computing. Curr Opin Neurobiol 2024; 85:102853. [PMID: 38394956] [DOI: 10.1016/j.conb.2024.102853]
Abstract
The brain is a remarkably capable and efficient system. It can process and store huge amounts of noisy and unstructured information using minimal energy. In contrast, current artificial intelligence (AI) systems require vast resources for training while still struggling to compete in tasks that are trivial for biological agents. Thus, brain-inspired engineering has emerged as a promising new avenue for designing sustainable, next-generation AI systems. Here, we describe how dendritic mechanisms of biological neurons have inspired innovative solutions for significant AI problems, including credit assignment in multi-layer networks, catastrophic forgetting, and high power consumption. These findings provide exciting alternatives to existing architectures, showing how dendritic research can pave the way for building more powerful and energy-efficient artificial learning systems.
Affiliation(s)
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/MPagkalos
- Roman Makarov
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/_RomanMakarov
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece.
2
Fitz H, Hagoort P, Petersson KM. Neurobiological Causal Models of Language Processing. Neurobiology of Language (Cambridge, Mass.) 2024; 5:225-247. [PMID: 38645618] [PMCID: PMC11025648] [DOI: 10.1162/nol_a_00133]
Abstract
The language faculty is physically realized in the neurobiological infrastructure of the human brain. Despite significant efforts, an integrated understanding of this system remains a formidable challenge. What is missing from most theoretical accounts is a specification of the neural mechanisms that implement language function. Computational models that have been put forward generally lack an explicit neurobiological foundation. We propose a neurobiologically informed causal modeling approach which offers a framework for how to bridge this gap. A neurobiological causal model is a mechanistic description of language processing that is grounded in, and constrained by, the characteristics of the neurobiological substrate. It intends to model the generators of language behavior at the level of implementational causality. We describe key features and neurobiological component parts from which causal models can be built and provide guidelines on how to implement them in model simulations. Then we outline how this approach can shed new light on the core computational machinery for language, the long-term storage of words in the mental lexicon and combinatorial processing in sentence comprehension. In contrast to cognitive theories of behavior, causal models are formulated in the "machine language" of neurobiology which is universal to human cognition. We argue that neurobiological causal modeling should be pursued in addition to existing approaches. Eventually, this approach will allow us to develop an explicit computational neurobiology of language.
Affiliation(s)
- Hartmut Fitz
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Peter Hagoort
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Karl Magnus Petersson
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Faculty of Medicine and Biomedical Sciences, University of Algarve, Faro, Portugal
3
Friedenberger Z, Harkin E, Tóth K, Naud R. Silences, spikes and bursts: Three-part knot of the neural code. J Physiol 2023; 601:5165-5193. [PMID: 37889516] [DOI: 10.1113/jp281510]
Abstract
When a neuron breaks silence, it can emit action potentials in a number of patterns. Some responses are so sudden and intense that electrophysiologists felt the need to single them out, labelling action potentials emitted at a particularly high frequency with a metonym - bursts. Is there more to bursts than a figure of speech? After all, sudden bouts of high-frequency firing are expected to occur whenever inputs surge. The burst coding hypothesis advances that the neural code has three syllables: silences, spikes and bursts. We review evidence supporting this ternary code in terms of devoted mechanisms for burst generation, synaptic transmission and synaptic plasticity. We also review the learning and attention theories for which such a triad is beneficial.
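As a toy illustration of the ternary code described in this abstract, spike times can be segmented into silences, isolated spikes, and bursts with a simple inter-spike-interval criterion. The 10 ms cutoff and the code below are illustrative assumptions, not the authors' method:

```python
def segment_spike_train(spike_times_ms, isi_threshold_ms=10.0):
    """Group spikes into events; events with more than one spike are bursts.

    Gaps longer than the ISI threshold are the 'silences' of the ternary code.
    """
    events = []
    for t in spike_times_ms:
        if events and t - events[-1][-1] <= isi_threshold_ms:
            events[-1].append(t)   # continue the current event (a burst)
        else:
            events.append([t])     # a long silence ended; start a new event
    return [("burst" if len(e) > 1 else "spike", e) for e in events]

print(segment_spike_train([5.0, 7.0, 9.0, 120.0, 300.0, 303.0]))
```

Under this criterion the same mean firing rate can decompose very differently into spikes and bursts, which is what makes the third "syllable" informative.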
Affiliation(s)
- Zachary Friedenberger
- Brain and Mind Institute, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Centre for Neural Dynamics and Artificial Intelligence, Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
- Emerson Harkin
- Brain and Mind Institute, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Katalin Tóth
- Brain and Mind Institute, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Richard Naud
- Brain and Mind Institute, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Centre for Neural Dynamics and Artificial Intelligence, Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
4
Makarov R, Pagkalos M, Poirazi P. Dendrites and efficiency: Optimizing performance and resource utilization. Curr Opin Neurobiol 2023; 83:102812. [PMID: 37980803] [DOI: 10.1016/j.conb.2023.102812]
Abstract
The brain is a highly efficient system that has evolved to optimize performance under limited resources. In this review, we highlight recent theoretical and experimental studies that support the view that dendrites make information processing and storage in the brain more efficient. This is achieved through the dynamic modulation of integration versus segregation of inputs and activity within a neuron. We argue that under conditions of limited energy and space, dendrites help biological networks to implement complex functions such as processing natural stimuli on behavioral timescales, performing the inference process on those stimuli in a context-specific manner, and storing the information in overlapping populations of neurons. A global picture starts to emerge, in which dendrites help the brain achieve efficiency through a combination of optimization strategies that balance the tradeoff between performance and resource utilization.
Affiliation(s)
- Roman Makarov
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/_RomanMakarov
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/MPagkalos
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece.
5
Kim YJ, Ujfalussy BB, Lengyel M. Parallel functional architectures within a single dendritic tree. Cell Rep 2023; 42:112386. [PMID: 37060564] [PMCID: PMC7614531] [DOI: 10.1016/j.celrep.2023.112386]
Abstract
The input-output transformation of individual neurons is a key building block of neural circuit dynamics. While previous models of this transformation vary widely in their complexity, they all describe the underlying functional architecture as unitary, such that each synaptic input makes a single contribution to the neuronal response. Here, we show that the input-output transformation of CA1 pyramidal cells is instead best captured by two distinct functional architectures operating in parallel. We used statistically principled methods to fit flexible, yet interpretable, models of the transformation of input spikes into the somatic "output" voltage and to automatically select among alternative functional architectures. With dendritic Na+ channels blocked, responses are accurately captured by a single static and global nonlinearity. In contrast, dendritic Na+-dependent integration requires a functional architecture with multiple dynamic nonlinearities and clustered connectivity. These two architectures incorporate distinct morphological and biophysical properties of the neuron and its synaptic organization.
Affiliation(s)
- Young Joon Kim
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge, UK; Harvard Medical School, Boston, MA, USA.
- Balázs B Ujfalussy
- Laboratory of Biological Computation, Institute of Experimental Medicine, Budapest, Hungary
- Máté Lengyel
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge, UK
6
Zhang Y, Du K, Huang T. Heuristic Tree-Partition-Based Parallel Method for Biophysically Detailed Neuron Simulation. Neural Comput 2023; 35:627-644. [PMID: 36746142] [DOI: 10.1162/neco_a_01565]
Abstract
Biophysically detailed neuron simulation is a powerful tool to explore the mechanisms behind biological experiments and to bridge the gap between various scales in neuroscience research. However, the extremely high computational complexity of detailed neuron simulation restricts the modeling and exploration of detailed network models. The bottleneck is solving the system of linear equations. To accelerate detailed simulation, we propose a heuristic tree-partition-based parallel method (HTP) to parallelize the computation of the Hines algorithm, the kernel for solving linear equations, and leverage the strong parallel capability of the graphics processing unit (GPU) to achieve further speedup. We formulate the problem of finding a fine-grained parallel schedule as a tree-partition problem. Next, we present a heuristic partition algorithm that obtains an effective partition to efficiently parallelize the equation-solving process in detailed simulation. With further optimization on GPU, our HTP method achieves a 2.2- to 8.5-fold speedup compared with the state-of-the-art GPU method and a 36- to 660-fold speedup compared with the typical Hines algorithm.
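For context, the Hines algorithm that HTP parallelizes exploits the tree topology of a neuron's compartments: eliminating from the leaves toward the root solves the quasi-tridiagonal system in linear time. The following serial sketch is an illustrative reconstruction of that kernel, not the authors' code; node ordering and coupling values are assumptions:

```python
def hines_solve(d, u, l, r, parent):
    """Solve a tree-structured linear system A x = r in O(n).

    Nodes are ordered so that parent[i] < i; d is the diagonal, u[i] and l[i]
    are the couplings A[parent[i], i] and A[i, parent[i]], r the right side.
    """
    d, r = d[:], r[:]
    n = len(d)
    for i in range(n - 1, 0, -1):        # elimination: leaves -> root
        p = parent[i]
        f = l[i] / d[i]
        d[p] -= f * u[i]
        r[p] -= f * r[i]
    x = [0.0] * n
    x[0] = r[0] / d[0]
    for i in range(1, n):                # substitution: root -> leaves
        x[i] = (r[i] - l[i] * x[parent[i]]) / d[i]
    return x

# Unbranched 3-compartment cable: A = [[2,-1,0],[-1,2,-1],[0,-1,2]], b = [1,1,1]
print(hines_solve([2, 2, 2], [0, -1, -1], [0, -1, -1], [1, 1, 1], [-1, 0, 1]))
```

The elimination order is exactly where the parallelism lies: subtrees with no ancestor-descendant relation can be eliminated concurrently, which is the structure the paper's tree-partition heuristic exposes.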
Affiliation(s)
- Yichen Zhang
- School of Computer Science, Peking University, Beijing 100871, China
- Kai Du
- School of Computer Science and Institute for Artificial Intelligence, Peking University, Beijing 100871, China
- Tiejun Huang
- School of Computer Science and Institute for Artificial Intelligence, Peking University, Beijing 100871, China
7
Harkin EF, Lynn MB, Payeur A, Boucher JF, Caya-Bissonnette L, Cyr D, Stewart C, Longtin A, Naud R, Béïque JC. Temporal derivative computation in the dorsal raphe network revealed by an experimentally driven augmented integrate-and-fire modeling framework. eLife 2023; 12:72951. [PMID: 36655738] [PMCID: PMC9977298] [DOI: 10.7554/elife.72951]
Abstract
By means of an expansive innervation, the serotonin (5-HT) neurons of the dorsal raphe nucleus (DRN) are positioned to enact coordinated modulation of circuits distributed across the entire brain in order to adaptively regulate behavior. Yet the network computations that emerge from the excitability and connectivity features of the DRN are still poorly understood. To gain insight into these computations, we began by carrying out a detailed electrophysiological characterization of genetically identified mouse 5-HT and somatostatin (SOM) neurons. We next developed a single-neuron modeling framework that combines the realism of Hodgkin-Huxley models with the simplicity and predictive power of generalized integrate-and-fire models. We found that feedforward inhibition of 5-HT neurons by heterogeneous SOM neurons implemented divisive inhibition, while endocannabinoid-mediated modulation of excitatory drive to the DRN increased the gain of 5-HT output. Our most striking finding was that the output of the DRN encodes a mixture of the intensity and temporal derivative of its input, and that the temporal derivative component dominates this mixture precisely when the input is increasing rapidly. This network computation primarily emerged from prominent adaptation mechanisms found in 5-HT neurons, including a previously undescribed dynamic threshold. By applying a bottom-up neural network modeling approach, our results suggest that the DRN is particularly apt to encode input changes over short timescales, reflecting one of the salient emerging computations that dominate its output to regulate behavior.
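The role a dynamic threshold plays in derivative-like encoding can be illustrated with a toy generalized integrate-and-fire neuron. All parameters below are illustrative assumptions, not fitted values from this study:

```python
def simulate_gif(inputs, dt=1.0, tau_v=20.0, tau_th=100.0,
                 v_rest=0.0, th0=1.0, th_jump=0.5):
    """Leaky integrator with a threshold that jumps at each spike and decays."""
    v, th = v_rest, th0
    spikes = []
    for step, i_ext in enumerate(inputs):
        v += dt * (-(v - v_rest) + i_ext) / tau_v    # membrane integration
        th += dt * (th0 - th) / tau_th               # threshold relaxes to th0
        if v >= th:
            spikes.append(step)
            v = v_rest        # voltage reset
            th += th_jump     # dynamic threshold: each spike raises it
    return spikes

# Step input: firing is densest right after the step and then adapts, so the
# early response transiently emphasizes the input's temporal derivative.
spikes = simulate_gif([0.0] * 50 + [2.0] * 200)
```

Because the threshold accumulates with each spike, inter-spike intervals lengthen after the step onset; this kind of adaptation mechanism is what the abstract identifies as the source of derivative coding in 5-HT output.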
Affiliation(s)
- Emerson F Harkin
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| | - Michael B Lynn
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| | - Alexandre Payeur
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
- Department of Physics, University of OttawaOttawaCanada
| | - Jean-François Boucher
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| | - Léa Caya-Bissonnette
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| | - Dominic Cyr
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| | - Chloe Stewart
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| | - André Longtin
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
- Department of Physics, University of OttawaOttawaCanada
| | - Richard Naud
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
- Department of Physics, University of OttawaOttawaCanada
| | - Jean-Claude Béïque
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of OttawaOttawaCanada
| |
8
Li Z, Tang W, Zhang B, Yang R, Miao X. Emerging memristive neurons for neuromorphic computing and sensing. Sci Technol Adv Mater 2023; 24:2188878. [PMID: 37090846] [PMCID: PMC10120469] [DOI: 10.1080/14686996.2023.2188878]
Abstract
Inspired by the principles of the biological nervous system, neuromorphic engineering has brought a promising alternative approach to intelligent computing with high energy efficiency and low consumption. As pivotal components of neuromorphic systems, artificial spiking neurons are powerful information processing units and can achieve highly complex nonlinear computations. By leveraging the switching dynamics of memristive devices, memristive neurons show rich spiking behaviors with simple circuits. This report reviews memristive neurons and their applications in neuromorphic sensing and computing systems. The switching mechanisms that endow memristive devices with rich dynamics and nonlinearity are highlighted, and the various nonlinear spiking neuron behaviors emulated in these devices are then reviewed. Recent developments in neuromorphic sensing and computing systems based on memristive neurons are then introduced. Finally, we discuss challenges and outlooks for memristive neurons in high-performance neuromorphic hardware and provide a perspective on the development of interactive neuromorphic electronic systems.
Affiliation(s)
- Zhiyuan Li
- School of Integrated Circuits, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan, China
- Hubei Yangtze Memory Laboratories, Wuhan, China
- Wei Tang
- School of Integrated Circuits, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan, China
- Hubei Yangtze Memory Laboratories, Wuhan, China
- Beining Zhang
- School of Integrated Circuits, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan, China
- Hubei Yangtze Memory Laboratories, Wuhan, China
- Rui Yang
- School of Integrated Circuits, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan, China
- Hubei Yangtze Memory Laboratories, Wuhan, China
- Xiangshui Miao
- School of Integrated Circuits, School of Optical and Electronic Information, Huazhong University of Science and Technology, Wuhan, China
- Hubei Yangtze Memory Laboratories, Wuhan, China
9
Renner J, Rasia-Filho AA. Morphological Features of Human Dendritic Spines. Adv Neurobiol 2023; 34:367-496. [PMID: 37962801] [DOI: 10.1007/978-3-031-36159-3_9]
Abstract
Dendritic spine features in human neurons follow the up-to-date knowledge presented in the previous chapters of this book. Human dendrites are notable for their heterogeneity in branching patterns and spatial distribution. These data relate to circuits and specialized functions. Spines enhance neuronal connectivity, modulate and integrate synaptic inputs, and provide additional plastic functions to microcircuits and large-scale networks. Spines present a continuum of shapes and sizes, whose number and distribution along the dendritic length are diverse across neurons and different areas. Indeed, human neurons vary from aspiny or "relatively aspiny" cells to neurons covered with a high density of intermingled pleomorphic spines on very long dendrites. In this chapter, we discuss the phylogenetic and ontogenetic development of human spines and describe the heterogeneous features of human spiny neurons along the spinal cord, brainstem, cerebellum, thalamus, basal ganglia, amygdala, hippocampal regions, and neocortical areas. Three-dimensional reconstructions of Golgi-impregnated dendritic spines and data from fluorescence microscopy are reviewed with ultrastructural findings to address the complex possibilities for synaptic processing and integration in humans. Pathological changes are also presented, for example, in Alzheimer's disease and schizophrenia. Basic morphological data can be linked to current techniques, and perspectives in this research field include the characterization of spines in human neurons with specific transcriptome features, molecular classification of cellular diversity, and electrophysiological identification of coexisting subpopulations of cells. These data would illuminate how cellular attributes determine neuron type-specific connectivity and brain wiring for our diverse aptitudes and behavior.
Affiliation(s)
- Josué Renner
- Department of Basic Sciences/Physiology and Graduate Program in Biosciences, Universidade Federal de Ciências da Saúde de Porto Alegre, Porto Alegre, RS, Brazil
- Alberto A Rasia-Filho
- Department of Basic Sciences/Physiology and Graduate Program in Biosciences, Universidade Federal de Ciências da Saúde de Porto Alegre, Porto Alegre, RS, Brazil
- Graduate Program in Neuroscience, Universidade Federal do Rio Grande do Sul, Porto Alegre, RS, Brazil
10
Kumar S, Singh RK, Chaudhary A. A novel non-linear neuron model based on multiplicative aggregation in quaternionic domain. Complex Intell Syst 2022. [DOI: 10.1007/s40747-022-00911-6]
Abstract
The learning algorithm for a three-layered neural structure with novel non-linear quaternionic-valued multiplicative (QVM) neurons is proposed in this paper. The computing capability of non-linear aggregation in the cell body of biological neurons inspired the development of a non-linear neuron model. However, unlike linear neuron models, most non-linear neuron models are built on higher-order aggregation, which is more mathematically complex and difficult to train. As a result, building non-linear neuron models with a simple structure is a difficult and time-consuming endeavor in the neurocomputing field. The concept of the QVM neuron model was influenced by the non-linear neuron model, which has a simple structure and great computational ability. The suggested neuron's linearity is determined by the weight and bias associated with each quaternionic-valued input. Non-commutative multiplication of all linearly connected quaternionic input-weight terms accommodates the non-linearity. To train three-layered networks with QVM neurons, the standard quaternionic-gradient-based backpropagation (QBP) algorithm is utilized. The computational and generalization capabilities of the QVM neuron are assessed through training and testing in the quaternionic domain utilizing benchmark problems, such as 3D and 4D chaotic time-series predictions, 3D geometrical transformations, and 3D face recognition. The training and testing outcomes are compared to those of conventional and root-power mean (RPM) neurons in the quaternionic domain using training and testing MSEs, network topology (parameters), variance, and AIC as statistical measures. According to these findings, networks with QVM neurons have greater computational and generalization capabilities than networks with conventional and RPM neurons in the quaternionic domain.
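The multiplicative aggregation can be sketched with plain quaternion arithmetic: each input is transformed linearly (weight times input plus bias) and the non-commutative quaternion product of all terms gives the neuron's net input. This is an illustrative reconstruction of the idea, not the paper's implementation, and the function names are hypothetical:

```python
def qmul(p, q):
    """Hamilton product of two quaternions (a, b, c, d); non-commutative."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

def qadd(p, q):
    return tuple(x + y for x, y in zip(p, q))

def qvm_neuron(inputs, weights, biases):
    """Net input = product over i of (w_i * x_i + b_i), in quaternion algebra."""
    net = (1.0, 0.0, 0.0, 0.0)          # multiplicative identity
    for x, w, b in zip(inputs, weights, biases):
        net = qmul(net, qadd(qmul(w, x), b))
    return net
```

Because the Hamilton product is non-commutative (i*j = k but j*i = -k), the order of the input-weight terms itself carries information, which is one way this aggregation differs from a summation neuron.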
11
Xue X, Buccino AP, Kumar SS, Hierlemann A, Bartram J. Inferring monosynaptic connections from paired dendritic spine Ca2+ imaging and large-scale recording of extracellular spiking. J Neural Eng 2022; 19:046044. [PMID: 35931040] [PMCID: PMC7613561] [DOI: 10.1088/1741-2552/ac8765]
Abstract
Objective: Techniques to identify monosynaptic connections between neurons have been vital for neuroscience research, facilitating important advancements concerning network topology, synaptic plasticity, and synaptic integration, among others. Approach: Here, we introduce a novel approach to identify and monitor monosynaptic connections using high-resolution dendritic spine Ca2+ imaging combined with simultaneous large-scale recording of extracellular electrical activity by means of high-density microelectrode arrays. Main results: We introduce an easily adoptable analysis pipeline that associates the imaged spine with its presynaptic unit and test it on in vitro recordings. The method is further validated and optimized by simulating synaptically evoked spine Ca2+ transients based on measured spike trains in order to obtain simulated ground-truth connections. Significance: The proposed approach offers unique advantages: (a) it can be used to identify monosynaptic connections with an accurate localization of the synapse within the dendritic tree; (b) it provides precise information on presynaptic spiking and (c) postsynaptic spine Ca2+ signals; and finally, (d) its non-invasive nature allows for long-term measurements. The analysis toolkit, together with the rich data sets that were acquired, is made publicly available for further exploration by the research community.
12
Introduction. Neuroscience 2022; 489:1-3. [DOI: 10.1016/j.neuroscience.2022.03.037]
13
Otor Y, Achvat S, Cermak N, Benisty H, Abboud M, Barak O, Schiller Y, Poleg-Polsky A, Schiller J. Dynamic compartmental computations in tuft dendrites of layer 5 neurons during motor behavior. Science 2022; 376:267-275. [PMID: 35420959] [DOI: 10.1126/science.abn1421]
Abstract
Tuft dendrites of layer 5 pyramidal neurons form specialized compartments important for motor learning and performance, yet their computational capabilities remain unclear. Structural-functional mapping of the tuft tree from the motor cortex during motor tasks revealed two morphologically distinct populations of layer 5 pyramidal tract neurons (PTNs) that exhibit specific tuft computational properties. Early bifurcating and large nexus PTNs showed marked tuft functional compartmentalization, representing different motor variable combinations within and between their two tuft hemi-trees. By contrast, late bifurcating and smaller nexus PTNs showed synchronous tuft activation. Dendritic structure and dynamic recruitment of the N-methyl-d-aspartate (NMDA)-spiking mechanism explained the differential compartmentalization patterns. Our findings support a morphologically dependent framework for motor computations, in which independent amplification units can be combinatorically recruited to represent different motor sequences within the same tree.
Affiliation(s)
- Yara Otor
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
- Shay Achvat
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
- Nathan Cermak
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
- Hadas Benisty
- Yale University School of Medicine, Bethany, CT, USA
- Maisan Abboud
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
- Omri Barak
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
- Yitzhak Schiller
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
- Alon Poleg-Polsky
- Department of Physiology and Biophysics, University of Colorado School of Medicine, 12800 East 19th Avenue MS8307, Aurora, CO 8004, USA
- Jackie Schiller
- Department of Physiology, Technion Medical School, Bat-Galim, Haifa 31096, Israel
14
Feldhoff F, Toepfer H, Harczos T, Klefenz F. Periodicity Pitch Perception Part III: Sensibility and Pachinko Volatility. Front Neurosci 2022; 16:736642. [PMID: 35356050] [PMCID: PMC8959216] [DOI: 10.3389/fnins.2022.736642]
Abstract
Neuromorphic computer models are used to explain sensory perceptions. Auditory models generate cochleagrams, which resemble the spike distributions in the auditory nerve. Neuron ensembles along the auditory pathway transform sensory inputs step by step, and at the end pitch is represented in auditory categorical spaces. In two previous articles in this series on periodicity pitch perception, an extended auditory model was successfully used to explain periodicity pitch for various musical-instrument tones and sung vowels. This third part in the series focuses on octopus cells, as they are central sensitivity elements in auditory cognition processes. A powerful numerical model was devised in which auditory nerve fiber (ANF) spike events are the inputs that trigger the impulse responses of the octopus cells. Efficient algorithms are developed and demonstrated to explain the behavior of octopus cells, with a focus on a simple event-based hardware implementation of a layer of octopus neurons. The main finding is that an octopus cell model in a local receptive field fine-tunes to a specific trajectory through a spike-timing-dependent plasticity (STDP) learning rule with synaptic pre-activation and the dendritic back-propagating signal as the post-condition. Successful learning explains away the teacher, so there is no need for a temporally precise control of plasticity that distinguishes between learning and retrieval phases. Pitch learning is cascaded: at first, octopus cells respond individually by self-adjustment to specific trajectories in their local receptive fields; then unions of octopus cells are collectively learned for pitch discrimination. Pitch estimation by inter-spike intervals is shown exemplarily for two input scenarios: a simple sine tone and a sung vowel. The model evaluation indicates an improvement in pitch estimation on a fixed time-scale.
Collapse
Affiliation(s)
- Frank Feldhoff
- Advanced Electromagnetics Group, Technische Universität Ilmenau, Ilmenau, Germany
| | - Hannes Toepfer
- Advanced Electromagnetics Group, Technische Universität Ilmenau, Ilmenau, Germany
| | - Tamas Harczos
- Fraunhofer-Institut für Digitale Medientechnologie, Ilmenau, Germany
- Auditory Neuroscience and Optogenetics Laboratory, German Primate Center, Göttingen, Germany
- audifon GmbH & Co. KG, Kölleda, Germany
| | - Frank Klefenz
- Fraunhofer-Institut für Digitale Medientechnologie, Ilmenau, Germany
| |
Collapse
|
15
|
Murphy-Baum BL, Awatramani GB. Parallel processing in active dendrites during periods of intense spiking activity. Cell Rep 2022; 38:110412. [PMID: 35196499 DOI: 10.1016/j.celrep.2022.110412] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/17/2021] [Revised: 12/15/2021] [Accepted: 01/28/2022] [Indexed: 12/17/2022] Open
Abstract
A neuron's ability to perform parallel computations throughout its dendritic arbor substantially improves its computational capacity. However, during natural patterns of activity, the degree to which computations remain compartmentalized, especially in neurons with active dendritic trees, is not clear. Here, we examine how the direction of moving objects is computed across the bistratified dendritic arbors of ON-OFF direction-selective ganglion cells (DSGCs) in the mouse retina. We find that although local synaptic signals propagate efficiently throughout their dendritic trees, direction-selective computations in one part of the dendritic arbor have little effect on those being made elsewhere. Independent dendritic processing allows DSGCs to compute the direction of moving objects multiple times as they traverse their receptive fields, enabling them to rapidly detect changes in motion direction on a sub-receptive-field basis. These results demonstrate that the parallel processing capacity of neurons can be maintained even during periods of intense synaptic activity.
Collapse
Affiliation(s)
| | - Gautam B Awatramani
- Department of Biology, University of Victoria, Victoria, BC V8P 5C2, Canada.
| |
Collapse
|
16
|
Acharya J, Basu A, Legenstein R, Limbacher T, Poirazi P, Wu X. Dendritic Computing: Branching Deeper into Machine Learning. Neuroscience 2021; 489:275-289. [PMID: 34656706 DOI: 10.1016/j.neuroscience.2021.10.001] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/02/2021] [Revised: 09/07/2021] [Accepted: 10/03/2021] [Indexed: 12/31/2022]
Abstract
In this paper, we discuss the nonlinear computational power provided by dendrites in biological and artificial neurons. We start by briefly presenting biological evidence about the type of dendritic nonlinearities, respective plasticity rules and their effect on biological learning as assessed by computational models. Four major computational implications are identified: improved expressivity, more efficient use of resources, utilization of internal learning signals, and enabling of continual learning. We then discuss examples of how dendritic computations have been used to solve real-world classification problems, with performance reported on well-known data sets used in machine learning. The works are categorized according to the three primary methods of plasticity used: structural plasticity, weight plasticity, or plasticity of synaptic delays. Finally, we show the recent trend of confluence between concepts of deep learning and dendritic computations and highlight some future research directions.
Collapse
Affiliation(s)
| | - Arindam Basu
- Department of Electrical Engineering, City University of Hong Kong, Hong Kong
| | - Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Austria
| | - Thomas Limbacher
- Institute of Theoretical Computer Science, Graz University of Technology, Austria
| | - Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology-Hellas (FORTH), Greece
| | - Xundong Wu
- School of Computer Science, Hangzhou Dianzi University, China
| |
Collapse
|
17
|
Sinha M, Narayanan R. Active Dendrites and Local Field Potentials: Biophysical Mechanisms and Computational Explorations. Neuroscience 2021; 489:111-142. [PMID: 34506834 PMCID: PMC7612676 DOI: 10.1016/j.neuroscience.2021.08.035] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2021] [Revised: 08/30/2021] [Accepted: 08/31/2021] [Indexed: 10/27/2022]
Abstract
Neurons and glial cells are endowed with membranes that express a rich repertoire of ion channels, transporters, and receptors. The constant flux of ions across the neuronal and glial membranes results in voltage fluctuations that can be recorded from the extracellular matrix. The high frequency components of this voltage signal contain information about the spiking activity, reflecting the output from the neurons surrounding the recording location. The low frequency components of the signal, referred to as the local field potential (LFP), have been traditionally thought to provide information about the synaptic inputs that impinge on the large dendritic trees of various neurons. In this review, we discuss recent computational and experimental studies pointing to a critical role of several active dendritic mechanisms that can influence the genesis and the location-dependent spectro-temporal dynamics of LFPs, spanning different brain regions. We strongly emphasize the need to account for several fast and slow dendritic events and associated active mechanisms - including gradients in their expression profiles, inter- and intra-cellular spatio-temporal interactions spanning neurons and glia, heterogeneities and degeneracy across scales, neuromodulatory influences, and activity-dependent plasticity - towards gaining important insights about the origins of LFP under different behavioral states in health and disease. We provide simple but essential guidelines on how to model LFPs taking into account these dendritic mechanisms, with detailed methodology on how to account for various heterogeneities and electrophysiological properties of neurons and synapses while studying LFPs.
Collapse
Affiliation(s)
- Manisha Sinha
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, Karnataka 560012, India
| | - Rishikesh Narayanan
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, Karnataka 560012, India.
| |
Collapse
|
18
|
Tavosanis G. Dendrite enlightenment. Curr Opin Neurobiol 2021; 69:222-230. [PMID: 34134010 DOI: 10.1016/j.conb.2021.05.001] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2021] [Revised: 05/04/2021] [Accepted: 05/05/2021] [Indexed: 12/18/2022]
Abstract
Neuronal dendrites acquire complex morphologies during development. These are not just the product of cell-intrinsic developmental programs; rather they are defined in close interaction with the cellular environment. Thus, to understand the molecular cascades that yield appropriate morphologies, it is essential to investigate them in vivo, in the actual complex tissue environment encountered by the differentiating neuron in the developing animal. Particularly, genetic approaches have pointed to factors controlling dendrite differentiation in vivo. These suggest that localized and transient molecular cascades might underlie the formation and stabilization of dendrite branches with neuron type-specific characteristics. Here, I highlight the need for studies of neuronal dendrite differentiation in the animal, the challenges provided by such an approach, and the promising pathways that have recently opened.
Collapse
Affiliation(s)
- Gaia Tavosanis
- German Center for Neurodegenerative Diseases (DZNE), Venusberg-Campus 1/99, Bonn, 53127, Germany; LIMES Institute, University of Bonn, Carl-Troll-Str. 3, Bonn, 53115, Germany.
| |
Collapse
|
19
|
Zavatone-Veth JA, Pehlevan C. Activation function dependence of the storage capacity of treelike neural networks. Phys Rev E 2021; 103:L020301. [PMID: 33736039 DOI: 10.1103/physreve.103.l020301] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2020] [Accepted: 02/04/2021] [Indexed: 11/07/2022]
Abstract
The expressive power of artificial neural networks crucially depends on the nonlinearity of their activation functions. Though a wide variety of nonlinear activation functions have been proposed for use in artificial neural networks, a detailed understanding of their role in determining the expressive power of a network has not emerged. Here, we study how activation functions affect the storage capacity of treelike two-layer networks. We relate the boundedness or divergence of the capacity in the infinite-width limit to the smoothness of the activation function, elucidating the relationship between previously studied special cases. Our results show that nonlinearity can both increase capacity and decrease the robustness of classification, and provide simple estimates for the capacity of networks with several commonly used activation functions. Furthermore, they generate a hypothesis for the functional benefit of dendritic spikes in branched neurons.
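The treelike two-layer architecture analyzed in this abstract can be illustrated with a minimal forward pass: each branch ("dendrite") applies a nonlinearity to its own weighted sum, and the soma thresholds the pooled branch outputs. The branch count, input size, weight scaling, and choice of activation below are assumptions for illustration, not the paper's settings.

```python
import numpy as np

# Treelike two-layer unit (committee-machine geometry): K nonlinear subunits
# feed a thresholding soma. Sizes and g = tanh are illustrative assumptions.
rng = np.random.default_rng(0)
K, N = 8, 16                                  # branches, inputs per branch
W = rng.standard_normal((K, N)) / np.sqrt(N)  # one weight vector per branch

def tree_output(x, g=np.tanh):
    branch = g(np.einsum('kn,kn->k', W, x))   # nonlinear sum within each branch
    return np.sign(branch.sum())              # somatic +/-1 classification

x = rng.standard_normal((K, N))
y = tree_output(x)
```

In this framing, the paper's question is how the choice of `g` (bounded, divergent, smooth, or spiking-like) changes how many random patterns such a unit can store in the wide-branch limit.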
Collapse
Affiliation(s)
| | - Cengiz Pehlevan
- John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138, USA; Center for Brain Science, Harvard University, Cambridge, Massachusetts 02138, USA
| |
Collapse
|
20
|
Rossbroich J, Trotter D, Beninger J, Tóth K, Naud R. Linear-nonlinear cascades capture synaptic dynamics. PLoS Comput Biol 2021; 17:e1008013. [PMID: 33720935 PMCID: PMC7993773 DOI: 10.1371/journal.pcbi.1008013] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2020] [Revised: 03/25/2021] [Accepted: 02/25/2021] [Indexed: 11/18/2022] Open
Abstract
Short-term synaptic dynamics differ markedly across connections and strongly regulate how action potentials communicate information. To model the range of synaptic dynamics observed in experiments, we have developed a flexible mathematical framework based on a linear-nonlinear operation. This model can capture various experimentally observed features of synaptic dynamics and different types of heteroskedasticity. Despite its conceptual simplicity, we show that it is more adaptable than previous models. Combined with a standard maximum likelihood approach, synaptic dynamics can be accurately and efficiently characterized using naturalistic stimulation patterns. These results make explicit that synaptic processing bears algorithmic similarities with information processing in convolutional neural networks.
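A linear-nonlinear cascade of the general kind described here can be sketched as an exponential filter of the presynaptic spike train followed by a sigmoid readout of per-spike efficacy. The kernel shape and all parameters below are assumptions, not the model fitted in the paper.

```python
import numpy as np

# Linear-nonlinear sketch of short-term synaptic dynamics: filter past spikes
# with an exponential kernel (linear stage), then map the drive through a
# sigmoid to an efficacy in (0, 1) (nonlinear stage). Parameters are
# assumptions; a > 0 gives facilitation, a < 0 depression-like behavior.
def lnl_efficacy(spike_times, tau=100.0, b=-1.0, a=2.0):
    eff = []
    for t in spike_times:
        drive = b + a * sum(np.exp(-(t - s) / tau) for s in spike_times if s < t)
        eff.append(float(1.0 / (1.0 + np.exp(-drive))))  # sigmoid readout
    return eff

e = lnl_efficacy([0.0, 10.0, 20.0])   # efficacies grow across a short burst
```

Because the model is a cascade of a linear filter and a pointwise nonlinearity, fitting it by maximum likelihood parallels training a single convolutional layer, which is the algorithmic similarity the abstract points to.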
Collapse
Affiliation(s)
- Julian Rossbroich
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
| | - Daniel Trotter
- Department of Physics, University of Ottawa, Ottawa, ON, Canada
| | - John Beninger
- uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON, Canada
| | - Katalin Tóth
- uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON, Canada
| | - Richard Naud
- Department of Physics, University of Ottawa, Ottawa, ON, Canada
- uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON, Canada
| |
Collapse
|
21
|
Shafiei G, Markello RD, Vos de Wael R, Bernhardt BC, Fulcher BD, Misic B. Topographic gradients of intrinsic dynamics across neocortex. eLife 2020; 9:e62116. [PMID: 33331819 PMCID: PMC7771969 DOI: 10.7554/elife.62116] [Citation(s) in RCA: 68] [Impact Index Per Article: 17.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/14/2020] [Accepted: 12/16/2020] [Indexed: 12/16/2022] Open
Abstract
The intrinsic dynamics of neuronal populations are shaped by both microscale attributes and macroscale connectome architecture. Here we comprehensively characterize the rich temporal patterns of neural activity throughout the human brain. Applying massive temporal feature extraction to regional haemodynamic activity, we systematically estimate over 6000 statistical properties of individual brain regions' time-series across the neocortex. We identify two robust spatial gradients of intrinsic dynamics, one spanning a ventromedial-dorsolateral axis and dominated by measures of signal autocorrelation, and the other spanning a unimodal-transmodal axis and dominated by measures of dynamic range. These gradients reflect spatial patterns of gene expression, intracortical myelin and cortical thickness, as well as structural and functional network embedding. Importantly, these gradients are correlated with patterns of meta-analytic functional activation, differentiating cognitive versus affective processing and sensory versus higher-order cognitive processing. Altogether, these findings demonstrate a link between microscale and macroscale architecture, intrinsic dynamics, and cognition.
Collapse
Affiliation(s)
- Golia Shafiei
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, Canada
| | - Ross D Markello
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, Canada
| | - Reinder Vos de Wael
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, Canada
| | - Boris C Bernhardt
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, Canada
| | - Ben D Fulcher
- School of Physics, The University of Sydney, Sydney, Australia
| | - Bratislav Misic
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, Canada
| |
Collapse
|
22
|
Yang JQ, Wang R, Ren Y, Mao JY, Wang ZP, Zhou Y, Han ST. Neuromorphic Engineering: From Biological to Spike-Based Hardware Nervous Systems. ADVANCED MATERIALS (DEERFIELD BEACH, FLA.) 2020; 32:e2003610. [PMID: 33165986 DOI: 10.1002/adma.202003610] [Citation(s) in RCA: 56] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/26/2020] [Revised: 07/27/2020] [Indexed: 06/11/2023]
Abstract
The human brain is a sophisticated, high-performance biocomputer that processes multiple complex tasks in parallel with high efficiency and remarkably low power consumption. Scientists have long been pursuing an artificial intelligence (AI) that can rival the human brain. Spiking neural networks based on neuromorphic computing platforms simulate the architecture and information processing of the intelligent brain, providing new insights for building AIs. The rapid development of materials engineering, device physics, chip integration, and neuroscience has led to exciting progress in neuromorphic computing with the goal of overcoming the von Neumann bottleneck. Herein, fundamental knowledge related to the structures and working principles of neurons and synapses of the biological nervous system is reviewed. An overview is then provided on the development of neuromorphic hardware systems, from artificial synapses and neurons to spike-based neuromorphic computing platforms. It is hoped that this review will shed new light on the evolution of brain-like computing.
Collapse
Affiliation(s)
- Jia-Qin Yang
- College of Electronics and Information Engineering, Shenzhen University, Shenzhen, 518060, P. R. China
- Institute of Microscale Optoelectronics, Shenzhen University, Shenzhen, 518060, P. R. China
| | - Ruopeng Wang
- College of Electronics and Information Engineering, Shenzhen University, Shenzhen, 518060, P. R. China
- Institute of Microscale Optoelectronics, Shenzhen University, Shenzhen, 518060, P. R. China
| | - Yi Ren
- Institute for Advanced Study, Shenzhen University, Shenzhen, 518060, P. R. China
| | - Jing-Yu Mao
- Institute for Advanced Study, Shenzhen University, Shenzhen, 518060, P. R. China
| | - Zhan-Peng Wang
- Institute for Advanced Study, Shenzhen University, Shenzhen, 518060, P. R. China
| | - Ye Zhou
- Institute for Advanced Study, Shenzhen University, Shenzhen, 518060, P. R. China
| | - Su-Ting Han
- Institute of Microscale Optoelectronics, Shenzhen University, Shenzhen, 518060, P. R. China
| |
Collapse
|
23
|
Scholl B, Fitzpatrick D. Cortical synaptic architecture supports flexible sensory computations. Curr Opin Neurobiol 2020; 64:41-45. [PMID: 32088662 PMCID: PMC8080306 DOI: 10.1016/j.conb.2020.01.013] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2019] [Revised: 01/17/2020] [Accepted: 01/23/2020] [Indexed: 12/11/2022]
Abstract
Establishing the fundamental principles that underlie the integration of excitatory and inhibitory presynaptic input populations is crucial to understanding how individual cortical neurons transform signals from peripheral receptors. Here we review recent studies using novel tools to examine the functional properties of excitatory synaptic inputs and the tuning of excitation and inhibition onto individual neurons. New evidence challenges existing synaptic connectivity rules and suggests a more complex functional synaptic architecture that supports a broad range of operations, enabling single neurons to encode multiple sensory features and flexibly shape their computations in the face of diverse sensory input.
Collapse
Affiliation(s)
- Benjamin Scholl
- Max Planck Florida Institute, 1 Max Planck Way, Jupiter, FL USA.
| | | |
Collapse
|
24
|
Mueller M, Egger V. Dendritic integration in olfactory bulb granule cells upon simultaneous multispine activation: Low thresholds for nonlocal spiking activity. PLoS Biol 2020; 18:e3000873. [PMID: 32966273 PMCID: PMC7535128 DOI: 10.1371/journal.pbio.3000873] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2020] [Revised: 10/05/2020] [Accepted: 08/24/2020] [Indexed: 11/18/2022] Open
Abstract
The inhibitory axonless olfactory bulb granule cells form reciprocal dendrodendritic synapses with mitral and tufted cells via large spines, mediating recurrent and lateral inhibition. As a case in point for dendritic transmitter release, rat granule cell dendrites are highly excitable, featuring local Na+ spine spikes and global Ca2+- and Na+-spikes. To investigate the transition from local to global signaling, we performed holographic, simultaneous 2-photon uncaging of glutamate at up to 12 granule cell spines, along with whole-cell recording and dendritic 2-photon Ca2+ imaging in acute juvenile rat brain slices. Coactivation of less than 10 reciprocal spines was sufficient to generate diverse regenerative signals that included regional dendritic Ca2+-spikes and dendritic Na+-spikes (D-spikes). Global Na+-spikes could be triggered in one third of granule cells. Individual spines and dendritic segments sensed the respective signal transitions as increments in Ca2+ entry. Dendritic integration as monitored by the somatic membrane potential was mostly linear until a threshold number of spines was activated, at which often D-spikes along with supralinear summation set in. As to the mechanisms supporting active integration, NMDA receptors (NMDARs) strongly contributed to all aspects of supralinearity, followed by dendritic voltage-gated Na+- and Ca2+-channels, whereas local Na+ spine spikes, as well as morphological variables, barely mattered. Because of the low numbers of coactive spines required to trigger dendritic Ca2+ signals and thus possibly lateral release of GABA onto mitral and tufted cells, we predict that thresholds for granule cell-mediated bulbar lateral inhibition are low. Moreover, D-spikes could provide a plausible substrate for granule cell-mediated gamma oscillations.
Collapse
Affiliation(s)
- Max Mueller
- Neurophysiology, Institute of Zoology, Universität Regensburg, Regensburg, Germany
| | - Veronica Egger
- Neurophysiology, Institute of Zoology, Universität Regensburg, Regensburg, Germany
| |
Collapse
|
25
|
Herstel LJ, Wierenga CJ. Network control through coordinated inhibition. Curr Opin Neurobiol 2020; 67:34-41. [PMID: 32853970 DOI: 10.1016/j.conb.2020.08.001] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2020] [Revised: 07/29/2020] [Accepted: 08/01/2020] [Indexed: 12/29/2022]
Abstract
Coordinated excitatory and inhibitory activity is required for proper brain functioning. Recent computational and experimental studies have demonstrated that activity patterns in recurrent cortical networks are dominated by inhibition. Whereas previous studies have suggested that inhibitory plasticity is important for homeostatic control, this new framework puts inhibition in the driver's seat. Complex neuronal networks in the brain comprise many configurations in parallel, controlled by external and internal 'switches'. Context-dependent modulation and plasticity of inhibitory connections play a key role in memory and learning. It is therefore important to realize that synaptic plasticity is often multisynaptic and that a proper balance between excitation and inhibition is not fixed, but depends on context and activity level.
Collapse
Affiliation(s)
- Lotte J Herstel
- Cell Biology, Neurobiology and Biophysics, Biology Department, Faculty of Science, Utrecht University, The Netherlands
| | - Corette J Wierenga
- Cell Biology, Neurobiology and Biophysics, Biology Department, Faculty of Science, Utrecht University, The Netherlands.
| |
Collapse
|
26
|
Ligon C, Seong E, Schroeder EJ, DeKorver NW, Yuan L, Chaudoin TR, Cai Y, Buch S, Bonasera SJ, Arikkath J. δ-Catenin engages the autophagy pathway to sculpt the developing dendritic arbor. J Biol Chem 2020; 295:10988-11001. [PMID: 32554807 DOI: 10.1074/jbc.ra120.013058] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2020] [Revised: 06/14/2020] [Indexed: 01/21/2023] Open
Abstract
The development of the dendritic arbor in pyramidal neurons is critical for neural circuit function. Here, we uncovered a pathway in which δ-catenin, a component of the cadherin-catenin cell adhesion complex, promotes coordination of growth among individual dendrites and engages the autophagy mechanism to sculpt the developing dendritic arbor. Using a rat primary neuron model, time-lapse imaging, immunohistochemistry, and confocal microscopy, we found that apical and basolateral dendrites are coordinately sculpted during development. Loss or knockdown of δ-catenin uncoupled this coordination, leading to retraction of the apical dendrite without altering basolateral dendrite dynamics. Autophagy is a key cellular pathway that allows degradation of cellular components. We observed that the impairment of the dendritic arbor resulting from δ-catenin knockdown could be reversed by knockdown of autophagy-related 7 (ATG7), a component of the autophagy machinery. We propose that δ-catenin regulates the dendritic arbor by coordinating the dynamics of individual dendrites and that the autophagy mechanism may be leveraged by δ-catenin and other effectors to sculpt the developing dendritic arbor. Our findings have implications for the management of neurological disorders, such as autism and intellectual disability, that are characterized by dendritic aberrations.
Collapse
Affiliation(s)
- Cheryl Ligon
- Developmental Neuroscience, Munroe-Meyer Institute, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Eunju Seong
- Developmental Neuroscience, Munroe-Meyer Institute, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Ethan J Schroeder
- Department of Genetics, Cell Biology, and Anatomy, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Nicholas W DeKorver
- Department of Pharmacology and Experimental Neuroscience, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Li Yuan
- Department of Pharmacology and Experimental Neuroscience, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Tammy R Chaudoin
- Division of Geriatrics, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Yu Cai
- Department of Pharmacology and Experimental Neuroscience, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Shilpa Buch
- Department of Pharmacology and Experimental Neuroscience, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Stephen J Bonasera
- Division of Geriatrics, University of Nebraska Medical Center, Omaha, Nebraska, USA
| | - Jyothi Arikkath
- Department of Anatomy, Howard University, Washington, D.C., USA
| |
Collapse
|
27
|
Dematties D, Rizzi S, Thiruvathukal GK, Pérez MD, Wainselboim A, Zanutto BS. A Computational Theory for the Emergence of Grammatical Categories in Cortical Dynamics. Front Neural Circuits 2020; 14:12. [PMID: 32372918 PMCID: PMC7179825 DOI: 10.3389/fncir.2020.00012] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2019] [Accepted: 03/16/2020] [Indexed: 11/22/2022] Open
Abstract
A general agreement in psycholinguistics holds that syntax and meaning are unified precisely and very quickly during online sentence processing. Although several theories have advanced arguments regarding the neurocomputational bases of this phenomenon, we argue that these theories could benefit from including neurophysiological data concerning cortical dynamics constraints in brain tissue. In addition, some theories promote the integration of complex optimization methods in neural tissue. In this paper we attempt to fill these gaps by introducing a computational model inspired by the dynamics of cortical tissue. In our modeling approach, proximal afferent dendrites produce stochastic cellular activations, while distal dendritic branches contribute independently to somatic depolarization by means of dendritic spikes; finally, prediction failures produce massive firing events that prevent the formation of sparse distributed representations. The model presented in this paper combines semantic and coarse-grained syntactic constraints for each word in a sentence context until discrimination of grammatically related word functions emerges spontaneously from the sole correlation of lexical information from different sources, without applying complex optimization methods. By means of support vector machine techniques, we show that the sparse activation features returned by our approach are well suited, bootstrapping from the features returned by Word Embedding mechanisms, to accomplish grammatical function classification of individual words in a sentence. In this way we develop a biologically guided computational explanation for linguistically relevant unification processes in cortex which connects psycholinguistics to neurobiological accounts of language. We also claim that the computational hypotheses established in this research could foster future work on biologically inspired learning algorithms for natural language processing applications.
Collapse
Affiliation(s)
- Dario Dematties
- Universidad de Buenos Aires, Facultad de Ingeniería, Instituto de Ingeniería Biomédica, Buenos Aires, Argentina
| | - Silvio Rizzi
- Argonne National Laboratory, Lemont, IL, United States
| | - George K Thiruvathukal
- Argonne National Laboratory, Lemont, IL, United States; Computer Science Department, Loyola University Chicago, Chicago, IL, United States
| | - Mauricio David Pérez
- Microwaves in Medical Engineering Group, Division of Solid-State Electronics, Department of Electrical Engineering, Uppsala University, Uppsala, Sweden
| | - Alejandro Wainselboim
- Centro Científico Tecnológico Conicet Mendoza, Instituto de Ciencias Humanas, Sociales y Ambientales, Mendoza, Argentina
| | - B Silvano Zanutto
- Universidad de Buenos Aires, Facultad de Ingeniería, Instituto de Ingeniería Biomédica, Buenos Aires, Argentina; Instituto de Biología y Medicina Experimental-CONICET, Buenos Aires, Argentina
| |
Collapse
|
28
|
Suárez LE, Markello RD, Betzel RF, Misic B. Linking Structure and Function in Macroscale Brain Networks. Trends Cogn Sci 2020; 24:302-315. [PMID: 32160567 DOI: 10.1016/j.tics.2020.01.008] [Citation(s) in RCA: 337] [Impact Index Per Article: 84.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/05/2019] [Revised: 01/20/2020] [Accepted: 01/21/2020] [Indexed: 02/06/2023]
Abstract
Structure-function relationships are a fundamental principle of many naturally occurring systems. However, network neuroscience research suggests that there is an imperfect link between structural connectivity and functional connectivity in the brain. Here, we synthesize the current state of knowledge linking structure and function in macroscale brain networks and discuss the different types of models used to assess this relationship. We argue that current models do not include the requisite biological detail to completely predict function. Structural network reconstructions enriched with local molecular and cellular metadata, in concert with more nuanced representations of functions and properties, hold great potential for a truly multiscale understanding of the structure-function relationship.
Collapse
Affiliation(s)
- Laura E Suárez
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, QC, Canada
| | - Ross D Markello
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, QC, Canada
| | - Richard F Betzel
- Psychological and Brain Sciences, Program in Neuroscience, Cognitive Science Program, Network Science Institute, Indiana University, Bloomington, IN, USA
| | - Bratislav Misic
- McConnell Brain Imaging Centre, Montréal Neurological Institute, McGill University, Montréal, QC, Canada.
| |
Collapse
|