1
Pagkalos M, Makarov R, Poirazi P. Leveraging dendritic properties to advance machine learning and neuro-inspired computing. Curr Opin Neurobiol 2024; 85:102853. PMID: 38394956; DOI: 10.1016/j.conb.2024.102853.
Abstract
The brain is a remarkably capable and efficient system. It can process and store huge amounts of noisy and unstructured information, using minimal energy. In contrast, current artificial intelligence (AI) systems require vast resources for training while still struggling to compete in tasks that are trivial for biological agents. Thus, brain-inspired engineering has emerged as a promising new avenue for designing sustainable, next-generation AI systems. Here, we describe how dendritic mechanisms of biological neurons have inspired innovative solutions for significant AI problems, including credit assignment in multi-layer networks, catastrophic forgetting, and high power consumption. These findings provide exciting alternatives to existing architectures, showing how dendritic research can pave the way for building more powerful and energy-efficient artificial learning systems.
Affiliation(s)
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/MPagkalos
- Roman Makarov
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/_RomanMakarov
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece.
2
Baronig M, Legenstein R. Context association in pyramidal neurons through local synaptic plasticity in apical dendrites. Front Neurosci 2024; 17:1276706. PMID: 38357522; PMCID: PMC10864492; DOI: 10.3389/fnins.2023.1276706.
Abstract
The unique characteristics of neocortical pyramidal neurons are thought to be crucial for many aspects of information processing and learning in the brain. Experimental data suggest that their segregation into two distinct compartments, the basal dendrites close to the soma and the apical dendrites branching out from the thick apical trunk, plays an essential role in cortical organization. A recent hypothesis states that layer 5 pyramidal cells associate top-down contextual information arriving at their apical tuft with features of the sensory input that predominantly arrives at their basal dendrites. It has, however, remained unclear whether such context association could be established by synaptic plasticity processes. In this work, we formalize the objective of such context association learning through a mathematical loss function and derive a plasticity rule for apical synapses that optimizes this loss. The resulting plasticity rule utilizes information that is available either locally at the synapse, through branch-local NMDA spikes, or through global Ca2+ events, both of which have been observed experimentally in layer 5 pyramidal cells. We show in computer simulations that the plasticity rule enables pyramidal cells to associate top-down contextual input patterns with high somatic activity. Furthermore, it enables networks of pyramidal neuron models to perform context-dependent tasks and enables continual learning by allocating new dendritic branches to novel contexts.
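A minimal sketch of how such a rule could look in code (the variable names and the gating form below are our assumptions, not the authors' exact derivation): apical weight changes are driven by presynaptic activity, gated by a branch-local NMDA spike, and signed by whether a global Ca2+ event reports high somatic activity.

```python
import numpy as np

def apical_update(w, x_apical, nmda_spike, ca_event, lr=0.01):
    """Hedged sketch of a local apical plasticity step.

    w          -- weights of apical synapses on one dendritic branch
    x_apical   -- presynaptic activity at those synapses
    nmda_spike -- 1 if a branch-local NMDA spike occurred, else 0
    ca_event   -- 1 if a global Ca2+ event signaled high somatic activity
    """
    # Potentiate active synapses when the branch NMDA spike coincides with
    # the global Ca2+ event; depress them when the branch fires without it.
    dw = lr * x_apical * nmda_spike * (2 * ca_event - 1)
    return np.clip(w + dw, 0.0, None)  # keep excitatory weights non-negative
```

Under this reading, allocating a fresh branch to a novel context amounts to running the same update on a branch whose weights are still near zero.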
Affiliation(s)
- Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
3
Kourosh-Arami M, Komaki A, Gholami M, Marashi SH, Hejazi S. Heterosynaptic plasticity-induced modulation of synapses. J Physiol Sci 2023; 73:33. PMID: 38057729; DOI: 10.1186/s12576-023-00893-1.
Abstract
Plasticity is a common feature of synapses that manifests in different ways and occurs through several mechanisms. Normal brain function requires a balance across several neuronal and synaptic features, one of which is synaptic plasticity. This balance may be achieved by various homeostatic processes, including the regulation of excitation/inhibition and the homeostasis of synaptic weights at the single-neuron level. Homosynaptic Hebbian-type plasticity causes associative alterations of synapses. Both homosynaptic and heterosynaptic plasticity characterize corresponding aspects of adjustable synapses, and both are essential for the regular operation of neural systems and their plastic synapses. In this review, we compare homo- and heterosynaptic plasticity and the main factors affecting the direction of plastic changes. We also discuss the diverse functions of the different kinds of heterosynaptic plasticity and their properties. We argue that heterosynaptic plasticity provides a complementary system that constitutes an essential cellular component of the homeostatic modulation of synaptic weights and neuronal activity.
Affiliation(s)
- Masoumeh Kourosh-Arami
- Department of Neuroscience, School of Advanced Technologies in Medicine, Iran University of Medical Sciences, Tehran, Iran.
- Alireza Komaki
- Department of Neuroscience, School of Science and Advanced Technologies in Medicine, Hamadan University of Medical Sciences, Hamadan, Iran
- Masoumeh Gholami
- Department of Physiology, Medical College, Arak University of Medical Sciences, Arak, Iran
- Sara Hejazi
- Department of Industrial Engineering & Management Systems, University of Central Florida, Orlando, USA
4
Makarov R, Pagkalos M, Poirazi P. Dendrites and efficiency: Optimizing performance and resource utilization. Curr Opin Neurobiol 2023; 83:102812. PMID: 37980803; DOI: 10.1016/j.conb.2023.102812.
Abstract
The brain is a highly efficient system that has evolved to optimize performance under limited resources. In this review, we highlight recent theoretical and experimental studies that support the view that dendrites make information processing and storage in the brain more efficient. This is achieved through the dynamic modulation of integration versus segregation of inputs and activity within a neuron. We argue that under conditions of limited energy and space, dendrites help biological networks to implement complex functions such as processing natural stimuli on behavioral timescales, performing the inference process on those stimuli in a context-specific manner, and storing the information in overlapping populations of neurons. A global picture starts to emerge, in which dendrites help the brain achieve efficiency through a combination of optimization strategies that balance the tradeoff between performance and resource utilization.
Affiliation(s)
- Roman Makarov
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/_RomanMakarov
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece; Department of Biology, University of Crete, Heraklion, 70013, Greece. https://twitter.com/MPagkalos
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece.
5
Lobov SA, Berdnikova ES, Zharinov AI, Kurganov DP, Kazantsev VB. STDP-Driven Rewiring in Spiking Neural Networks under Stimulus-Induced and Spontaneous Activity. Biomimetics (Basel) 2023; 8:320. PMID: 37504208; PMCID: PMC10807410; DOI: 10.3390/biomimetics8030320.
Abstract
Mathematical and computer simulations of learning in living neural networks have typically focused on changes in the efficiency of synaptic connections, represented by synaptic weights in the models. Synaptic plasticity is believed to be the cellular basis for learning and memory. In spiking neural networks composed of dynamical spiking units, a biologically relevant learning rule is based on so-called spike-timing-dependent plasticity, or STDP. However, experimental data suggest that synaptic plasticity is only a part of brain circuit plasticity, which also includes homeostatic and structural plasticity. The model of structural plasticity proposed in this study is based on the activity-dependent appearance and disappearance of synaptic connections. The results indicate that such adaptive rewiring enables the consolidation of the effects of STDP in response to local external stimulation of a neural network. Subsequently, a vector field approach is used to demonstrate the successive "recording" of spike paths in the functional and synaptic connectomes, and finally in the anatomical connectome of the network. Moreover, the findings suggest that adaptive rewiring can stabilize network dynamics over time with respect to the reproducibility of activity patterns. A universal measure of such reproducibility introduced in this article is based on the similarity between consecutive patterns of the special vector fields characterizing both the functional and anatomical connectomes.
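A hedged sketch of the two processes described above (the parameter values and the pruning and growth criteria are illustrative, not the paper's exact model): pair-based STDP adjusts existing weights, while rewiring removes connections whose weights have decayed and seeds new ones at random.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: dt = t_post - t_pre in ms; causal pairs potentiate."""
    return a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)

def rewire(w, prune_thr=0.05, grow_p=0.01, rng=np.random.default_rng(0)):
    """Structural plasticity on a weight matrix w: connections whose
    STDP-trained weight decayed below prune_thr disappear, and new
    connections appear at random vacant sites with a minimal weight."""
    w = w.copy()
    w[(w > 0) & (w < prune_thr)] = 0.0              # disappearance
    born = (w == 0) & (rng.random(w.shape) < grow_p)
    w[born] = prune_thr                             # appearance
    return w
```

Alternating STDP updates with occasional rewire calls is one way such consolidation of frequently used paths could play out.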
Affiliation(s)
- Sergey A. Lobov
- Laboratory of Neurobiomorphic Technologies, The Moscow Institute of Physics and Technology, 117303 Moscow, Russia
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Ekaterina S. Berdnikova
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Alexey I. Zharinov
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Dmitry P. Kurganov
- Laboratory of Neuromodeling, Samara State Medical University, 443079 Samara, Russia
- Victor B. Kazantsev
- Laboratory of Neurobiomorphic Technologies, The Moscow Institute of Physics and Technology, 117303 Moscow, Russia
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Laboratory of Neuromodeling, Samara State Medical University, 443079 Samara, Russia
6
Pagkalos M, Makarov R, Poirazi P. Leveraging dendritic properties to advance machine learning and neuro-inspired computing. arXiv 2023: arXiv:2306.08007v1. PMID: 37396619; PMCID: PMC10312913.
Abstract
The brain is a remarkably capable and efficient system. It can process and store huge amounts of noisy and unstructured information using minimal energy. In contrast, current artificial intelligence (AI) systems require vast resources for training while still struggling to compete in tasks that are trivial for biological agents. Thus, brain-inspired engineering has emerged as a promising new avenue for designing sustainable, next-generation AI systems. Here, we describe how dendritic mechanisms of biological neurons have inspired innovative solutions for significant AI problems, including credit assignment in multilayer networks, catastrophic forgetting, and high energy consumption. These findings provide exciting alternatives to existing architectures, showing how dendritic research can pave the way for building more powerful and energy-efficient artificial learning systems.
Affiliation(s)
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
- Department of Biology, University of Crete, Heraklion, 70013, Greece
- Roman Makarov
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
- Department of Biology, University of Crete, Heraklion, 70013, Greece
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
7
Makarov R, Pagkalos M, Poirazi P. Dendrites and Efficiency: Optimizing Performance and Resource Utilization. arXiv 2023: arXiv:2306.07101v1. PMID: 37396597; PMCID: PMC10312813.
Abstract
The brain is a highly efficient system evolved to achieve high performance with limited resources. We propose that dendrites make information processing and storage in the brain more efficient through the segregation of inputs and their conditional integration via nonlinear events, the compartmentalization of activity and plasticity, and the binding of information through synapse clustering. In real-world scenarios with limited energy and space, dendrites help biological networks process natural stimuli on behavioral timescales, perform the inference process on those stimuli in a context-specific manner, and store the information in overlapping populations of neurons. A global picture starts to emerge, in which dendrites help the brain achieve efficiency through a combination of optimization strategies balancing the tradeoff between performance and resource utilization.
Affiliation(s)
- Roman Makarov
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
- Department of Biology, University of Crete, Heraklion, 70013, Greece
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
- Department of Biology, University of Crete, Heraklion, 70013, Greece
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
8
Zhuravlev AV. Three levels of information processing in the brain. Biosystems 2023:104934. PMID: 37245794; DOI: 10.1016/j.biosystems.2023.104934.
Abstract
Information, the measure of order in a complex system, is the opposite of entropy, the measure of chaos and disorder. Several levels of information processing can be distinguished in the brain. The first is the level of serial molecular genetic processes, similar in some respects to digital computations (DC). Higher cognitive activity, by contrast, is probably based on parallel neural network computations (NNC). The advantage of neural networks is their intrinsic ability to learn, adapting their parameters to specific tasks and external data. However, there seems to be a third level of information processing as well, which involves subjective consciousness and its units, so-called qualia. They are difficult to study experimentally, and the very fact of their existence is hard to explain within the framework of modern physical theory. Here I propose a way to consider consciousness as an extension of basic physical laws: total entropy dissipation leading to the simplification of a system. At the level of subjective consciousness, the brain seems to convert information embodied in neural activity into a simpler and more compact form, internally observed as qualia. Whereas physical implementations of both DC and NNC are essentially approximate and probabilistic, qualia-associated computations (QAC) make the brain capable of recognizing general laws and relationships. While elaborating a behavioral program, the conscious brain does not act blindly or by trial and error but according to the very meaning of such general laws, which gives it an advantage over any artificial intelligence system.
Affiliation(s)
- Aleksandr V Zhuravlev
- I. P. Pavlov Institute of Physiology, nab Makarova 6, 199034, St Petersburg, Russian Federation.
9
Malakasis N, Chavlis S, Poirazi P. Synaptic turnover promotes efficient learning in bio-realistic spiking neural networks. bioRxiv 2023:2023.05.22.541722. PMID: 37292929; PMCID: PMC10245885; DOI: 10.1101/2023.05.22.541722.
Abstract
While artificial machine learning systems achieve superhuman performance in specific tasks such as language processing and image and video recognition, they do so using extremely large datasets and huge amounts of power. The brain, on the other hand, remains superior in several cognitively challenging tasks while operating with the energy of a small lightbulb. We use a biologically constrained spiking neural network model to explore how neural tissue achieves such high efficiency and assess its learning capacity on discrimination tasks. We found that synaptic turnover, a form of structural plasticity whereby the brain continuously forms and eliminates synapses, increases both the speed and the performance of our network on all tasks tested. Moreover, it allows accurate learning from a smaller number of examples. Importantly, these improvements are most significant under conditions of resource scarcity, such as when the number of trainable parameters is halved or the task difficulty is increased. Our findings provide new insights into the mechanisms that underlie efficient learning in the brain and can inspire the development of more efficient and flexible machine learning algorithms.
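A minimal sketch of one turnover step under illustrative assumptions (a 1-D weight vector, fixed synapse count, weakest-synapse pruning); the paper's actual model is a spiking network, so this only conveys the shape of the mechanism.

```python
import numpy as np

def synaptic_turnover(w, frac=0.1, w_init=0.01, rng=np.random.default_rng(1)):
    """Prune the weakest `frac` of existing synapses in weight vector w and
    regrow the same number at random vacant sites, so the number of
    trainable parameters stays fixed while connectivity keeps exploring."""
    w = w.copy()
    active = np.flatnonzero(w)
    n_swap = max(1, int(frac * active.size))
    weakest = active[np.argsort(w[active])[:n_swap]]
    w[weakest] = 0.0                                  # eliminate weak synapses
    vacant = np.flatnonzero(w == 0.0)
    w[rng.choice(vacant, size=n_swap, replace=False)] = w_init  # form new ones
    return w
```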
Affiliation(s)
- Nikos Malakasis
- School of Medicine, University of Crete, Heraklion 70013, Greece
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology-Hellas, Heraklion 70013, Greece
- Spyridon Chavlis
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology-Hellas, Heraklion 70013, Greece
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology, Foundation for Research and Technology-Hellas, Heraklion 70013, Greece
10
Levy WB, Baxter RA. Growing dendrites enhance a neuron's computational power and memory capacity. Neural Netw 2023; 164:275-309. PMID: 37163846; DOI: 10.1016/j.neunet.2023.04.033.
Abstract
Neocortical pyramidal neurons have many dendrites, and such dendrites are capable, in isolation from one another, of generating a neuronal spike. It is also now understood that there is a large amount of dendritic growth during the first years of a human's life, arguably a period of prodigious learning. These observations inspire the construction of a local, stochastic algorithm based on an earlier stochastic, homeostatic, Hebbian developmental theory. Here we investigate the neurocomputational advantages and limits of this novel algorithm, which combines dendritogenesis with supervised adaptive synaptogenesis. Neurons created with this algorithm have enhanced memory capacity, can avoid catastrophic interference (forgetting), and have the ability to unmix mixture distributions. In particular, individual dendrites develop within each class, in an unsupervised manner, to become feature clusters that correspond to the mixing elements of the class-conditional mixture distribution. Error-free classification is demonstrated with input perturbations of up to 40%. Although discriminative problems are used to probe the capabilities of the stochastic algorithm and the neuronal connectivity it produces, the algorithm is in the generative class; it thus seems ideal for decisions that require generalization, i.e., extrapolation beyond previous learning.
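A hedged sketch of how dendritogenesis and synaptogenesis could combine (the overlap criterion, thresholds, and function names are illustrative assumptions, not the paper's algorithm): inputs of the supervised class either recruit synapses onto a branch whose existing synapses they overlap, or grow a new branch.

```python
import numpy as np

def grow_dendrites(X, y, target=1, sim_thr=0.5):
    """For each example of the target class, attach its active inputs to a
    branch whose synapse set it overlaps (synaptogenesis); if no branch
    matches, grow a new branch (dendritogenesis). Branches thereby become
    feature clusters for the mixing elements of the class distribution."""
    branches = []                            # each branch = set of input indices
    for x, label in zip(X, y):
        if label != target:
            continue
        active = set(np.flatnonzero(x))
        for b in branches:
            if len(b & active) / max(len(b), 1) >= sim_thr:
                b |= active                  # cluster onto the matching branch
                break
        else:
            branches.append(set(active))     # no match: grow a new dendrite

    return branches

def fires(x, branches, k=2):
    """The neuron spikes if any single branch has at least k active synapses."""
    active = set(np.flatnonzero(x))
    return any(len(b & active) >= k for b in branches)
```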
Affiliation(s)
- William B Levy
- Department of Neurosurgery, University of Virginia School of Medicine, Charlottesville, VA 22908, United States of America; Informed Simplifications, Earlysville, VA 22936, United States of America.
- Robert A Baxter
- Department of Neurosurgery, University of Virginia School of Medicine, Charlottesville, VA 22908, United States of America; Baxter Adaptive Systems, Bedford, MA 01730, United States of America
11
Altered integration of excitatory inputs onto the basal dendrites of layer 5 pyramidal neurons in a mouse model of Fragile X syndrome. Proc Natl Acad Sci U S A 2023; 120:e2208963120. PMID: 36595706; PMCID: PMC9926222; DOI: 10.1073/pnas.2208963120.
Abstract
Layer 5 (L5) pyramidal neurons receive predictive and sensory inputs in a compartmentalized manner at their apical and basal dendrites, respectively. To uncover how the integration of sensory inputs is affected in autism spectrum disorders (ASD), we used two-photon glutamate uncaging to activate spines in the basal dendrites of L5 pyramidal neurons from a mouse model of Fragile X syndrome (FXS), the most common genetic cause of ASD. While subthreshold excitatory inputs integrate linearly in wild-type animals, they surprisingly summate sublinearly in FXS animals, contradicting what would be expected from the sensory hypersensitivity classically associated with ASD. We next investigated the mechanism underlying this sublinearity by performing knockdown of the regulatory β4 subunit of BK channels, which rescued the synaptic integration, a result that was corroborated with numerical simulations. Taken together, these findings suggest a differential impairment in the integration of feedforward sensory and feedback predictive inputs in L5 pyramidal neurons in FXS, and potentially other forms of ASD, as a result of specifically localized subcellular channelopathies. These results challenge the traditional view that FXS and other ASD are characterized by sensory hypersensitivity, proposing instead a hyposensitivity to sensory inputs and a hypersensitivity to predictive inputs onto cortical neurons.
12
Kayikcioglu Bozkir I, Ozcan Z, Kose C, Kayikcioglu T, Cetin AE. Improving a cortical pyramidal neuron model's classification performance on a real-world ECG dataset by extending inputs. J Comput Neurosci 2022; 51:329-341. PMID: 37148455; DOI: 10.1007/s10827-023-00851-1.
Abstract
Pyramidal neurons display a variety of active conductances and complex morphologies that support nonlinear dendritic computation. Given growing interest in understanding the ability of pyramidal neurons to classify real-world data, in this study we applied both a detailed pyramidal neuron model and the perceptron learning algorithm to the classification of real-world ECG data. We used Gray coding to generate spike patterns from ECG signals and investigated the classification performance of the pyramidal neuron's subcellular regions. Compared with the equivalent single-layer perceptron, the pyramidal neuron performed poorly due to a weight constraint. A proposed mirroring approach for the inputs, however, significantly boosted the classification performance of the neuron. We thus conclude that pyramidal neurons can classify real-world data and that the mirroring approach affects performance in a way similar to unconstrained learning.
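A minimal sketch of input mirroring (reading the weight constraint as non-negativity is our assumption): presenting each input alongside its sign-flipped copy lets a learner whose weights cannot go negative realize an effectively signed weight vector, since w_pos·x + w_neg·(-x) = (w_pos - w_neg)·x.

```python
import numpy as np

def mirror(X):
    """Extend each input vector with its sign-flipped copy."""
    return np.hstack([X, -X])

def train_nonneg_perceptron(X, y, lr=0.1, epochs=20):
    """Perceptron learning (labels in {0, 1}) under a non-negativity weight
    constraint, applied to mirrored inputs so negative effective weights
    remain representable."""
    Xm = mirror(X)
    w = np.zeros(Xm.shape[1])
    for _ in range(epochs):
        for x, t in zip(Xm, y):
            pred = float(w @ x > 0)
            w = np.clip(w + lr * (t - pred) * x, 0.0, None)  # the constraint
    return w
```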
Affiliation(s)
- Ilknur Kayikcioglu Bozkir
- Department of Computer Engineering, Karadeniz Technical University, Trabzon, Türkiye.
- Department of Computer Engineering, Bulent Ecevit University, Zonguldak, Türkiye.
- Zubeyir Ozcan
- Department of Electrical and Electronics Engineering, Karadeniz Technical University, Trabzon, Türkiye
- Cemal Kose
- Department of Computer Engineering, Karadeniz Technical University, Trabzon, Türkiye
- Temel Kayikcioglu
- Department of Electrical and Electronics Engineering, Karadeniz Technical University, Trabzon, Türkiye
- Department of Electrical and Computer Engineering, University of Illinois at Chicago, Chicago, USA
- Ahmet Enis Cetin
- Department of Electrical and Computer Engineering, University of Illinois at Chicago, Chicago, USA
13
Iyer A, Grewal K, Velu A, Souza LO, Forest J, Ahmad S. Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments. Front Neurorobot 2022; 16:846219. PMID: 35574225; PMCID: PMC9100780; DOI: 10.3389/fnbot.2022.846219.
Abstract
A key challenge for AI is to build embodied systems that operate in dynamically changing environments. Such systems must adapt to changing task contexts and learn continuously. Although standard deep learning systems achieve state-of-the-art results on static benchmarks, they often struggle in dynamic scenarios. In these settings, error signals from multiple contexts can interfere with one another, ultimately leading to a phenomenon known as catastrophic forgetting. In this article, we investigate biologically inspired architectures as solutions to these problems. Specifically, we show that the biophysical properties of dendrites and local inhibitory systems enable networks to dynamically restrict and route information in a context-specific manner. Our key contributions are as follows: first, we propose a novel artificial neural network architecture that incorporates active dendrites and sparse representations into the standard deep learning framework. Next, we study the performance of this architecture on two separate benchmarks requiring task-based adaptation: Meta-World, a multi-task reinforcement learning environment in which a robotic agent must learn to solve a variety of manipulation tasks simultaneously, and a continual learning benchmark in which the model's prediction task changes throughout training. Analysis on both benchmarks demonstrates the emergence of overlapping but distinct and sparse subnetworks, allowing the system to fluidly learn multiple tasks with minimal forgetting. Our neural implementation marks the first time a single architecture has achieved competitive results in both multi-task and continual learning settings. Our research sheds light on how biological properties of neurons can inform deep learning systems to address dynamic scenarios that are typically impossible for traditional ANNs to solve.
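A hedged sketch of one such layer (the shapes and the sigmoid gating follow our reading of the described architecture; the names are ours): each unit owns several dendritic segments that see a context vector, the best-matching segment modulates the unit's feedforward response, and a k-winner-take-all step keeps activity sparse.

```python
import numpy as np

def active_dendrite_layer(x, c, W, b, D, k=10):
    """x: input, c: context vector, W/b: feedforward weights and biases,
    D: dendritic segment weights of shape (n_units, n_segments, len(c))."""
    y = W @ x + b                              # feedforward response per unit
    d = np.max(D @ c, axis=1)                  # best-matching segment per unit
    gated = y / (1.0 + np.exp(-d))             # sigmoid dendritic modulation
    out = np.zeros_like(gated)
    winners = np.argsort(gated)[-k:]           # k-winner-take-all sparsity
    out[winners] = gated[winners]
    return out
```

Because different contexts excite different segments, different subsets of units survive the winner-take-all step, yielding the overlapping but distinct subnetworks the abstract describes.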
Affiliation(s)
- Abhiram Iyer
- Numenta, Redwood City, CA, United States
- Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, United States
- Akash Velu
- Department of Computer Science, Stanford University, Stanford, CA, United States
- Jeremy Forest
- Department of Psychology, Cornell University, Ithaca, NY, United States
14
Jenks KR, Tsimring K, Ip JPK, Zepeda JC, Sur M. Heterosynaptic Plasticity and the Experience-Dependent Refinement of Developing Neuronal Circuits. Front Neural Circuits 2021; 15:803401. PMID: 34949992; PMCID: PMC8689143; DOI: 10.3389/fncir.2021.803401.
Abstract
Neurons remodel the structure and strength of their synapses during critical periods of development in order to optimize both perception and cognition. Many of these developmental synaptic changes are thought to occur through synapse-specific homosynaptic forms of experience-dependent plasticity. However, homosynaptic plasticity can also induce or contribute to the plasticity of neighboring synapses through heterosynaptic interactions. Decades of research in vitro have uncovered many of the molecular mechanisms of heterosynaptic plasticity that mediate local compensation for homosynaptic plasticity, facilitation of further bouts of plasticity in nearby synapses, and cooperative induction of plasticity by neighboring synapses acting in concert. These discoveries greatly benefited from new tools and technologies that permitted single-synapse imaging and the manipulation of structure, function, and protein dynamics in living neurons. With the recent advent and application of similar tools for in vivo research, it is now feasible to explore how heterosynaptic plasticity contributes to critical periods and the development of neuronal circuits. In this review, we will first define the forms heterosynaptic plasticity can take and describe our current understanding of their molecular mechanisms. Then, we will outline how heterosynaptic plasticity may lead to meaningful refinement of neuronal responses and review observations suggesting that such mechanisms are indeed at work in vivo. Finally, we will use a well-studied model of cortical plasticity, ocular dominance plasticity during a critical period of visual cortex development, to highlight the molecular overlap between heterosynaptic and developmental forms of plasticity and suggest potential avenues of future research.
Affiliation(s)
- Kyle R Jenks
- Picower Institute for Learning and Memory, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, United States
- Katya Tsimring
- Picower Institute for Learning and Memory, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, United States
- Jacque Pak Kan Ip
- School of Biomedical Sciences, The Chinese University of Hong Kong, Hong Kong SAR, China
- Jose C Zepeda
- Picower Institute for Learning and Memory, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, United States
- Mriganka Sur
- Picower Institute for Learning and Memory, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, United States
15
Gorman JC, Tufte OL, Miller AVR, DeBello WM, Peña JL, Fischer BJ. Diverse processing underlying frequency integration in midbrain neurons of barn owls. PLoS Comput Biol 2021; 17:e1009569. PMID: 34762650; PMCID: PMC8610287; DOI: 10.1371/journal.pcbi.1009569.
Abstract
Emergent response properties of sensory neurons depend on circuit connectivity and somatodendritic processing. Neurons of the barn owl's external nucleus of the inferior colliculus (ICx) display emergent spatial selectivity. These neurons use interaural time difference (ITD) as a cue for the horizontal direction of sound sources. ITD is detected by upstream brainstem neurons with narrow frequency tuning, resulting in spatially ambiguous responses. This spatial ambiguity is resolved by ICx neurons integrating inputs over frequency, a processing step relevant to sound localization across species. Previous models have predicted that ICx neurons function as point neurons that linearly integrate inputs across frequency. However, the complex dendritic trees and spines of ICx neurons raise the question of whether this prediction is accurate. Data from in vivo intracellular recordings of ICx neurons were used to address this question. Results revealed diverse frequency integration properties: some ICx neurons showed responses consistent with the point-neuron hypothesis and others with nonlinear dendritic integration. Modeling showed that varied connectivity patterns and forms of dendritic processing may underlie the frequency integration observed in ICx neurons. These results corroborate the ability of neurons with complex dendritic trees to implement diverse linear and nonlinear integration of synaptic inputs, of relevance for adaptive coding and learning, and support a fundamental mechanism in sound localization.

Author summary: Neurons at higher stages of sensory pathways often display selectivity for properties of sensory stimuli that result from computations performed within the nervous system. These emergent response properties can be produced by patterns of neural connectivity and processing that occur within individual cells. Here we investigated whether neural connectivity and single-neuron computation may contribute to the emergence of spatial selectivity in auditory neurons of the barn owl's midbrain. We used data from in vivo intracellular recordings to test the hypothesis from previous modeling work that these cells function as point neurons whose subthreshold responses are a linear sum of their inputs. Results indicate that while some neurons show responses consistent with the point-neuron hypothesis, others match predictions of nonlinear integration, indicating a diversity of frequency integration properties across neurons. Modeling further showed that varied connectivity patterns and forms of single-neuron computation may underlie the observed responses. These results demonstrate that neurons with complex morphologies may implement diverse integration of synaptic inputs, relevant for adaptive coding and learning.
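A toy sketch of the two hypotheses being compared (the channel grouping and the saturating nonlinearity are illustrative assumptions, not the study's fitted models):

```python
import numpy as np

def point_neuron(inputs, w):
    """Point-neuron hypothesis: the subthreshold response is a linear sum
    of the frequency-channel inputs."""
    return float(w @ inputs)

def dendritic_neuron(inputs, w, groups, gain=1.5):
    """Nonlinear alternative: channels are grouped onto branches whose
    responses saturate (tanh) before being summed at the soma."""
    return sum(float(np.tanh(gain * (w[g] @ inputs[g]))) for g in groups)

# Comparing each model's response to paired-channel stimulation against the
# sum of its single-channel responses distinguishes the two regimes.
```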
Affiliation(s)
- Julia C. Gorman
- Department of Mathematics, Seattle University, Seattle, Washington, United States of America
- Oliver L. Tufte
- Department of Mathematics, Seattle University, Seattle, Washington, United States of America
- Anna V. R. Miller
- Department of Mathematics, Seattle University, Seattle, Washington, United States of America
- William M. DeBello
- Center for Neuroscience, University of California - Davis, Davis, California, United States of America
- José L. Peña
- Dominick P Purpura Department of Neuroscience, Albert Einstein College of Medicine, New York, New York, United States of America
- Brian J. Fischer
- Department of Mathematics, Seattle University, Seattle, Washington, United States of America
16
Acharya J, Basu A, Legenstein R, Limbacher T, Poirazi P, Wu X. Dendritic Computing: Branching Deeper into Machine Learning. Neuroscience 2021; 489:275-289. PMID: 34656706; DOI: 10.1016/j.neuroscience.2021.10.001.
Abstract
In this paper, we discuss the nonlinear computational power provided by dendrites in biological and artificial neurons. We start by briefly presenting biological evidence about the types of dendritic nonlinearities, their respective plasticity rules, and their effect on biological learning as assessed by computational models. Four major computational implications are identified: improved expressivity, more efficient use of resources, utilization of internal learning signals, and support for continual learning. We then discuss examples of how dendritic computations have been used to solve real-world classification problems, with performance reported on well-known datasets used in machine learning. The works are categorized according to the three primary methods of plasticity used: structural plasticity, weight plasticity, or plasticity of synaptic delays. Finally, we show the recent trend of confluence between concepts of deep learning and dendritic computations and highlight some future research directions.
Affiliation(s)
- Arindam Basu
- Department of Electrical Engineering, City University of Hong Kong, Hong Kong
- Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Austria
- Thomas Limbacher
- Institute of Theoretical Computer Science, Graz University of Technology, Austria
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology-Hellas (FORTH), Greece
- Xundong Wu
- School of Computer Science, Hangzhou Dianzi University, China
17
Güler M. Multibranch Formal Neuron: An Internally Nonlinear Learning Unit. Neural Comput 2021; 33:2736-2761. PMID: 34280300; DOI: 10.1162/neco_a_01428.
Abstract
The transformation of synaptic input into action potentials in nerve cells is strongly influenced by the morphology of the dendritic arbor as well as the synaptic efficacy map. The multiplicity of dendritic branches strikingly enables a single cell to act as a highly nonlinear processing element. Studies have also found functional synaptic clustering, whereby synapses that encode a common sensory feature are spatially clustered together on the branches. Motivated by these findings, here we introduce a multibranch formal model of the neuron that can integrate synaptic inputs nonlinearly through the collective action of its dendritic branches and yields synaptic clustering. An analysis in support of its use as a computational building block is offered, together with an accompanying gradient descent-based learning algorithm. The model unit spans a wide spectrum of nonlinearities, including the parity problem, and can outperform the multilayer perceptron in generalizing to unseen data. The occurrence of synaptic clustering boosts the generalization efficiency of the unit, which may also explain the puzzling ubiquity of synaptic clustering in real neurons. Our theoretical analysis is backed up by simulations. The study could pave the way to new artificial neural networks.
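A hedged sketch of such a unit and one training step (the tanh branch nonlinearity and the squared-error loss are illustrative choices, not necessarily the paper's): the output is a weighted sum of per-branch nonlinearities applied to branch-local weighted sums.

```python
import numpy as np

def multibranch_out(x, W, v):
    """y = v . tanh(W x): W[b] holds the synaptic weights of branch b,
    tanh is the branch nonlinearity, v combines branches at the soma."""
    return float(v @ np.tanh(W @ x))

def grad_step(x, t, W, v, lr=0.05):
    """One gradient-descent step on squared error (d tanh/dz = 1 - tanh^2),
    updating both branch synapses W and somatic combination weights v."""
    h = np.tanh(W @ x)
    err = float(v @ h) - t
    dv = err * h
    dW = err * np.outer(v * (1.0 - h ** 2), x)
    v -= lr * dv
    W -= lr * dW
```

With enough branches such a unit can realize non-linearly-separable mappings like parity, which a single-layer perceptron cannot.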
Affiliation(s)
- Marifi Güler
- Department of Computer Engineering, Eastern Mediterranean University, 99628 Famagusta North Cyprus, Turkey
18
Cazé RD, Stimberg M. Neurons with dendrites can perform linearly separable computations with low resolution synaptic weights. F1000Res 2020; 9:1174. PMID: 33564396; PMCID: PMC7848858; DOI: 10.12688/f1000research.26486.3.
Abstract
In theory, neurons modelled as single layer perceptrons can implement all linearly separable computations. In practice, however, these computations may require arbitrarily precise synaptic weights. This is a strong constraint since both biological neurons and their artificial counterparts have to cope with limited precision. Here, we explore how non-linear processing in dendrites helps overcome this constraint. We start by finding a class of computations which requires increasing precision with the number of inputs in a perceptron and show that it can be implemented without this constraint in a neuron with sub-linear dendritic subunits. Then, we complement this analytical study by a simulation of a biophysical neuron model with two passive dendrites and a soma, and show that it can implement this computation. This work demonstrates a new role of dendrites in neural computation: by distributing the computation across independent subunits, the same computation can be performed more efficiently with less precise tuning of the synaptic weights. This work not only offers new insight into the importance of dendrites for biological neurons, but also paves the way for new, more efficient architectures of artificial neuromorphic chips.
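A toy sketch of the idea (the subunit grouping and saturation threshold are illustrative): with every weight pinned to 1, the lowest possible resolution, saturating subunits still let the neuron distinguish dispersed from clustered inputs, a computation whose perceptron implementation needs growing weight precision as inputs are added.

```python
import numpy as np

def sublinear_neuron(x, groups, theta=1.0):
    """All weights equal 1; each dendritic subunit's drive saturates at
    theta before the somatic sum, making the subunit sublinear."""
    return sum(min(float(x[g].sum()), theta) for g in groups)

groups = [np.arange(0, 2), np.arange(2, 4)]
clustered = np.array([1, 1, 0, 0])   # both inputs on one subunit  -> 1.0
dispersed = np.array([1, 0, 1, 0])   # spread across both subunits -> 2.0
# With a somatic threshold of 2, the neuron fires only for dispersed input.
```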
Affiliation(s)
- Romain D Cazé
- IEMN, CNRS UMR 8520, Villeneuve d'asq, 59650, France