1
Kromer JA, Tass PA. Simulated dataset on coordinated reset stimulation of homogeneous and inhomogeneous networks of excitatory leaky integrate-and-fire neurons with spike-timing-dependent plasticity. Data Brief 2024; 54:110345. PMID: 38586130; PMCID: PMC10998034; DOI: 10.1016/j.dib.2024.110345. Received 02/08/2024; accepted 03/12/2024. Open access.
Abstract
We present simulated data on coordinated reset stimulation (CRS) of plastic neuronal networks. The neuronal network consists of excitatory leaky integrate-and-fire neurons, and plasticity is implemented as spike-timing-dependent plasticity (STDP). A synchronized state with strong synaptic connectivity and a desynchronized state with weak synaptic connectivity coexist. CRS may drive the network from the synchronized state into a desynchronized state, inducing long-lasting desynchronization effects that persist after cessation of stimulation. This is used to model brain stimulation-induced transitions between a pathological state with abnormally strong neuronal synchrony and a physiological state, e.g., in Parkinson's disease. During CRS, a sequence of stimuli, called the CR sequence, is delivered to multiple stimulation sites. We present simulated data for the analysis of long-lasting desynchronization effects of CRS with shuffled CR sequences versus non-shuffled CR sequences, in which the order of stimulus deliveries to the sites remains unchanged throughout the entire stimulation period. Such data are presented for networks with homogeneous synaptic connectivity and networks with inhomogeneous synaptic connectivity. Homogeneous synaptic connectivity refers to a network in which the probability of a synaptic connection does not depend on the pre- and postsynaptic neurons' locations. In contrast, inhomogeneous synaptic connectivity refers to a network in which the probability of a synaptic connection depends on the neurons' locations. The presented neuronal network model was used to analyse the impact of the CR sequences and their shuffling on the long-lasting effects of CRS [1].
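The STDP rule in such leaky integrate-and-fire models is typically pair-based, with exponentially decaying windows for potentiation and depression. A minimal sketch of such a rule (parameter values are illustrative, not taken from the dataset):

```python
import math

def stdp_dw(dt, a_plus=0.005, a_minus=0.0055, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: dt = t_post - t_pre in ms.
    Pre-before-post pairings (dt > 0) potentiate; post-before-pre depress."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0

# Causal pairing strengthens the synapse, anti-causal pairing weakens it:
dw_pot = stdp_dw(10.0)   # pre fires 10 ms before post
dw_dep = stdp_dw(-10.0)  # post fires 10 ms before pre
```

With depression slightly dominating (a_minus > a_plus), uncorrelated spiking weakens synapses on average, which is what lets desynchronizing stimulation down-regulate connectivity in such models.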
Affiliation(s)
- Justus A Kromer
- Department of Neurosurgery, Stanford University, Stanford, CA, United States of America
- Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, CA, United States of America
2
Kromer JA, Tass PA. Coordinated reset stimulation of plastic neural networks with spatially dependent synaptic connections. Front Netw Physiol 2024; 4:1351815. PMID: 38863734; PMCID: PMC11165135; DOI: 10.3389/fnetp.2024.1351815. Received 12/07/2023; accepted 04/15/2024.
Abstract
Background: Abnormal neuronal synchrony is associated with several neurological disorders, including Parkinson's disease (PD), essential tremor, dystonia, and epilepsy. Coordinated reset (CR) stimulation was developed computationally to counteract abnormal neuronal synchrony. During CR stimulation, phase-shifted stimuli are delivered to multiple stimulation sites. Computational studies in plastic neural networks reported that CR stimulation drove the networks into an attractor of a stable desynchronized state by downregulating synaptic connections, which led to long-lasting desynchronization effects that outlasted stimulation. Later, corresponding long-lasting desynchronization and therapeutic effects were found in animal models of PD and in PD patients. To date, it is unclear how spatially dependent synaptic connections, as typically observed in the brain, shape CR-induced synaptic downregulation and long-lasting effects. Methods: We performed numerical simulations of networks of leaky integrate-and-fire neurons with spike-timing-dependent plasticity and spatially dependent synaptic connections to study and further improve acute and long-term responses to CR stimulation. Results: The characteristic length scale of synaptic connections relative to the distance between stimulation sites plays a key role in CR parameter adjustment. In networks with short synaptic length scales, a substantial synaptic downregulation can be achieved by selecting appropriate stimulus-related parameters, such as the stimulus amplitude and shape, regardless of the employed spatiotemporal pattern of stimulus deliveries. Complex stimulus shapes can induce local connectivity patterns in the vicinity of the stimulation sites. In contrast, in networks with longer synaptic length scales, the spatiotemporal sequence of stimulus deliveries is of major importance for synaptic downregulation. In particular, rapid shuffling of the stimulus sequence is advantageous for synaptic downregulation.
Conclusion: Our results suggest that CR stimulation parameters can be adjusted to synaptic connectivity to further improve the long-lasting effects. Furthermore, shuffling of CR sequences is advantageous for long-lasting desynchronization effects. Our work provides important hypotheses on CR parameter selection for future preclinical and clinical studies.
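The difference between shuffled and non-shuffled CRS reduces to how the per-cycle order of stimulus deliveries is generated. A sketch of both variants (site count and cycle count are arbitrary choices, not the paper's):

```python
import random

def cr_sequences(n_sites, n_cycles, shuffle, seed=0):
    """Generate the per-cycle order of stimulus deliveries to the sites.
    Non-shuffled CRS repeats one fixed sequence for the whole stimulation
    period; shuffled CRS draws a fresh random permutation each cycle."""
    rng = random.Random(seed)
    base = list(range(n_sites))
    seqs = []
    for _ in range(n_cycles):
        if shuffle:
            rng.shuffle(base)
        seqs.append(list(base))
    return seqs

fixed = cr_sequences(4, 3, shuffle=False)      # same order every cycle
shuffled = cr_sequences(4, 100, shuffle=True)  # varying orders
```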
Affiliation(s)
- Justus A. Kromer
- Department of Neurosurgery, Stanford University, Stanford, CA, United States
3
Vignoud G, Venance L, Touboul JD. Anti-Hebbian plasticity drives sequence learning in striatum. Commun Biol 2024; 7:555. PMID: 38724614; PMCID: PMC11082161; DOI: 10.1038/s42003-024-06203-8. Received 08/18/2022; accepted 04/17/2024. Open access.
Abstract
Spatio-temporal activity patterns have been observed in a variety of brain areas in spontaneous activity, prior to or during action, or in response to stimuli. The biological mechanisms endowing neurons with the ability to distinguish between different sequences remain largely unknown. Learning sequences of spikes raises multiple challenges, such as maintaining spike history in memory and discriminating partially overlapping sequences. Here, we show that anti-Hebbian spike-timing-dependent plasticity (STDP), as observed at cortico-striatal synapses, can naturally lead to learning spike sequences. We design a spiking model of the striatal output neuron receiving spike patterns defined as sequential input from a fixed set of cortical neurons. We use a simple synaptic plasticity rule that combines anti-Hebbian STDP and non-associative potentiation for a subset of the presented patterns, called rewarded patterns. We study the ability of striatal output neurons to discriminate rewarded from non-rewarded patterns by firing only after the presentation of a rewarded pattern. In particular, we show that two biological properties of striatal networks, spiking latency and collateral inhibition, contribute to an increase in accuracy by allowing a better discrimination of partially overlapping sequences. These results suggest that anti-Hebbian STDP may serve as a biological substrate for learning sequences of spikes.
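The cortico-striatal rule described here inverts the sign of the classical STDP window. A minimal sketch, with illustrative amplitude and time constant (not the paper's fitted values):

```python
import math

def anti_hebbian_dw(dt, a=0.01, tau=20.0):
    """Anti-Hebbian STDP, as reported at cortico-striatal synapses:
    the sign of the Hebbian window is reversed, so pre-before-post
    (dt = t_post - t_pre > 0) *depresses* the synapse and
    post-before-pre (dt < 0) *potentiates* it."""
    if dt > 0:
        return -a * math.exp(-dt / tau)
    elif dt < 0:
        return a * math.exp(dt / tau)
    return 0.0
```

Under this rule, inputs that reliably precede the output spike are weakened, which (combined with the non-associative potentiation of rewarded patterns described in the abstract) shapes selective firing at sequence completion.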
Affiliation(s)
- Gaëtan Vignoud
- Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, Université PSL, Paris, France
- Laurent Venance
- Center for Interdisciplinary Research in Biology (CIRB), College de France, CNRS, INSERM, Université PSL, Paris, France
- Jonathan D Touboul
- Department of Mathematics and Volen National Center for Complex Systems, Brandeis University, Waltham, MA, USA
4
Albesa-González A, Clopath C. Learning with filopodia and spines: Complementary strong and weak competition lead to specialized, graded, and protected receptive fields. PLoS Comput Biol 2024; 20:e1012110. PMID: 38743789; PMCID: PMC11125506; DOI: 10.1371/journal.pcbi.1012110. Received 08/30/2023; revised 05/24/2024; accepted 04/25/2024. Open access.
Abstract
Filopodia are thin synaptic protrusions that have long been known to play an important role in early development. Recently, they have been found to be more abundant in the adult cortex than previously thought, and more plastic than spines (button-shaped mature synapses). Inspired by these findings, we introduce a new model of synaptic plasticity that jointly describes learning of filopodia and spines. The model assumes that filopodia exhibit strongly competitive learning dynamics, similar to additive spike-timing-dependent plasticity (STDP). At the same time, it proposes that, if filopodia undergo sufficient potentiation, they consolidate into spines. Spines follow weakly competitive learning, classically associated with multiplicative, soft-bounded models of STDP. This makes spines more stable and more sensitive to the fine structure of input correlations. We show that our learning rule has a selectivity comparable to additive STDP and captures input correlations as well as multiplicative models of STDP do. We also show how it can protect previously formed memories and perform synaptic consolidation. Overall, our results can be seen as a phenomenological description of how filopodia and spines could cooperate to overcome the individual difficulties faced by strong and weak competition mechanisms.
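The strong/weak competition distinction maps onto how the weight update depends on the current weight. A sketch contrasting the two bound types (the paper's combined filopodia-to-spine rule is not reproduced here; values are arbitrary):

```python
def additive_update(w, dw, w_max=1.0):
    """Additive (hard-bounded) STDP: the update is weight-independent and
    then clipped, giving strong competition and bimodal weights."""
    return min(max(w + dw, 0.0), w_max)

def multiplicative_update(w, dw, w_max=1.0):
    """Multiplicative (soft-bounded) STDP: potentiation scales with the
    remaining headroom (w_max - w), depression with w itself, giving
    weak competition and graded, unimodal weights."""
    if dw >= 0:
        return w + dw * (w_max - w)
    return w + dw * w

# Near the upper bound, soft bounds damp further potentiation:
w_add = additive_update(0.9, 0.05)        # 0.9 + 0.05 = 0.95
w_mul = multiplicative_update(0.9, 0.05)  # 0.9 + 0.05 * 0.1 = 0.905
```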
Affiliation(s)
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, United Kingdom
5
Tian Y, Saradhi S, Bello E, Johnson MD, D’Eleuterio G, Popovic MR, Lankarany M. Model-based closed-loop control of thalamic deep brain stimulation. Front Netw Physiol 2024; 4:1356653. PMID: 38650608; PMCID: PMC11033853; DOI: 10.3389/fnetp.2024.1356653. Received 12/15/2023; accepted 03/18/2024.
Abstract
Introduction: Closed-loop control of deep brain stimulation (DBS) is beneficial for effective and automatic treatment of various neurological disorders, such as Parkinson's disease (PD) and essential tremor (ET). Manual (open-loop) DBS programming based solely on clinical observations relies on neurologists' expertise and patients' experience. Continuous stimulation in open-loop DBS may decrease battery life and cause side effects. In contrast, a closed-loop DBS system uses a feedback biomarker/signal to track worsening (or improvement) of patients' symptoms and offers several advantages over the open-loop DBS system. Existing closed-loop DBS control systems do not incorporate the physiological mechanisms underlying DBS or symptoms, e.g., how DBS modulates the dynamics of synaptic plasticity. Methods: In this work, we propose a computational framework for the development of a model-based DBS controller in which a neural model describes the relationship between DBS and neural activity and a polynomial-based approximation estimates the relationship between neural and behavioral activities. A controller is used in our model in a quasi-real-time manner to find DBS patterns that significantly reduce the worsening of symptoms. Using the proposed computational framework, these DBS patterns can be tested clinically by predicting the effect of DBS before delivering it to the patient. We applied this framework to the problem of finding optimal DBS frequencies for essential tremor given only electromyography (EMG) recordings. Building on our recent network model of the ventral intermediate nucleus (Vim), the main surgical target for tremor, in response to DBS, we developed a neural model simulation in which the physiological mechanisms underlying Vim-DBS are linked to symptomatic changes in EMG signals. Using a proportional-integral-derivative (PID) controller, we showed that a closed-loop system can track EMG signals and adjust the stimulation frequency of Vim-DBS so that the power of the EMG reaches a desired control target. Results and discussion: We demonstrated that the model-based DBS frequency aligns well with that used in clinical studies. Our model-based closed-loop system is adaptable to different control targets and can potentially be used for different diseases and personalized systems.
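The closed-loop idea can be illustrated with a textbook discrete PID loop driving a toy plant in which EMG power falls monotonically with stimulation frequency; the plant model and all gains below are assumptions for illustration, not the authors' Vim model or fitted parameters:

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, target, measured):
        err = target - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def emg_power(freq_hz):
    """Assumed toy plant (hypothetical, not the authors' model):
    tremor-band EMG power falls monotonically with DBS frequency."""
    return 1.0 / (1.0 + 0.02 * freq_hz)

pid = PID(kp=200.0, ki=50.0, kd=0.0, dt=0.1)
freq = 30.0  # Hz, initial stimulation frequency
for _ in range(200):
    u = pid.step(target=0.3, measured=emg_power(freq))
    # Measured power above the target gives a negative error; since
    # raising frequency lowers power, subtract the PID output.
    freq = min(max(freq - u, 0.0), 250.0)
```

The loop raises the frequency until the measured power settles at the control target, mirroring the abstract's description of tracking an EMG power target by adjusting Vim-DBS frequency.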
Affiliation(s)
- Yupeng Tian
- Krembil Brain Institute—University Health Network, Toronto, ON, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- KITE Research Institute, Toronto Rehabilitation Institute - University Health Network, Toronto, ON, Canada
- Srikar Saradhi
- Krembil Brain Institute—University Health Network, Toronto, ON, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- Edward Bello
- Department of Biomedical Engineering, University of Minnesota, Minneapolis, MN, United States
- Matthew D. Johnson
- Department of Biomedical Engineering, University of Minnesota, Minneapolis, MN, United States
- Milos R. Popovic
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- KITE Research Institute, Toronto Rehabilitation Institute - University Health Network, Toronto, ON, Canada
- Center for Advancing Neurotechnological Innovation to Application, University Health Network and University of Toronto, Toronto, ON, Canada
- Milad Lankarany
- Krembil Brain Institute—University Health Network, Toronto, ON, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON, Canada
- KITE Research Institute, Toronto Rehabilitation Institute - University Health Network, Toronto, ON, Canada
- Center for Advancing Neurotechnological Innovation to Application, University Health Network and University of Toronto, Toronto, ON, Canada
- Department of Physiology, University of Toronto, Toronto, ON, Canada
- Institute of Medical Science, University of Toronto, Toronto, ON, Canada
6
Jain M, Patel MJ, Liu L, Gosai J, Khemnani M, Gogoi HJ, Chee MY, Guerrero A, Lew WS, Solanki A. Insights into synaptic functionality and resistive switching in lead iodide flexible memristor devices. Nanoscale Horiz 2024; 9:438-448. PMID: 38259176; DOI: 10.1039/d3nh00505d.
Abstract
Neuromorphic platforms are gaining popularity due to their superior efficiency, low power consumption, and adaptable parallel signal-processing capabilities, overcoming the limitations of the traditional von Neumann architecture. We conduct an in-depth investigation into the factors influencing the resistive switching mechanism in memristor devices utilizing lead iodide (PbI2). We establish correlations between device performance and morphological features, unveiling synaptic-like behaviour of the device that makes it suitable for a range of flexible neuromorphic applications. Notably, a highly reliable unipolar switching mechanism is identified, exhibiting stability even under mechanical strain (with a bending radius of approximately 4 mm) and in a high-humidity environment (75% relative humidity) without the need for encapsulation. The investigation delves into the complex interplay of charge transport, ion migration, and the active interface, elucidating the factors contributing to the remarkable resistive switching observed in PbI2-based memristors. The detailed findings highlight synaptic behaviours akin to the modulation of synaptic strengths, with impressive potentiation and depression over 2 × 10⁴ cycles, emphasizing the role of spike-timing-dependent plasticity (STDP). The flexible platform demonstrates exceptional performance, achieving a simulated accuracy rate of 95.06% in recognizing patterns from the Modified National Institute of Standards and Technology (MNIST) dataset with just 30 training epochs. Ultimately, this research underscores the potential of PbI2-based flexible memristor devices as versatile components for neuromorphic computing. Moreover, it demonstrates the robustness of PbI2 memristors in terms of their resistive switching capabilities, showcasing both mechanical and electrical resilience. This underscores their potential for replicating synaptic functions in advanced information-processing systems.
Affiliation(s)
- Muskan Jain
- Department of Physics, School of Energy Technology, Pandit Deendayal Energy University, Raysan, Gandhinagar 382426, India
- Flextronics Lab, Pandit Deendayal Energy University, Gandhinagar, Gujarat 382426, India
- Mayur Jagdishbhai Patel
- Department of Chemistry, Indian Institute of Technology Guwahati, Guwahati 781039, Assam, India
- Lingli Liu
- School of Physical and Mathematical Sciences, Nanyang Technological University, 637371, Singapore
- Jeny Gosai
- Flextronics Lab, Pandit Deendayal Energy University, Gandhinagar, Gujarat 382426, India
- Department of Chemistry, School of Energy Technology, Pandit Deendayal Energy University, Raysan, Gandhinagar 382426, India
- Manish Khemnani
- Department of Physics, School of Energy Technology, Pandit Deendayal Energy University, Raysan, Gandhinagar 382426, India
- Flextronics Lab, Pandit Deendayal Energy University, Gandhinagar, Gujarat 382426, India
- Himangshu Jyoti Gogoi
- Department of Electrical Engineering, Indian Institute of Technology Guwahati, 781039 Assam, India
- Mun Yin Chee
- School of Physical and Mathematical Sciences, Nanyang Technological University, 637371, Singapore
- Antonio Guerrero
- Institute of Advanced Materials (INAM), Universitat Jaume I, 12006 Castello, Spain
- Wen Siang Lew
- School of Physical and Mathematical Sciences, Nanyang Technological University, 637371, Singapore
- Ankur Solanki
- Department of Physics, School of Energy Technology, Pandit Deendayal Energy University, Raysan, Gandhinagar 382426, India
- Flextronics Lab, Pandit Deendayal Energy University, Gandhinagar, Gujarat 382426, India
7
Lansner A, Fiebig F, Herman P. Fast Hebbian plasticity and working memory. Curr Opin Neurobiol 2023; 83:102809. PMID: 37980802; DOI: 10.1016/j.conb.2023.102809. Received 02/14/2023; revised 10/10/2023; accepted 10/19/2023.
Abstract
Theories and models of working memory (WM) have, at least since the mid-1990s, been dominated by the persistent-activity hypothesis. The past decade has seen rising concerns about the shortcomings of sustained activity as the mechanism for short-term maintenance of WM information, in light of accumulating experimental evidence for so-called activity-silent WM and the fundamental difficulty of explaining robust multi-item WM. Consequently, alternative theories are now being explored, mostly in the direction of fast synaptic plasticity as the underlying mechanism. The question of non-Hebbian versus Hebbian synaptic plasticity emerges naturally in this context. In this review, we focus on fast Hebbian plasticity and trace the origins of WM theories and models building on this form of associative learning.
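Fast Hebbian plasticity for WM can be caricatured as one-shot outer-product storage with decay, so that recently encoded items dominate retrieval without requiring persistent activity. A minimal sketch with binary ±1 patterns (all parameters arbitrary, not from the models reviewed here):

```python
def store(memory, pattern, eta=1.0, decay=0.5):
    """Fast Hebbian one-shot storage with decay: older associations fade,
    so the weight matrix acts as a short-term buffer, not long-term memory."""
    n = len(pattern)
    for i in range(n):
        for j in range(n):
            memory[i][j] = decay * memory[i][j] + eta * pattern[i] * pattern[j]
    return memory

def recall(memory, cue):
    """Retrieve by one synchronous update: sign of the weighted input."""
    n = len(cue)
    return [1 if sum(memory[i][j] * cue[j] for j in range(n)) >= 0 else -1
            for i in range(n)]

n = 8
W = [[0.0] * n for _ in range(n)]
p = [1, -1, 1, 1, -1, -1, 1, -1]
store(W, p)
noisy = list(p)
noisy[0] = -noisy[0]  # corrupt one element of the cue
```

A single corrupted cue element is cleaned up on recall, illustrating pattern completion from a rapidly written synaptic trace.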
Affiliation(s)
- Anders Lansner
- Stockholm University, Department of Mathematics, SE-106 91 Stockholm, Sweden; KTH Royal Institute of Technology, Dept of Computational Science and Technology, 100 44 Stockholm, Sweden; SeRC (Swedish e-Science Research Center), Sweden
- Florian Fiebig
- KTH Royal Institute of Technology, Dept of Computational Science and Technology, 100 44 Stockholm, Sweden
- Pawel Herman
- KTH Royal Institute of Technology, Dept of Computational Science and Technology, 100 44 Stockholm, Sweden; Digital Futures, KTH Royal Institute of Technology, Stockholm, Sweden; SeRC (Swedish e-Science Research Center), Sweden
8
Phillips RS, Baertsch NA. Interdependence of cellular and network properties in respiratory rhythmogenesis. bioRxiv 2023:2023.10.30.564834. Preprint. PMID: 37961254; PMCID: PMC10634953; DOI: 10.1101/2023.10.30.564834.
Abstract
The question of how breathing is generated by the preBötzinger Complex (preBötC) remains divided between two ideological frameworks, and the persistent sodium current (INaP) lies at the heart of this debate. Although INaP is widely expressed, the pacemaker hypothesis considers it essential because it endows a small subset of neurons with intrinsic bursting, or "pacemaker", activity. In contrast, burstlet theory considers INaP dispensable because rhythm emerges from "pre-inspiratory" spiking activity driven by feed-forward network interactions. Using computational modeling, we discover that changes in spike shape can dissociate INaP from intrinsic bursting. Consistent with many experimental benchmarks, conditional effects on spike shape during simulated changes in oxygenation, development, extracellular potassium, and temperature alter the prevalence of intrinsic bursting and pre-inspiratory spiking without altering the role of INaP. Our results support a unifying hypothesis in which INaP and excitatory network interactions, but not intrinsic bursting or pre-inspiratory spiking, are critical interdependent features of preBötC rhythmogenesis.
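INaP is conventionally written with an instantaneous Boltzmann activation gate and a slow inactivation variable. A sketch of that standard formulation (parameter values are only in the typical range of Butera-style preBötC models, not this paper's fits):

```python
import math

def m_inf(v, theta=-40.0, sigma=-6.0):
    """Steady-state INaP activation (Boltzmann); theta and sigma are
    illustrative values in the range used by Butera-style preBotC models.
    The negative sigma makes activation increase with depolarization."""
    return 1.0 / (1.0 + math.exp((v - theta) / sigma))

def i_nap(v, h, g_nap=2.8, e_na=50.0):
    """Persistent sodium current (conductance in nS, potentials in mV):
    instantaneous activation m_inf, slow inactivation gate h in [0, 1].
    Below E_Na the current is inward (negative by convention)."""
    return g_nap * m_inf(v) * h * (v - e_na)
```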
Affiliation(s)
- Ryan S Phillips
- Center for Integrative Brain Research, Seattle Children's Research Institute, Seattle, WA, USA
- Nathan A Baertsch
- Center for Integrative Brain Research, Seattle Children's Research Institute, Seattle, WA, USA
- Pulmonary, Critical Care and Sleep Medicine, Department of Pediatrics, University of Washington, Seattle, WA, USA
- Department of Physiology and Biophysics, University of Washington, Seattle, WA, USA
9
Madar A, Dong C, Sheffield M. BTSP, not STDP, drives shifts in hippocampal representations during familiarization. bioRxiv 2023:2023.10.17.562791. Preprint. PMID: 37904999; PMCID: PMC10614909; DOI: 10.1101/2023.10.17.562791.
Abstract
Synaptic plasticity is widely thought to support memory storage in the brain, but the rules determining impactful synaptic changes in vivo are not known. We considered the trial-by-trial shifting dynamics of hippocampal place fields (PFs) as an indicator of ongoing plasticity during memory formation. By implementing different plasticity rules in computational models of spiking place cells and comparing them to experimentally measured PFs from mice navigating familiar and novel environments, we found that behavioral-timescale synaptic plasticity (BTSP), rather than Hebbian spike-timing-dependent plasticity (STDP), is the principal mechanism governing PF shifting dynamics. BTSP-triggering events are rare but more frequent during novel experiences. During exploration, their probability is dynamic: it decays after PF onset but continually drives a population-level representational drift. Finally, our results show that BTSP occurs in CA3 but is less frequent and phenomenologically different than in CA1. Overall, our study provides a new framework for understanding how synaptic plasticity shapes neuronal representations during learning.
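BTSP differs from STDP mainly in its timescale and in being gated by rare plateau events rather than by every postsynaptic spike. A sketch of a plateau-triggered update with a seconds-long asymmetric eligibility kernel (time constants and learning rate are rough illustrative values, not the paper's fits):

```python
import math

def btsp_kernel(dt_s, tau_pre=1.3, tau_post=0.7):
    """Seconds-long, asymmetric BTSP eligibility kernel:
    dt_s = plateau_time - input_time, in seconds. Inputs active before
    the plateau (dt_s > 0) are credited over a longer window than
    inputs active after it."""
    if dt_s >= 0:
        return math.exp(-dt_s / tau_pre)
    return math.exp(dt_s / tau_post)

def btsp_update(w, pre_times, plateau_time, lr=0.3, w_max=1.0):
    """One plateau event potentiates all recently active inputs toward
    w_max, regardless of millisecond-scale spike order (unlike STDP)."""
    drive = sum(btsp_kernel(plateau_time - t) for t in pre_times)
    return min(w + lr * drive * (w_max - w), w_max)

w0 = 0.1
w1 = btsp_update(w0, pre_times=[8.0, 9.5], plateau_time=10.0)
```

Because the kernel spans seconds, a single plateau can bind inputs that were active an entire behavioral episode before it, which is what lets BTSP shift whole place fields trial by trial.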
Affiliation(s)
- A.D. Madar
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- C. Dong
- Department of Neurobiology, Neuroscience Institute, University of Chicago
- Current affiliation: Department of Neurobiology, Stanford University School of Medicine
- M.E.J. Sheffield
- Department of Neurobiology, Neuroscience Institute, University of Chicago
10
Fang W, Chen Y, Ding J, Yu Z, Masquelier T, Chen D, Huang L, Zhou H, Li G, Tian Y. SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence. Sci Adv 2023; 9:eadi1480. PMID: 37801497; PMCID: PMC10558124; DOI: 10.1126/sciadv.adi1480. Received 04/06/2023; accepted 09/05/2023. Open access.
Abstract
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties. As the emerging spiking deep learning paradigm attracts increasing interest, traditional programming frameworks cannot meet the demands of automatic differentiation, accelerated parallel computation, and tight integration of neuromorphic dataset processing and deployment. In this work, we present the SpikingJelly framework to address this dilemma. We contribute a full-stack toolkit for preprocessing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips. Compared to existing methods, the training of deep SNNs can be accelerated 11×, and the superior extensibility and flexibility of SpikingJelly enable users to accelerate custom models at low cost through multilevel inheritance and semiautomatic code generation. SpikingJelly paves the way for synthesizing truly energy-efficient SNN-based machine intelligence systems, which will enrich the ecology of neuromorphic computing.
Affiliation(s)
- Wei Fang
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
- School of Electronic and Computer Engineering, Shenzhen Graduate School, Peking University, China
- Yanqi Chen
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
- Jianhao Ding
- School of Computer Science, Peking University, China
- Zhaofei Yu
- Institute for Artificial Intelligence, Peking University, China
- Timothée Masquelier
- Centre de Recherche Cerveau et Cognition (CERCO), UMR5549 CNRS–Université Toulouse 3, France
- Ding Chen
- Peng Cheng Laboratory, China
- Department of Computer Science and Engineering, Shanghai Jiao Tong University, China
- Liwei Huang
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
- Guoqi Li
- Institute of Automation, Chinese Academy of Sciences, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, China
- Yonghong Tian
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
- School of Electronic and Computer Engineering, Shenzhen Graduate School, Peking University, China
11
Nicoll RA, Schulman H. Synaptic memory and CaMKII. Physiol Rev 2023; 103:2877-2925. PMID: 37290118; PMCID: PMC10642921; DOI: 10.1152/physrev.00034.2022. Received 10/20/2022; revised 04/26/2023; accepted 04/30/2023. Open access.
Abstract
Ca2+/calmodulin-dependent protein kinase II (CaMKII) and long-term potentiation (LTP) were discovered within a decade of each other and have been inextricably intertwined ever since. However, like many marriages, it has had its ups and downs. Based on the unique biochemical properties of CaMKII, it was proposed as a memory molecule before any physiological linkage was made to LTP. However, as reviewed here, the convincing linkage of CaMKII to synaptic physiology and behavior took many decades. New technologies were critical in this journey, including in vitro brain slices, mouse genetics, single-cell molecular genetics, pharmacological reagents, protein structure, and two-photon microscopy, as were new investigators attracted by the exciting challenge. This review tracks this journey and assesses the state of this marriage 40 years on. The collective literature impels us to propose a relatively simple model for synaptic memory involving the following steps that drive the process: 1) Ca2+ entry through N-methyl-d-aspartate (NMDA) receptors activates CaMKII. 2) CaMKII undergoes autophosphorylation, resulting in constitutive, Ca2+-independent activity and exposure of a binding site for the NMDA receptor subunit GluN2B. 3) Active CaMKII translocates to the postsynaptic density (PSD) and binds to the cytoplasmic C-tail of GluN2B. 4) The CaMKII-GluN2B complex initiates a structural rearrangement of the PSD that may involve liquid-liquid phase separation. 5) This rearrangement involves the PSD-95 scaffolding protein, α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid receptors (AMPARs), and their transmembrane AMPAR-regulatory protein (TARP) auxiliary subunits, resulting in an accumulation of AMPARs in the PSD that underlies synaptic potentiation. 6) The stability of the modified PSD is maintained by the stability of the CaMKII-GluN2B complex. 7) By a process of subunit exchange or interholoenzyme phosphorylation, CaMKII maintains synaptic potentiation in the face of CaMKII protein turnover. There are many other important proteins that participate in enlargement of the synaptic spine or modulation of the steps that drive and maintain the potentiation. In this review we critically discuss the data underlying each of these steps. As will become clear, some of the steps are more firmly grounded than others, and we provide suggestions as to how the evidence supporting them can be strengthened or, based on new data, replaced. Although the journey has been a long one, the prospect of having a detailed cellular and molecular understanding of learning and memory is at hand.
Affiliation(s)
- Roger A Nicoll
- Department of Cellular and Molecular Pharmacology, University of California at San Francisco, San Francisco, California, United States
- Howard Schulman
- Department of Neurobiology, Stanford University School of Medicine, Stanford, California, United States
- Panorama Research Institute, Sunnyvale, California, United States
12
Zhu L, Mangan M, Webb B. Neuromorphic sequence learning with an event camera on routes through vegetation. Sci Robot 2023; 8:eadg3679. PMID: 37756384; DOI: 10.1126/scirobotics.adg3679. Received 12/30/2022; accepted 08/29/2023.
Abstract
For many robotics applications, it is desirable to have relatively low-power and efficient onboard solutions. We took inspiration from insects, such as ants, that are capable of learning and following routes in complex natural environments using relatively constrained sensory and neural systems. Such capabilities are particularly relevant to applications such as agricultural robotics, where visual navigation through dense vegetation remains a challenging task. In this scenario, a route is likely to have high self-similarity and be subject to changing lighting conditions and motion over uneven terrain, and the effects of wind on leaves increase the variability of the input. We used a bioinspired event camera on a terrestrial robot to collect visual sequences along routes in natural outdoor environments and applied a neural algorithm for spatiotemporal memory that is closely based on a known neural circuit in the insect brain. We show that this method can plausibly support route recognition for visual navigation and is more robust than SeqSLAM when evaluated on repeated runs of the same route or on routes with small lateral offsets. By encoding memory in a spiking neural network running on a neuromorphic computer, our model can evaluate visual familiarity in real time from event camera footage.
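Sequence-based route recognition of the SeqSLAM flavor can be reduced to sliding a short query sequence along the stored route and scoring summed frame distances. A toy sketch (omitting SeqSLAM's contrast enhancement and velocity search, and the paper's spiking implementation; the 1-D "frames" are invented for illustration):

```python
def sad(a, b):
    """Sum of absolute differences between two flattened image frames."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_match(query_seq, route):
    """Slide the query sequence along the stored route and return the
    offset with the lowest summed frame distance. Matching sequences
    rather than single frames suppresses aliasing on self-similar routes."""
    n = len(query_seq)
    scores = [sum(sad(q, route[off + k]) for k, q in enumerate(query_seq))
              for off in range(len(route) - n + 1)]
    return min(range(len(scores)), key=scores.__getitem__)

# Toy "route" of 1-D frames and a noisy revisit of frames 3..5:
route = [[i, i + 1, i + 2] for i in range(10)]
query = [[3.1, 4.0, 4.9], [4.0, 5.1, 6.0], [5.0, 5.9, 7.1]]
```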
Affiliation(s)
- Le Zhu
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, S1 4DP Sheffield, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
13
Park TJ, Deng S, Manna S, Islam ANMN, Yu H, Yuan Y, Fong DD, Chubykin AA, Sengupta A, Sankaranarayanan SKRS, Ramanathan S. Complex Oxides for Brain-Inspired Computing: A Review. Adv Mater 2023; 35:e2203352. [PMID: 35723973 DOI: 10.1002/adma.202203352] [Citation(s) in RCA: 6] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/13/2022] [Revised: 06/02/2022] [Indexed: 06/15/2023]
Abstract
The fields of brain-inspired computing, robotics, and, more broadly, artificial intelligence (AI) seek to implement knowledge gleaned from the natural world into human-designed electronics and machines. In this review, the opportunities presented by complex oxides, a class of electronic ceramic materials whose properties can be elegantly tuned by doping, electron interactions, and a variety of external stimuli near room temperature, are discussed. The review begins with a discussion of natural intelligence at the elementary level in the nervous system, followed by collective intelligence and learning at the animal colony level mediated by social interactions. An important aspect highlighted is the vast spatial and temporal scales involved in learning and memory. The focus then turns to collective phenomena, such as metal-to-insulator transitions (MITs), ferroelectricity, and related examples, to highlight recent demonstrations of artificial neurons, synapses, and circuits and their learning. First-principles theoretical treatments of the electronic structure and in situ synchrotron spectroscopy of operating devices are then discussed. The implementation of the experimental characteristics into neural networks and algorithm design is then reviewed. Finally, outstanding materials challenges that require a microscopic understanding of the physical mechanisms, which will be essential for advancing the frontiers of neuromorphic computing, are highlighted.
Affiliation(s)
- Tae Joon Park
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Sunbin Deng
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Sukriti Manna
- Center for Nanoscale Materials, Argonne National Laboratory, Argonne, IL, 60439, USA
- A N M Nafiul Islam
- Department of Electrical Engineering, The Pennsylvania State University, University Park, PA, 16802, USA
- Haoming Yu
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Yifan Yuan
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Dillon D Fong
- Materials Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Alexander A Chubykin
- Department of Biological Sciences, Purdue Institute for Integrative Neuroscience, Purdue University, West Lafayette, IN, 47907, USA
- Abhronil Sengupta
- Department of Electrical Engineering, The Pennsylvania State University, University Park, PA, 16802, USA
- Subramanian K R S Sankaranarayanan
- Center for Nanoscale Materials, Argonne National Laboratory, Argonne, IL, 60439, USA
- Department of Mechanical and Industrial Engineering, University of Illinois Chicago, Chicago, IL, 60607, USA
- Shriram Ramanathan
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
14
Li Q, Pang Y, Wang Y, Han X, Li Q, Zhao M. CBMC: A Biomimetic Approach for Control of a 7-Degree of Freedom Robotic Arm. Biomimetics (Basel) 2023; 8:389. [PMID: 37754140 PMCID: PMC10526988 DOI: 10.3390/biomimetics8050389] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2023] [Revised: 08/23/2023] [Accepted: 08/24/2023] [Indexed: 09/28/2023] Open
Abstract
Many approaches inspired by brain science have been proposed for robotic control, specifically targeting situations where knowledge of the dynamic model is unavailable. This is crucial because dynamic model inaccuracies and variations can occur during the robot's operation. In this paper, inspired by the central nervous system (CNS), we present a CNS-based Biomimetic Motor Control (CBMC) approach consisting of four modules. The first module consists of a cerebellum-like spiking neural network that employs spike-timing-dependent plasticity to learn the dynamics mechanisms and adjust the synapses connecting the spiking neurons. The second module, constructed using an artificial neural network that mimics the cerebral cortex's regulation of the cerebellum in the CNS, learns by reinforcement learning to supervise the cerebellum module with instructive input. The third and fourth modules are the cerebral sensory module and the spinal cord module, which handle sensory input and modulate torque commands, respectively. To validate our method, CBMC was applied to the trajectory tracking control of a 7-DoF robotic arm in simulation. Finally, experiments were conducted on the robotic arm using various payloads, and the results clearly demonstrate the effectiveness of the proposed methodology.
Affiliation(s)
- Qingkai Li
- Department of Automation, Tsinghua University, Beijing 100084, China
- Yanbo Pang
- Department of Automation, Tsinghua University, Beijing 100084, China
- Yushi Wang
- Department of Automation, Tsinghua University, Beijing 100084, China
- Xinyu Han
- Department of Automation, Tsinghua University, Beijing 100084, China
- Qing Li
- Department of Automation, Tsinghua University, Beijing 100084, China
- Mingguo Zhao
- Department of Automation, Tsinghua University, Beijing 100084, China
- Beijing Innovation Center for Future Chips, Tsinghua University, Beijing 100084, China
15
Li KT, Ji D, Zhou C. Memory rescue and learning in synaptic impaired neuronal circuits. iScience 2023; 26:106931. [PMID: 37534172 PMCID: PMC10391582 DOI: 10.1016/j.isci.2023.106931] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/23/2022] [Revised: 04/05/2023] [Accepted: 05/16/2023] [Indexed: 08/04/2023] Open
Abstract
Neuronal impairment is a characteristic of Alzheimer's disease (AD), but its effect on the neural activity dynamics underlying memory deficits is unclear. Here, we studied the effects of synaptic impairment on neural activities associated with memory recall, memory rescue, and learning a new memory in an integrate-and-fire neuronal network. Our results showed that reducing connectivity decreases the neuronal synchronization of memory neurons and impairs memory recall performance. Although slow-gamma stimulation rescued memory recall and slow-gamma oscillations, the rescue caused the side effect of activating mixed memories. During the learning of a new memory, reducing connectivity impaired storage of the new memory but did not affect previously stored memories. We also explored the effects of other types of impairment, including neuronal loss and excitation-inhibition imbalance, and rescue by a general increase of excitability. Our results reveal potential computational mechanisms underlying the memory deficits caused by synaptic impairment in AD.
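The integrate-and-fire dynamics referenced in this entry can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron. This is a generic textbook sketch with illustrative parameter values, not code or parameters from the cited study.

```python
import numpy as np

def simulate_lif(current, dt=0.1, tau_m=10.0, v_rest=-65.0,
                 v_reset=-65.0, v_th=-50.0, r_m=10.0):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron.

    current: input current per time step (nA); returns spike times (ms).
    All parameter values are illustrative defaults.
    """
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(current):
        # Membrane equation: tau_m * dv/dt = -(v - v_rest) + r_m * I
        v += dt * (-(v - v_rest) + r_m * i_ext) / tau_m
        if v >= v_th:                  # threshold crossing -> spike
            spike_times.append(step * dt)
            v = v_reset                # reset after the spike
    return spike_times
```

With a constant suprathreshold current the neuron fires periodically; with zero input it stays at rest and emits no spikes.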
Affiliation(s)
- Kwan Tung Li
- Department of Physics, Centre for Nonlinear Studies, Beijing–Hong Kong–Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Hong Kong, China
- Research Center for Augmented Intelligence, Research Institute of Artificial Intelligence, Zhejiang Lab, Hangzhou 311100, China
- Daoyun Ji
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA
- Department of Molecular and Cellular Biology, Baylor College of Medicine, Houston, TX 77030, USA
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies, Beijing–Hong Kong–Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Hong Kong, China
16
Lobov SA, Berdnikova ES, Zharinov AI, Kurganov DP, Kazantsev VB. STDP-Driven Rewiring in Spiking Neural Networks under Stimulus-Induced and Spontaneous Activity. Biomimetics (Basel) 2023; 8:320. [PMID: 37504208 PMCID: PMC10807410 DOI: 10.3390/biomimetics8030320] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2023] [Revised: 07/18/2023] [Accepted: 07/19/2023] [Indexed: 07/29/2023] Open
Abstract
Mathematical and computer simulations of learning in living neural networks have typically focused on changes in the efficiency of synaptic connections, represented by synaptic weights in the models. Synaptic plasticity is believed to be the cellular basis of learning and memory. In spiking neural networks composed of dynamical spiking units, a biologically relevant learning rule is the so-called spike-timing-dependent plasticity, or STDP. However, experimental data suggest that synaptic plasticity is only one part of brain circuit plasticity, which also includes homeostatic and structural plasticity. The model of structural plasticity proposed in this study is based on the activity-dependent appearance and disappearance of synaptic connections. The results indicate that such adaptive rewiring enables the consolidation of the effects of STDP in response to local external stimulation of a neural network. Subsequently, a vector-field approach is used to demonstrate the successive "recording" of spike paths in the functional and synaptic connectomes, and finally in the anatomical connectome of the network. Moreover, the findings suggest that adaptive rewiring could stabilize network dynamics over time with respect to the reproducibility of activity patterns. A universal measure of such reproducibility, introduced in this article, is based on the similarity between successive patterns of the vector fields characterizing both the functional and anatomical connectomes.
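The pair-based STDP rule referenced in this and neighboring entries can be sketched as an exponentially windowed weight update. The amplitudes and time constants below are illustrative textbook values, not taken from any cited work.

```python
import math

def stdp_update(w, delta_t, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update for one pre/post spike pair.

    delta_t = t_post - t_pre (ms). Pre-before-post (delta_t > 0)
    potentiates; post-before-pre (delta_t <= 0) depresses. The weight
    is clipped to [w_min, w_max]. Parameter values are illustrative.
    """
    if delta_t > 0:
        w += a_plus * math.exp(-delta_t / tau_plus)    # potentiation
    else:
        w -= a_minus * math.exp(delta_t / tau_minus)   # depression
    return min(max(w, w_min), w_max)
```

The update shrinks exponentially with the spike-time lag, so only near-coincident pre/post pairs change the synapse appreciably.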
Affiliation(s)
- Sergey A. Lobov
- Laboratory of Neurobiomorphic Technologies, The Moscow Institute of Physics and Technology, 117303 Moscow, Russia
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Ekaterina S. Berdnikova
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Alexey I. Zharinov
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Dmitry P. Kurganov
- Laboratory of Neuromodeling, Samara State Medical University, 443079 Samara, Russia
- Victor B. Kazantsev
- Laboratory of Neurobiomorphic Technologies, The Moscow Institute of Physics and Technology, 117303 Moscow, Russia
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 603022 Nizhny Novgorod, Russia
- Laboratory of Neuromodeling, Samara State Medical University, 443079 Samara, Russia
17
Kim D, Choi J, Cheon M, Jeong Y, Kim J, Kwak JY, Park JK, Lee S, Kim I, Park J. Real-time Neural Connectivity Inference with Presynaptic Spike-driven Spike Timing-Dependent Plasticity. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. [PMID: 38082930 DOI: 10.1109/embc40787.2023.10341017] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/18/2023]
Abstract
Brain-like artificial intelligence in electronics can be built efficiently by understanding the connectivity of neuronal circuitry. Recent studies have explored neural connectivity inference with two-dimensional crossbar memristor arrays; however, large-scale implementation is challenging owing to device variations and the requirement of online parameter adaptation. This study proposes a neural connectivity inference method with one-dimensional spiking neurons using spike timing-dependent plasticity and presynaptic spike-driven spike timing-dependent plasticity learning rules, designed for a large-scale neuromorphic system. The proposed learning process decreases the number of spiking neurons by half. We simulate 12 ground-truth neural networks comprising eight and 64 one-dimensional neurons. We analyze the correlation between the neural connectivity of the ground-truth and spiking neural networks using the Matthews correlation coefficient. In addition, we analyze the sensitivity and specificity of the inference. Validation using the presynaptic spike-driven spike timing-dependent plasticity learning rule suggests a viable approach to large-scale neural connectivity inference in hardware realizations of large-scale neuromorphic systems.
18
Schmidgall S, Hays J. Meta-SpikePropamine: learning to learn with synaptic plasticity in spiking neural networks. Front Neurosci 2023; 17:1183321. [PMID: 37250397 PMCID: PMC10213417 DOI: 10.3389/fnins.2023.1183321] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2023] [Accepted: 04/06/2023] [Indexed: 05/31/2023] Open
Abstract
We propose that in order to harness our understanding of neuroscience toward machine learning, we must first have powerful tools for training brain-like models of learning. Although substantial progress has been made toward understanding the dynamics of learning in the brain, neuroscience-derived models of learning have yet to demonstrate the same performance capabilities as methods in deep learning such as gradient descent. Inspired by the successes of machine learning using gradient descent, we introduce a bi-level optimization framework that seeks to both solve online learning tasks and improve the ability to learn online using models of plasticity from neuroscience. We demonstrate that models of three-factor learning with synaptic plasticity taken from the neuroscience literature can be trained in Spiking Neural Networks (SNNs) with gradient descent via a framework of learning-to-learn to address challenging online learning problems. This framework opens a new path toward developing neuroscience inspired online learning algorithms.
Affiliation(s)
- Samuel Schmidgall
- U.S. Naval Research Laboratory, Spacecraft Engineering Department, Washington, DC, United States
- Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD, United States
- Joe Hays
- U.S. Naval Research Laboratory, Spacecraft Engineering Department, Washington, DC, United States
19
Bouhadjar Y, Wouters DJ, Diesmann M, Tetzlaff T. Coherent noise enables probabilistic sequence replay in spiking neuronal networks. PLoS Comput Biol 2023; 19:e1010989. [PMID: 37130121 PMCID: PMC10153753 DOI: 10.1371/journal.pcbi.1010989] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2022] [Accepted: 03/02/2023] [Indexed: 05/03/2023] Open
Abstract
Animals rely on different decision strategies when faced with ambiguous or uncertain cues. Depending on the context, decisions may be biased towards events that were most frequently experienced in the past, or be more explorative. A particular type of decision making central to cognition is sequential memory recall in response to ambiguous cues. A previously developed spiking neuronal network implementation of sequence prediction and recall learns complex, high-order sequences in an unsupervised manner by local, biologically inspired plasticity rules. In response to an ambiguous cue, the model deterministically recalls the sequence shown most frequently during training. Here, we present an extension of the model enabling a range of different decision strategies. In this model, explorative behavior is generated by supplying neurons with noise. As the model relies on population encoding, uncorrelated noise averages out, and the recall dynamics remain effectively deterministic. In the presence of locally correlated noise, the averaging effect is avoided without impairing the model performance, and without the need for large noise amplitudes. We investigate two forms of correlated noise occurring in nature: shared synaptic background inputs, and random locking of the stimulus to spatiotemporal oscillations in the network activity. Depending on the noise characteristics, the network adopts various recall strategies. This study thereby provides potential mechanisms explaining how the statistics of learned sequences affect decision making, and how decision strategies can be adjusted after learning.
Affiliation(s)
- Younes Bouhadjar
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Peter Grünberg Institute (PGI-7,10), Jülich Research Centre and JARA, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Dirk J Wouters
- Institute of Electronic Materials (IWE 2) & JARA-FIT, RWTH Aachen University, Aachen, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, & Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), & Institute for Advanced Simulation (IAS-6), & JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
20
Lube AJ, Ma X, Carlson BA. Spike timing-dependent plasticity alters electrosensory neuron synaptic strength in vitro but does not consistently predict changes in sensory tuning in vivo. J Neurophysiol 2023; 129:1127-1144. [PMID: 37073981 PMCID: PMC10151048 DOI: 10.1152/jn.00498.2022] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2022] [Revised: 04/12/2023] [Accepted: 04/13/2023] [Indexed: 04/20/2023] Open
Abstract
How do sensory systems optimize detection of behaviorally relevant stimuli when the sensory environment is constantly changing? We addressed the role of spike timing-dependent plasticity (STDP) in driving changes in synaptic strength in a sensory pathway and whether those changes in synaptic strength could alter sensory tuning. It is challenging to precisely control temporal patterns of synaptic activity in vivo and replicate those patterns in vitro in behaviorally relevant ways. This makes it difficult to make connections between STDP-induced changes in synaptic physiology and plasticity in sensory systems. Using the mormyrid species Brevimyrus niger and Brienomyrus brachyistius, which produce electric organ discharges for electrolocation and communication, we can precisely control the timing of synaptic input in vivo and replicate these same temporal patterns of synaptic input in vitro. In central electrosensory neurons in the electric communication pathway, using whole cell intracellular recordings in vitro, we paired presynaptic input with postsynaptic spiking at different delays. Using whole cell intracellular recordings in awake, behaving fish, we paired sensory stimulation with postsynaptic spiking using the same delays. We found that Hebbian STDP predictably alters sensory tuning in vitro and is mediated by NMDA receptors. However, the change in synaptic responses induced by sensory stimulation in vivo did not adhere to the direction predicted by the STDP observed in vitro. Further analysis suggests that this difference is influenced by polysynaptic activity, including inhibitory interneurons. Our findings suggest that STDP rules operating at identified synapses may not drive predictable changes in sensory responses at the circuit level. NEW & NOTEWORTHY: We replicated behaviorally relevant temporal patterns of synaptic activity in vitro and used the same patterns during sensory stimulation in vivo. There was a Hebbian spike timing-dependent plasticity (STDP) pattern in vitro, but sensory responses in vivo did not shift according to STDP predictions. Analysis suggests that this disparity is influenced by differences in polysynaptic activity, including inhibitory interneurons. These results suggest that STDP rules at synapses in vitro do not necessarily apply to circuits in vivo.
Affiliation(s)
- Adalee J Lube
- Department of Biology, Washington University in St. Louis, St. Louis, Missouri, United States
- Xiaofeng Ma
- Department of Biology, Washington University in St. Louis, St. Louis, Missouri, United States
- Bruce A Carlson
- Department of Biology, Washington University in St. Louis, St. Louis, Missouri, United States
21
Liu J, Wang Y, Luo Y, Zhang S, Jiang D, Hua Y, Qin S, Yang S. Hardware Spiking Neural Networks with Pair-Based STDP Using Stochastic Computing. Neural Process Lett 2023. [DOI: 10.1007/s11063-023-11255-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 04/08/2023]
22
Deng S, Yu H, Park TJ, Islam AN, Manna S, Pofelski A, Wang Q, Zhu Y, Sankaranarayanan SK, Sengupta A, Ramanathan S. Selective area doping for Mott neuromorphic electronics. Sci Adv 2023; 9:eade4838. [PMID: 36930716 PMCID: PMC10022892 DOI: 10.1126/sciadv.ade4838] [Citation(s) in RCA: 6] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 08/18/2022] [Accepted: 02/16/2023] [Indexed: 06/18/2023]
Abstract
The cointegration of artificial neuronal and synaptic devices with homotypic materials and structures can greatly simplify the fabrication of neuromorphic hardware. We demonstrate experimental realization of vanadium dioxide (VO2) artificial neurons and synapses on the same substrate through selective area carrier doping. By locally configuring pairs of catalytic and inert electrodes that enable nanoscale control over carrier density, volatility or nonvolatility can be appropriately assigned to each two-terminal Mott memory device per lithographic design, and both neuron- and synapse-like devices are successfully integrated on a single chip. Feedforward excitation and inhibition neural motifs are demonstrated at hardware level, followed by simulation of network-level handwritten digit and fashion product recognition tasks with experimental characteristics. Spatially selective electron doping opens up previously unidentified avenues for integration of emerging correlated semiconductors in electronic device technologies.
Affiliation(s)
- Sunbin Deng
- School of Materials Engineering, Purdue University, West Lafayette, IN 47907, USA
- Haoming Yu
- School of Materials Engineering, Purdue University, West Lafayette, IN 47907, USA
- Tae Joon Park
- School of Materials Engineering, Purdue University, West Lafayette, IN 47907, USA
- A. N. M. Nafiul Islam
- School of Electrical Engineering and Computer Science, The Pennsylvania State University, University Park, PA 16802, USA
- Sukriti Manna
- Center for Nanoscale Materials, Argonne National Laboratory, Lemont, IL 60439, USA
- Department of Mechanical and Industrial Engineering, University of Illinois, Chicago, IL 60607, USA
- Alexandre Pofelski
- Department of Condensed Matter Physics and Materials Science, Brookhaven National Laboratory, Upton, NY 11973, USA
- Qi Wang
- School of Materials Engineering, Purdue University, West Lafayette, IN 47907, USA
- Yimei Zhu
- Department of Condensed Matter Physics and Materials Science, Brookhaven National Laboratory, Upton, NY 11973, USA
- Subramanian K. R. S. Sankaranarayanan
- Center for Nanoscale Materials, Argonne National Laboratory, Lemont, IL 60439, USA
- Department of Mechanical and Industrial Engineering, University of Illinois, Chicago, IL 60607, USA
- Abhronil Sengupta
- School of Electrical Engineering and Computer Science, The Pennsylvania State University, University Park, PA 16802, USA
- Shriram Ramanathan
- School of Materials Engineering, Purdue University, West Lafayette, IN 47907, USA
23
Zhao Y, Lin X, Zhang Z, Wang X, He X, Yang L. STDP-based adaptive graph convolutional networks for automatic sleep staging. Front Neurosci 2023; 17:1158246. [PMID: 37152593 PMCID: PMC10157055 DOI: 10.3389/fnins.2023.1158246] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2023] [Accepted: 04/03/2023] [Indexed: 05/09/2023] Open
Abstract
Automatic sleep staging is important for improving diagnosis and treatment, and machine learning with neuroscience explainability has been shown to be a suitable approach to this problem. In this paper, an explainable model for automatic sleep staging is proposed. Inspired by Spike-Timing-Dependent Plasticity (STDP), an adaptive Graph Convolutional Network (GCN), named STDP-GCN, is established to extract features from the polysomnography (PSG) signal. In detail, each channel of the PSG signal can be regarded as a neuron, the synapse strength between neurons can be constructed by the STDP mechanism, and the connections between different channels of the PSG signal constitute a graph structure. After utilizing the GCN to extract spatial features, temporal convolution is used to extract transition rules between sleep stages, and a fully connected neural network is used for classification. To improve robustness and minimize the effect of individual physiological signal discrepancies on classification accuracy, STDP-GCN utilizes domain adversarial training. Experiments demonstrate that the performance of STDP-GCN is comparable to current state-of-the-art models.
24
Yan X, Qian JH, Sangwan VK, Hersam MC. Progress and Challenges for Memtransistors in Neuromorphic Circuits and Systems. Adv Mater 2022; 34:e2108025. [PMID: 34813677 DOI: 10.1002/adma.202108025] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/07/2021] [Revised: 11/07/2021] [Indexed: 06/13/2023]
Abstract
Due to the increasing importance of artificial intelligence (AI), significant recent effort has been devoted to the development of neuromorphic circuits that seek to emulate the energy-efficient information processing of the brain. While non-volatile memory (NVM) based on resistive switches, phase-change memory, and magnetic tunnel junctions has shown potential for implementing neural networks, additional multi-terminal device concepts are required for more sophisticated bio-realistic functions. Of particular interest are memtransistors based on low-dimensional nanomaterials, which are capable of electrostatically tuning memory and learning behavior at the device level. Herein, a conceptual overview of the memtransistor is provided in the context of neuromorphic circuits. Recent progress is surveyed for memtransistors and related multi-terminal NVM devices including dual-gated floating-gate memories, dual-gated ferroelectric transistors, and dual-gated van der Waals heterojunctions. The different materials systems and device architectures are classified based on the degree of control and relative tunability of synaptic behavior, with an emphasis on device concepts that harness the reduced dimensionality, weak electrostatic screening, and phase-change properties of nanomaterials. Finally, strategies for achieving wafer-scale integration of memtransistors and multi-terminal NVM devices are delineated, with specific attention given to the materials challenges for practical neuromorphic circuits.
Affiliation(s)
- Xiaodong Yan
- Department of Materials Science and Engineering, Northwestern University, Evanston, IL, 60208, USA
- Justin H Qian
- Department of Materials Science and Engineering, Northwestern University, Evanston, IL, 60208, USA
- Vinod K Sangwan
- Department of Materials Science and Engineering, Northwestern University, Evanston, IL, 60208, USA
- Mark C Hersam
- Department of Materials Science and Engineering, Northwestern University, Evanston, IL, 60208, USA
- Department of Electrical and Computer Engineering, Northwestern University, Evanston, IL, 60208, USA
- Department of Chemistry, Northwestern University, Evanston, IL, 60208, USA
25
Ratas I, Pyragas K. Interplay of different synchronization modes and synaptic plasticity in a system of class I neurons. Sci Rep 2022; 12:19631. [PMID: 36385488 PMCID: PMC9668974 DOI: 10.1038/s41598-022-24001-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2022] [Accepted: 11/08/2022] [Indexed: 11/17/2022] Open
Abstract
We analyze the effect of spike-timing-dependent plasticity (STDP) on a system of pulse-coupled class I neurons. Our research begins with a system of two mutually connected quadratic integrate-and-fire (QIF) neurons, which are canonical representatives of class I neurons. Along with various asymptotic modes previously observed in other neuronal models with plastic synapses, we found a stable synchronous mode characterized by a unidirectional link from a slower neuron to a faster neuron. In this frequency-locked mode, the faster neuron emits multiple spikes per cycle of the slower neuron. We analytically obtain the Arnold tongues for this mode with and without STDP. We also consider larger plastic networks of QIF neurons and show that the detected mode can manifest itself in such a way that slow neurons become pacemakers. As a result, slow and fast neurons can form large synchronous clusters that generate low-frequency oscillations. We demonstrate the generality of the results obtained with two connected QIF neurons using the Wang-Buzsáki and Morris-Lecar biophysically plausible class I neuron models.
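The QIF neuron discussed above, the canonical class I model, can be sketched in normalized form as dv/dt = v² + I, with finite peak and reset values standing in for ±∞. This is a generic illustration with illustrative parameters, not code from the cited study.

```python
def simulate_qif(i_ext, t_max=100.0, dt=0.01, v_peak=10.0, v_reset=-10.0):
    """Quadratic integrate-and-fire neuron, dv/dt = v^2 + I (normalized units).

    For i_ext > 0 the neuron fires periodically with a rate that vanishes
    as i_ext -> 0+, the hallmark of class I excitability. For i_ext < 0
    the neuron settles at a stable fixed point and stays silent.
    Peak/reset values are illustrative stand-ins for +/- infinity.
    """
    v = v_reset
    spike_times = []
    t = 0.0
    while t < t_max:
        v += dt * (v * v + i_ext)   # forward-Euler step
        if v >= v_peak:             # spike and reset
            spike_times.append(t)
            v = v_reset
        t += dt
    return spike_times
```

For the membrane potential to escape to the peak, the drive must lift v over the quadratic nullcline; with i_ext = 0.5 the neuron fires repeatedly, while i_ext = -1.0 leaves it at rest.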
Affiliation(s)
- Irmantas Ratas
- Center for Physical Sciences and Technology, 10257 Vilnius, Lithuania
- Kestutis Pyragas
- Center for Physical Sciences and Technology, 10257 Vilnius, Lithuania
26
Kromer JA, Tass PA. Synaptic reshaping of plastic neuronal networks by periodic multichannel stimulation with single-pulse and burst stimuli. PLoS Comput Biol 2022; 18:e1010568. PMID: 36327232; PMCID: PMC9632832; DOI: 10.1371/journal.pcbi.1010568.
Abstract
Synaptic dysfunction is associated with several brain disorders, including Alzheimer's disease, Parkinson's disease (PD) and obsessive compulsive disorder (OCD). Utilizing synaptic plasticity, brain stimulation is capable of reshaping synaptic connectivity. This may pave the way for novel therapies that specifically counteract pathological synaptic connectivity. For instance, in PD, novel multichannel coordinated reset stimulation (CRS) was designed to counteract neuronal synchrony and down-regulate pathological synaptic connectivity. CRS was shown to entail long-lasting therapeutic aftereffects in PD patients and related animal models. This is in marked contrast to conventional deep brain stimulation (DBS) therapy, where PD symptoms return shortly after stimulation ceases. In the present paper, we study synaptic reshaping by periodic multichannel stimulation (PMCS) in networks of leaky integrate-and-fire (LIF) neurons with spike-timing-dependent plasticity (STDP). During PMCS, phase-shifted periodic stimulus trains are delivered to segregated neuronal subpopulations. Harnessing STDP, PMCS leads to changes of the synaptic network structure. We found that the PMCS-induced changes of the network structure depend on both the phase lags between stimuli and the shape of individual stimuli. Single-pulse stimuli and burst stimuli with low intraburst frequency down-regulate synapses between neurons receiving stimuli simultaneously. In contrast, burst stimuli with high intraburst frequency up-regulate these synapses. We derive theoretical approximations of the stimulation-induced network structure. This enables us to formulate stimulation strategies for inducing a variety of network structures. Our results provide testable hypotheses for future pre-clinical and clinical studies and suggest that periodic multichannel stimulation may be suitable for reshaping plastic neuronal networks to counteract pathological synaptic connectivity. Furthermore, we provide novel insight into how the stimulus type may affect the long-lasting outcome of conventional DBS. This may strongly impact parameter-adjustment procedures for clinical DBS, which so far have primarily focused on acute effects of stimulation.
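The sign of such stimulation-induced weight changes can be illustrated with a back-of-the-envelope calculation: for a pre/post pair firing periodically with a fixed phase lag, the net drift of an STDP synapse is the STDP window summed over all spike pairings. The kernel below uses generic textbook parameters, not those of the paper.

```python
import math

def stdp_window(dt, a_plus=1.0, a_minus=1.2, tau_plus=0.02, tau_minus=0.03):
    """Exponential STDP window; dt = t_post - t_pre (seconds)."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)    # pre-before-post: LTP
    if dt < 0:
        return -a_minus * math.exp(dt / tau_minus)  # post-before-pre: LTD
    return 0.0

def net_drift(period, lag, n=50):
    """Net weight drift when post lags pre by `lag` at the given period."""
    return sum(stdp_window(lag + k * period) for k in range(-n, n + 1))

# Zero lag (simultaneous stimulation) depresses the synapse;
# a small positive lag potentiates it:
drift_zero = net_drift(period=0.1, lag=0.0)
drift_pos = net_drift(period=0.1, lag=0.005)
```

The negative drift at zero lag is consistent with the abstract's observation that single-pulse stimuli delivered simultaneously down-regulate synapses between co-stimulated neurons, while suitable phase lags reverse the sign.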
Affiliation(s)
- Justus A Kromer
- Department of Neurosurgery, Stanford University, Stanford, California, United States of America
- Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, California, United States of America
27
Alevi D, Stimberg M, Sprekeler H, Obermayer K, Augustin M. Brian2CUDA: Flexible and Efficient Simulation of Spiking Neural Network Models on GPUs. Front Neuroinform 2022; 16:883700. PMID: 36387586; PMCID: PMC9660315; DOI: 10.3389/fninf.2022.883700.
Abstract
Graphics processing units (GPUs) are widely available and have been used with great success to accelerate scientific computing in the last decade. These advances, however, are often not available to researchers interested in simulating spiking neural networks but lacking the technical knowledge to write the necessary low-level code. Writing low-level code is not necessary when using the popular Brian simulator, which provides a framework to generate efficient CPU code from high-level model definitions in Python. Here, we present Brian2CUDA, an open-source software that extends the Brian simulator with a GPU backend. Our implementation generates efficient code for the numerical integration of neuronal states and for the propagation of synaptic events on GPUs, making use of their massively parallel arithmetic capabilities. We benchmark the performance improvements of our software for several model types and find that it can accelerate simulations by up to three orders of magnitude compared to Brian's CPU backend. Currently, Brian2CUDA is the only package that supports Brian's full feature set on GPUs, including arbitrary neuron and synapse models, plasticity rules, and heterogeneous delays. When comparing its performance with Brian2GeNN, another GPU-based backend for the Brian simulator with fewer features, we find that Brian2CUDA gives comparable speedups, being typically slower for small networks and faster for large ones. By combining the flexibility of the Brian simulator with the simulation speed of GPUs, Brian2CUDA enables researchers to efficiently simulate spiking neural networks with minimal effort and thereby makes the advancements of GPU computing available to a larger audience of neuroscientists.
Affiliation(s)
- Denis Alevi
- Technische Universität Berlin, Chair of Modelling of Cognitive Processes, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Marcel Stimberg
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France
- Henning Sprekeler
- Technische Universität Berlin, Chair of Modelling of Cognitive Processes, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Klaus Obermayer
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Technische Universität Berlin, Chair of Neural Information Processing, Berlin, Germany
- Moritz Augustin
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Technische Universität Berlin, Chair of Neural Information Processing, Berlin, Germany
28
Végh J, Berki ÁJ. On the Role of Speed in Technological and Biological Information Transfer for Computations. Acta Biotheor 2022; 70:26. PMID: 36287247; PMCID: PMC9606061; DOI: 10.1007/s10441-022-09450-6.
Abstract
In all kinds of implementations of computing, whether technological or biological, some material carrier for the information exists, so in real-world implementations, the propagation speed of information cannot exceed the speed of its carrier. Because of this limitation, one must also consider the transfer time between computing units for any implementation. We need a different mathematical method to consider this limitation: classic mathematics can only describe infinitely fast and small computing system implementations. The difference between mathematical handling methods leads to different descriptions of the computing features of the systems. The proposed handling also explains why biological implementations can have lifelong learning and technological ones cannot. Our conclusion about learning matches published experimental evidence, both in biological and technological computing.
Affiliation(s)
- Ádám József Berki
- Department of Neurology, Semmelweis University, 1085 Budapest, Hungary
- János Szentágothai Doctoral School of Neurosciences, Semmelweis University, 1085 Budapest, Hungary
29
Garg N, Balafrej I, Stewart TC, Portal JM, Bocquet M, Querlioz D, Drouin D, Rouat J, Beilliard Y, Alibart F. Voltage-dependent synaptic plasticity: Unsupervised probabilistic Hebbian plasticity rule based on neurons membrane potential. Front Neurosci 2022; 16:983950. PMID: 36340782; PMCID: PMC9634260; DOI: 10.3389/fnins.2022.983950.
Abstract
This study proposes voltage-dependent synaptic plasticity (VDSP), a novel brain-inspired unsupervised local learning rule for the online implementation of Hebb's plasticity mechanism on neuromorphic hardware. The proposed VDSP learning rule updates the synaptic conductance on the spike of the postsynaptic neuron only, which halves the number of updates relative to standard spike-timing-dependent plasticity (STDP). This update depends on the membrane potential of the presynaptic neuron, which is readily available as part of the neuron implementation and hence does not require additional memory for storage. Moreover, the update is regularized by the synaptic weight, which prevents weights from exploding or vanishing under repeated stimulation. Rigorous mathematical analysis is performed to draw an equivalence between VDSP and STDP. To validate the system-level performance of VDSP, we train a single-layer spiking neural network (SNN) for the recognition of handwritten digits. We report 85.01 ± 0.76% (mean ± SD) accuracy for a network of 100 output neurons on the MNIST dataset. The performance improves when scaling the network size (89.93 ± 0.41% for 400 output neurons, 90.56 ± 0.27% for 500 neurons), which validates the applicability of the proposed learning rule for spatial pattern recognition tasks. Future work will consider more complicated tasks. Interestingly, the learning rule adapts better than STDP to the frequency of the input signal and does not require hand-tuning of hyperparameters.
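The core idea, updating only on postsynaptic spikes as a function of the presynaptic membrane potential, can be caricatured in a few lines. This is a sketch of the idea only; the functional form and all constants below are hypothetical, not the published VDSP equations.

```python
def vdsp_update(w, v_pre, v_rest=-65.0, v_thresh=-50.0,
                eta=0.05, w_min=0.0, w_max=1.0):
    """Apply one VDSP-style update when the POSTsynaptic neuron spikes.

    A depolarized presynaptic membrane (above rest) signals recent or
    imminent presynaptic firing -> potentiate; a hyperpolarized one ->
    depress. Soft bounds regularize the weight (sketch, not the paper's rule).
    """
    x = (v_pre - v_rest) / (v_thresh - v_rest)  # normalized depolarization
    if x > 0:
        return min(w_max, w + eta * x * (w_max - w))   # soft-bounded LTP
    return max(w_min, w + eta * x * (w - w_min))       # soft-bounded LTD

w_up = vdsp_update(0.5, v_pre=-52.0)    # depolarized pre -> potentiation
w_down = vdsp_update(0.5, v_pre=-75.0)  # hyperpolarized pre -> depression
```

The soft bounds implement the weight-dependent regularization mentioned in the abstract: updates shrink as the weight approaches either limit.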
Affiliation(s)
- Nikhil Garg
- Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, QC, Canada
- Laboratoire Nanotechnologies Nanosystèmes (LN2)—CNRS UMI-3463, Université de Sherbrooke, Sherbrooke, QC, Canada
- Institute of Electronics, Microelectronics and Nanotechnology (IEMN), Université de Lille, Villeneuve-d’Ascq, France
- Correspondence: Nikhil Garg
- Ismael Balafrej
- Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, QC, Canada
- Laboratoire Nanotechnologies Nanosystèmes (LN2)—CNRS UMI-3463, Université de Sherbrooke, Sherbrooke, QC, Canada
- NECOTIS Research Lab, Department of Electrical and Computer Engineering, University of Sherbrooke, Sherbrooke, QC, Canada
- Terrence C. Stewart
- National Research Council Canada, University of Waterloo Collaboration Centre, Waterloo, ON, Canada
- Jean-Michel Portal
- Aix-Marseille Université, Université de Toulon, CNRS, IM2NP, Marseille, France
- Marc Bocquet
- Institute of Electronics, Microelectronics and Nanotechnology (IEMN), Université de Lille, Villeneuve-d’Ascq, France
- Damien Querlioz
- Université Paris-Saclay, CNRS, Centre de Nanosciences et de Nanotechnologies, Palaiseau, France
- Dominique Drouin
- Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, QC, Canada
- Laboratoire Nanotechnologies Nanosystèmes (LN2)—CNRS UMI-3463, Université de Sherbrooke, Sherbrooke, QC, Canada
- Jean Rouat
- Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, QC, Canada
- Laboratoire Nanotechnologies Nanosystèmes (LN2)—CNRS UMI-3463, Université de Sherbrooke, Sherbrooke, QC, Canada
- NECOTIS Research Lab, Department of Electrical and Computer Engineering, University of Sherbrooke, Sherbrooke, QC, Canada
- Yann Beilliard
- Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, QC, Canada
- Laboratoire Nanotechnologies Nanosystèmes (LN2)—CNRS UMI-3463, Université de Sherbrooke, Sherbrooke, QC, Canada
- Fabien Alibart
- Institut Interdisciplinaire d’Innovation Technologique (3IT), Université de Sherbrooke, Sherbrooke, QC, Canada
- Laboratoire Nanotechnologies Nanosystèmes (LN2)—CNRS UMI-3463, Université de Sherbrooke, Sherbrooke, QC, Canada
- Institute of Electronics, Microelectronics and Nanotechnology (IEMN), Université de Lille, Villeneuve-d’Ascq, France
30
Abstract
Activation of Ca2+/calmodulin-dependent kinase II (CaMKII) plays a critical role in long-term potentiation (LTP), a long accepted cellular model for learning and memory. However, how LTP and memories survive the turnover of synaptic proteins, particularly CaMKII, remains a mystery. Here, we take advantage of the finding that constitutive Ca2+-independent CaMKII activity, acquired prior to slice preparation, provides a lasting memory trace at synapses. In slice culture, this persistent CaMKII activity, in the absence of Ca2+ stimulation, remains stable over a 2-wk period, well beyond the turnover of CaMKII protein. We propose that the nascent CaMKII protein present at 2 wk acquired its activity from preexisting active CaMKII molecules, which transferred their activity to newly synthesized CaMKII molecules and thus maintain the memory in the face of protein turnover.
31
Talidou A, Frankland PW, Mabbott D, Lefebvre J. Homeostatic coordination and up-regulation of neural activity by activity-dependent myelination. Nat Comput Sci 2022; 2:665-676. PMID: 38177260; DOI: 10.1038/s43588-022-00315-z.
Abstract
Activity-dependent myelination (ADM) is a fundamental dimension of brain plasticity through which myelin changes as a function of neural activity. Mediated by structural changes in glia, ADM notably regulates axonal conduction velocity. Yet, it remains unclear how neural activity impacts myelination to orchestrate the timing of neural signalling, and how ADM shapes neural activity. We developed a model of spiking neurons enhanced with neuron-oligodendrocyte feedback and examined the relationship between ADM and neural activity. We found that ADM implements a homeostatic gain control mechanism that enhances neural firing rates and correlations through the temporal coordination of action potentials as axon lengths increase. Stimuli engage ADM plasticity to trigger bidirectional and reversible changes in conduction delays, as may occur during learning. Furthermore, ADM was found to enhance information transmission under various types of time-varying stimuli. These results highlight the role of ADM in shaping neural activity and communication.
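As a toy illustration of the feedback loop, not the paper's neuron-oligodendrocyte model, one can let each axon's conduction velocity rise, saturating, with its recent firing rate, so that the conduction delay length/velocity shrinks with activity. The functional form and constants below are hypothetical.

```python
def adm_delays(lengths, rates, v0=1.0, v_max=5.0):
    """Toy activity-dependent myelination: per-axon conduction velocity
    increases with firing rate (saturating at v_max), shortening the
    delay L / v. Hypothetical functional form, for illustration only."""
    delays = []
    for L, r in zip(lengths, rates):
        v = v0 + (v_max - v0) * r / (1.0 + r)   # saturating speed-up
        delays.append(L / v)
    return delays

d_quiet, d_active = adm_delays([10.0, 10.0], [0.0, 5.0])
```

Because the delay is a smooth, monotone function of the rate, the change is bidirectional and reversible, the property the abstract highlights for learning-related delay changes.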
Affiliation(s)
- Afroditi Talidou
- Department of Biology, University of Ottawa, Ottawa, Ontario, Canada
- Krembil Research Institute, University Health Network, Toronto, Ontario, Canada
- Paul W Frankland
- Neurosciences and Mental Health, The Hospital for Sick Children, Toronto, Ontario, Canada
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- Donald Mabbott
- Neurosciences and Mental Health, The Hospital for Sick Children, Toronto, Ontario, Canada
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- Jérémie Lefebvre
- Department of Biology, University of Ottawa, Ottawa, Ontario, Canada
- Krembil Research Institute, University Health Network, Toronto, Ontario, Canada
- Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
- Department of Mathematics, University of Toronto, Toronto, Ontario, Canada
32
Chauhan K, Khaledi-Nasab A, Neiman AB, Tass PA. Dynamics of phase oscillator networks with synaptic weight and structural plasticity. Sci Rep 2022; 12:15003. PMID: 36056151; PMCID: PMC9440105; DOI: 10.1038/s41598-022-19417-9.
Abstract
We study the dynamics of Kuramoto oscillator networks with two distinct adaptation processes, one varying the coupling strengths and the other altering the network structure. Such systems model certain networks of oscillatory neurons where the neuronal dynamics, synaptic weights, and network structure interact with and shape each other. We model synaptic weight adaptation with spike-timing-dependent plasticity (STDP) that runs on a longer time scale than neuronal spiking. Structural changes that include addition and elimination of contacts occur at yet a longer time scale than the weight adaptations. First, we study the steady-state dynamics of Kuramoto networks that are bistable and can settle in synchronized or desynchronized states. To compare the impact of adding structural plasticity, we contrast the network with only STDP to one with a combination of STDP and structural plasticity. We show that the inclusion of structural plasticity optimizes the synchronized state of a network by allowing for synchronization with fewer links than a network with STDP alone. With non-identical units in the network, the addition of structural plasticity leads to the emergence of correlations between the oscillators' natural frequencies and node degrees. In the desynchronized regime, the structural plasticity decreases the number of contacts, leading to a sparse network. In this way, adding structural plasticity strengthens both synchronized and desynchronized states of a network. Second, we use desynchronizing coordinated reset stimulation and synchronizing periodic stimulation to induce desynchronized and synchronized states, respectively. Our findings indicate that a network with a combination of STDP and structural plasticity may require stronger and longer stimulation to switch between the states than a network with STDP only.
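The weight-adaptation part of such a model can be sketched with a Kuramoto network whose couplings follow a Hebbian-like rule: near-in-phase pairs strengthen, near-anti-phase pairs weaken. This is a common phase-reduction caricature of STDP, not the paper's equations, and the structural plasticity (link addition/elimination) is omitted; all constants are hypothetical.

```python
import math, random

def kuramoto_adaptive(n=16, steps=3000, dt=0.01, eps=0.5, seed=1):
    """Kuramoto network with toy Hebbian-like weight adaptation
    dk_ij/dt = eps * cos(theta_j - theta_i), clipped to [0, 1]."""
    rng = random.Random(seed)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    omega = [rng.gauss(1.0, 0.05) for _ in range(n)]   # natural frequencies
    k = [[0.5] * n for _ in range(n)]                  # coupling weights
    for _ in range(steps):
        dtheta = [omega[i] + sum(k[i][j] * math.sin(theta[j] - theta[i])
                                 for j in range(n)) / n
                  for i in range(n)]
        theta = [(theta[i] + dt * dtheta[i]) % (2.0 * math.pi)
                 for i in range(n)]
        for i in range(n):
            for j in range(n):
                if i != j:                             # adapt off-diagonal weights
                    dk = eps * math.cos(theta[j] - theta[i])
                    k[i][j] = min(1.0, max(0.0, k[i][j] + dt * dk))
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im), k     # Kuramoto order parameter r, weights

r, k = kuramoto_adaptive()
```

Starting from moderate coupling, the network settles into the synchronized branch of the bistable system: the order parameter grows toward 1 and the Hebbian rule drives the weights toward their upper bound, illustrating how plasticity reinforces the synchronized state.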
Affiliation(s)
- Kanishk Chauhan
- Department of Physics and Astronomy, Ohio University, Athens, OH, 45701, USA
- Neuroscience Program, Ohio University, Athens, OH, 45701, USA
- Ali Khaledi-Nasab
- Department of Neurosurgery, Stanford University, Stanford, CA, 94305, USA
- Alexander B Neiman
- Department of Physics and Astronomy, Ohio University, Athens, OH, 45701, USA
- Neuroscience Program, Ohio University, Athens, OH, 45701, USA
- Peter A Tass
- Department of Neurosurgery, Stanford University, Stanford, CA, 94305, USA
33
Kim SH, Woo J, Choi K, Choi M, Han K. Neural Information Processing and Computations of Two-Input Synapses. Neural Comput 2022; 34:2102-2131. PMID: 36027799; DOI: 10.1162/neco_a_01534.
Abstract
Information processing in artificial neural networks is largely dependent on the nature of neuron models. While commonly used models are designed for linear integration of synaptic inputs, accumulating experimental evidence suggests that biological neurons are capable of nonlinear computations for many converging synaptic inputs via homo- and heterosynaptic mechanisms. This nonlinear neuronal computation may play an important role in complex information processing at the neural circuit level. Here we characterize the dynamics and coding properties of neuron models receiving synaptic transmissions delivered from two hidden states. The neuronal information processing is influenced by the cooperative and competitive interactions among synapses and the coherence of the hidden states. Furthermore, we demonstrate that neuronal information processing under two-input synaptic transmission can be mapped to linearly nonseparable XOR as well as basic AND/OR operations. In particular, mixtures of linear and nonlinear neuron models outperform networks consisting of only one type on the Fashion-MNIST test. This study provides a computational framework for assessing information processing of neuron and synapse models that may be beneficial for the design of brain-inspired artificial intelligence algorithms and neuromorphic systems.
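The claim that a nonlinear interaction between two inputs lets a single unit realize the linearly nonseparable XOR can be checked directly. The sketch below adds a multiplicative (heterosynaptic-like) term to an otherwise linear unit; the weights are hand-picked for illustration, not taken from the paper.

```python
def nonlinear_unit(x1, x2, w1=1.0, w2=1.0, w12=-2.0, theta=0.5):
    """Threshold unit with a multiplicative two-input interaction term.
    With these hand-picked weights it computes XOR, which no purely
    linear unit (w12 = 0) can realize for any choice of w1, w2, theta."""
    drive = w1 * x1 + w2 * x2 + w12 * x1 * x2
    return 1 if drive > theta else 0

truth_table = [(x1, x2, nonlinear_unit(x1, x2))
               for x1 in (0, 1) for x2 in (0, 1)]
```

Setting `w12 = 0` recovers a linear unit, for which the classic Minsky-Papert argument shows no threshold can separate the XOR outputs.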
Affiliation(s)
- Soon Ho Kim
- Laboratory of Computational Neurophysics, Convergence Research Center for Brain Science, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, South Korea
- Junhyuk Woo
- Laboratory of Computational Neurophysics, Convergence Research Center for Brain Science, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, South Korea
- Kiri Choi
- School of Computational Sciences, Korea Institute for Advanced Study, Seoul 02455, South Korea
- MooYoung Choi
- Department of Physics and Astronomy and Center for Theoretical Physics, Seoul National University, Seoul 08826, South Korea
- Kyungreem Han
- Laboratory of Computational Neurophysics, Convergence Research Center for Brain Science, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, South Korea
34
Mondal Y, Pena RFO, Rotstein HG. Temporal filters in response to presynaptic spike trains: interplay of cellular, synaptic and short-term plasticity time scales. J Comput Neurosci 2022; 50:395-429. DOI: 10.1007/s10827-022-00822-y.
35
Trensch G, Morrison A. A System-on-Chip Based Hybrid Neuromorphic Compute Node Architecture for Reproducible Hyper-Real-Time Simulations of Spiking Neural Networks. Front Neuroinform 2022; 16:884033. PMID: 35846779; PMCID: PMC9277345; DOI: 10.3389/fninf.2022.884033.
Abstract
Despite the great strides neuroscience has made in recent decades, the underlying principles of brain function remain largely unknown. Advancing the field strongly depends on the ability to study large-scale neural networks and perform complex simulations. In this context, simulations in hyper-real-time are of high interest, as they would enable both comprehensive parameter scans and the study of slow processes, such as learning and long-term memory. Not even the fastest supercomputer available today is able to meet the challenge of accurate and reproducible simulation with hyper-real acceleration. The development of novel neuromorphic computer architectures holds out promise, but the high costs and long development cycles for application-specific hardware solutions make it difficult to keep pace with the rapid developments in neuroscience. However, advances in System-on-Chip (SoC) device technology and tools are now providing interesting new design possibilities for application-specific implementations. Here, we present a novel hybrid software-hardware architecture approach for a neuromorphic compute node intended to work in a multi-node cluster configuration. The node design builds on the Xilinx Zynq-7000 SoC device architecture that combines a powerful programmable logic gate array (FPGA) and a dual-core ARM Cortex-A9 processor extension on a single chip. Our proposed architecture makes use of both and takes advantage of their tight coupling. We show that available SoC device technology can be used to build smaller neuromorphic computing clusters that enable hyper-real-time simulation of networks consisting of tens of thousands of neurons, and are thus capable of meeting the high demands for modeling and simulation in neuroscience.
Affiliation(s)
- Guido Trensch
- Simulation and Data Laboratory Neuroscience, Jülich Supercomputing Centre, Institute for Advanced Simulation, Jülich Research Centre, Jülich, Germany
- Department of Computer Science 3-Software Engineering, RWTH Aachen University, Aachen, Germany
- Abigail Morrison
- Simulation and Data Laboratory Neuroscience, Jülich Supercomputing Centre, Institute for Advanced Simulation, Jülich Research Centre, Jülich, Germany
- Department of Computer Science 3-Software Engineering, RWTH Aachen University, Aachen, Germany
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA-Institute Brain Structure-Function Relationship (JBI-1/INM-10), Research Centre Jülich, Jülich, Germany
36
Chrysanthidis N, Fiebig F, Lansner A, Herman P. Traces of semantization - from episodic to semantic memory in a spiking cortical network model. eNeuro 2022; 9:ENEURO.0062-22.2022. PMID: 35803714; PMCID: PMC9347313; DOI: 10.1523/eneuro.0062-22.2022.
Abstract
Episodic memory is a recollection of past personal experiences associated with particular times and places. This kind of memory is commonly subject to loss of contextual information or "semantization", which gradually decouples the encoded memory items from their associated contexts while transforming them into semantic or gist-like representations. Novel extensions to the classical Remember/Know behavioral paradigm attribute the loss of episodicity to multiple exposures of an item in different contexts. Despite recent advancements explaining semantization at a behavioral level, the underlying neural mechanisms remain poorly understood. In this study, we suggest and evaluate a novel hypothesis proposing that Bayesian-Hebbian synaptic plasticity mechanisms might cause semantization of episodic memory. We implement a cortical spiking neural network model with a Bayesian-Hebbian learning rule called the Bayesian Confidence Propagation Neural Network (BCPNN), which captures the semantization phenomenon and offers a mechanistic explanation for it. Encoding items across multiple contexts leads to item-context decoupling akin to semantization. We compare BCPNN plasticity with the more commonly used spike-timing-dependent plasticity (STDP) learning rule in the same episodic memory task. Unlike BCPNN, STDP does not explain the decontextualization process. We further examine how selective plasticity modulation of isolated salient events may enhance preferential retention and resistance to semantization. Our model reproduces important features of episodicity on behavioral timescales under various biological constraints whilst also offering a novel neural and synaptic explanation for semantization, thereby casting new light on the interplay between episodic and semantic memory processes.
Significance Statement: Remembering single episodes is a fundamental attribute of cognition. Difficulty recollecting contextual information is a key sign of episodic memory loss or semantization. Behavioral studies demonstrate that semantization of episodic memory can occur rapidly, yet the neural mechanisms underlying this effect are insufficiently investigated. In line with recent behavioral findings, we show that multiple stimulus exposures in different contexts may advance item-context decoupling. We suggest a Bayesian-Hebbian synaptic plasticity hypothesis of memory semantization and further show that a transient modulation of plasticity during salient events may disrupt the decontextualization process by strengthening memory traces, and thus enhancing preferential retention. The proposed cortical network-of-networks model thus bridges micro- and mesoscale synaptic effects with network dynamics and behavior.
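At its core, a Bayesian-Hebbian (BCPNN-style) weight is a log-odds of co-activation. The batch estimate below is a sketch of the rate-based form only; the published model uses an online, trace-based version, and the regularization constant here is a hypothetical choice.

```python
import math

def bcpnn_weight(pre, post, eps=1e-3):
    """BCPNN-style weight from binary activation histories:
    w = log( P(pre, post) / (P(pre) * P(post)) ).
    eps regularizes zero counts (batch sketch, not the online rule)."""
    n = len(pre)
    p_i = (sum(pre) + eps) / (n + eps)
    p_j = (sum(post) + eps) / (n + eps)
    p_ij = (sum(a * b for a, b in zip(pre, post)) + eps) / (n + eps)
    return math.log(p_ij / (p_i * p_j))

w_pos = bcpnn_weight([1, 0, 1, 0, 1, 0], [1, 0, 1, 0, 1, 0])  # correlated
w_neg = bcpnn_weight([1, 0, 1, 0, 1, 0], [0, 1, 0, 1, 0, 1])  # anticorrelated
```

Correlated item-context activation yields positive weights, while decorrelated or anticorrelated activation drives them to zero or below, the statistical pressure behind the item-context decoupling the study models.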
Affiliation(s)
- Nikolaos Chrysanthidis
- Division of Computational Science and Technology, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 10044 Stockholm, Sweden
- Florian Fiebig
- Division of Computational Science and Technology, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 10044 Stockholm, Sweden
- Anders Lansner
- Division of Computational Science and Technology, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 10044 Stockholm, Sweden
- Department of Mathematics, Stockholm University, 10691 Stockholm, Sweden
- Pawel Herman
- Division of Computational Science and Technology, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 10044 Stockholm, Sweden
- Digital Futures, Stockholm, Sweden
- Swedish e-Science Research Centre, Stockholm, Sweden
37
Liu J, Hua Y, Yang R, Luo Y, Lu H, Wang Y, Yang S, Ding X. Bio-Inspired Autonomous Learning Algorithm With Application to Mobile Robot Obstacle Avoidance. Front Neurosci 2022; 16:905596. PMID: 35844210; PMCID: PMC9279938; DOI: 10.3389/fnins.2022.905596.
Abstract
Spiking Neural Networks (SNNs) are often considered the third generation of Artificial Neural Networks (ANNs), owing to their high information processing capability and their accurate simulation of biological neural network behaviors. Though research on SNNs has been quite active in recent years, challenges remain in applying them to many potential applications, especially robot control. In this study, a biologically inspired autonomous learning algorithm based on reward-modulated spike-timing-dependent plasticity is proposed, in which a novel reward-generation mechanism produces the reward signals for both the learning and decision-making processes. The proposed learning algorithm is evaluated on a mobile-robot obstacle-avoidance task, and experimental results show that a mobile robot using the proposed algorithm exhibits good learning ability: the robot successfully avoids obstacles in the environment after some learning trials. This provides an alternative method for designing and applying bio-inspired robots with autonomous learning capability in typical robotic task scenarios.
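The general shape of reward-modulated STDP can be sketched in a few lines: spike coincidences feed a decaying eligibility trace, and a later reward signal converts the trace into an actual weight change. This is a generic textbook sketch with hypothetical constants, not the algorithm of this paper.

```python
import math

def rstdp_step(w, e, coincident, reward,
               eta=0.1, tau_e=0.2, dt=0.01, w_min=0.0, w_max=1.0):
    """One step of reward-modulated STDP (sketch).
    `coincident` marks a pre/post spike pairing that tags the synapse;
    the scalar `reward` gates whether the tag becomes LTP or LTD."""
    e *= math.exp(-dt / tau_e)          # eligibility trace decays
    if coincident:
        e += 1.0                        # correlated activity tags the synapse
    w = min(w_max, max(w_min, w + eta * reward * e * dt))
    return w, e

w, e = 0.5, 0.0
w, e = rstdp_step(w, e, coincident=True, reward=0.0)   # tag, no reward yet
w_rewarded, _ = rstdp_step(w, e, False, reward=+1.0)   # delayed reward -> LTP
w_punished, _ = rstdp_step(w, e, False, reward=-1.0)   # punishment -> LTD
```

In the obstacle-avoidance setting, the reward scalar would be derived from collision versus successful-avoidance outcomes; here it is simply supplied by hand.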
Affiliation(s)
- Junxiu Liu
- School of Electronic Engineering, Guangxi Normal University, Guilin, China
- Yifan Hua
- School of Electronic Engineering, Guangxi Normal University, Guilin, China
- Rixing Yang
- College of Innovation and Entrepreneurship, Guangxi Normal University, Guilin, China
- Correspondence: Rixing Yang
- Yuling Luo
- School of Electronic Engineering, Guangxi Normal University, Guilin, China
- Hao Lu
- School of Electronic Engineering, Guangxi Normal University, Guilin, China
- Yanhu Wang
- School of Electronic Engineering, Guangxi Normal University, Guilin, China
- Su Yang
- Department of Computer Science, Swansea University, Swansea, United Kingdom
- Xuemei Ding
- School of Computing, Engineering and Intelligent Systems, Ulster University, Derry, United Kingdom
38
Makarov VA, Lobov SA, Shchanikov S, Mikhaylov A, Kazantsev VB. Toward Reflective Spiking Neural Networks Exploiting Memristive Devices. Front Comput Neurosci 2022; 16:859874. PMID: 35782090; PMCID: PMC9243340; DOI: 10.3389/fncom.2022.859874.
Abstract
The design of modern convolutional artificial neural networks (ANNs) composed of formal neurons copies the architecture of the visual cortex. Signals proceed through a hierarchy in which receptive fields become increasingly complex and coding increasingly sparse. Nowadays, ANNs outperform humans in controlled pattern-recognition tasks yet remain far behind in cognition. In part, this is due to limited knowledge about the higher echelons of the brain hierarchy, where neurons actively generate predictions about what will happen next, i.e., where information processing jumps from reflex to reflection. In this study, we forecast that spiking neural networks (SNNs) can achieve the next qualitative leap. Reflective SNNs may take advantage of their intrinsic dynamics and mimic complex, non-reflex brain actions. They also enable a significant reduction in energy consumption. However, the training of SNNs is a challenging problem that strongly limits their deployment. We then briefly overview new insights provided by the concept of a high-dimensional brain, which has been put forward to explain the potential power of single neurons in higher brain stations and deep SNN layers. Finally, we discuss the prospect of implementing neural networks in memristive systems. Such systems can densely pack onto a chip 2D or 3D arrays of plastic synaptic contacts that directly process analog information. Memristive devices are thus good candidates for implementing in-memory and in-sensor computing. Memristive SNNs can then diverge from the development path of ANNs and build their own niche of cognitive, or reflective, computations.
Affiliation(s)
- Valeri A. Makarov (correspondence): Instituto de Matemática Interdisciplinar, Universidad Complutense de Madrid, Madrid, Spain; Department of Neurotechnologies, Research Institute of Physics and Technology, Laboratory of Stochastic Multistable Systems, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Sergey A. Lobov: Department of Neurotechnologies, Research Institute of Physics and Technology, Laboratory of Stochastic Multistable Systems, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia; Neuroscience and Cognitive Technology Laboratory, Center for Technologies in Robotics and Mechatronics Components, Innopolis University, Innopolis, Russia; Center for Neurotechnology and Machine Learning, Immanuel Kant Baltic Federal University, Kaliningrad, Russia
- Sergey Shchanikov: Department of Neurotechnologies, Research Institute of Physics and Technology, Laboratory of Stochastic Multistable Systems, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia; Department of Information Technologies, Vladimir State University, Vladimir, Russia
- Alexey Mikhaylov: Department of Neurotechnologies, Research Institute of Physics and Technology, Laboratory of Stochastic Multistable Systems, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Viktor B. Kazantsev: Department of Neurotechnologies, Research Institute of Physics and Technology, Laboratory of Stochastic Multistable Systems, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia; Neuroscience and Cognitive Technology Laboratory, Center for Technologies in Robotics and Mechatronics Components, Innopolis University, Innopolis, Russia; Center for Neurotechnology and Machine Learning, Immanuel Kant Baltic Federal University, Kaliningrad, Russia

39
Bouhadjar Y, Wouters DJ, Diesmann M, Tetzlaff T. Sequence learning, prediction, and replay in networks of spiking neurons. PLoS Comput Biol 2022; 18:e1010233. [PMID: 35727857 PMCID: PMC9273101 DOI: 10.1371/journal.pcbi.1010233] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2021] [Revised: 07/11/2022] [Accepted: 05/20/2022] [Indexed: 11/24/2022] Open
Abstract
Sequence learning, prediction and replay have been proposed to constitute the universal computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes these forms of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits context-specific prediction of future sequence elements, and generates mismatch signals when predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms. Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow sequence-specific subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity, and, thereby, for a robust, context-specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction and replay.
We demonstrate this aspect by studying the effect of sequence speed on sequence-learning performance and on the speed of autonomous sequence replay.

Essentially all data processed by mammals and many other living organisms are sequential. This holds true for all types of sensory input data as well as motor output activity. Being able to form memories of such sequential data, to predict future sequence elements, and to replay learned sequences is a necessary prerequisite for survival. It has been hypothesized that sequence learning, prediction and replay constitute the fundamental computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) is a powerful abstract algorithm implementing this form of computation and has been proposed as a model of neocortical processing. In this study, we reformulate this algorithm in terms of known biological ingredients and mechanisms to foster the verifiability of the HTM hypothesis against electrophysiological and behavioral data. The proposed model learns continuously in an unsupervised manner through biologically plausible local plasticity mechanisms, and successfully predicts and replays complex sequences. Apart from establishing contact with biology, the study sheds light on the mechanisms that determine the speed at which we can process sequences and provides an explanation of the fast sequence replay observed in the hippocampus and in the neocortex.
Affiliation(s)
- Younes Bouhadjar (correspondence): Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany; Peter Grünberg Institute (PGI-7,10), Jülich Research Centre and JARA, Jülich, Germany; RWTH Aachen University, Aachen, Germany
- Dirk J. Wouters: Institute of Electronic Materials (IWE 2) and JARA-FIT, RWTH Aachen University, Aachen, Germany
- Markus Diesmann: Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany; Department of Physics, Faculty 1, and Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany
- Tom Tetzlaff: Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), and JARA BRAIN Institute Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany

40
Juarez-Lora A, Ponce-Ponce VH, Sossa H, Rubio-Espino E. R-STDP Spiking Neural Network Architecture for Motion Control on a Changing Friction Joint Robotic Arm. Front Neurorobot 2022; 16:904017. [PMID: 35663727 PMCID: PMC9161736 DOI: 10.3389/fnbot.2022.904017] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2022] [Accepted: 04/14/2022] [Indexed: 11/13/2022] Open
Abstract
Neuromorphic computing is a recent class of brain-inspired, high-performance computing platforms and algorithms involving biologically inspired models implemented in hardware as integrated circuits. Neuromorphic computing applications have driven the development of highly connected neurons and synapses in analog circuit systems that can be used to solve today's challenging machine learning problems. In conjunction with biologically plausible learning rules, such as Hebbian learning, and with memristive devices, biologically inspired spiking neural networks are considered the next-generation neuromorphic hardware building blocks that will enable the deployment of new analog, in situ-learning-capable, and energy-efficient brain-like devices. These features are envisioned for modern mobile robotic implementations, which currently struggle to overcome the pervasive von Neumann computer architecture. This study proposes a new neural architecture using the spike-timing-dependent plasticity learning method and a step-forward encoding algorithm for self-tuning neural control of motion in a joint robotic arm subjected to dynamic modifications. Simulations were conducted to demonstrate the proposed neural architecture's feasibility, as the network successfully compensates for changing dynamics at each simulation run.
Affiliation(s)
- Alejandro Juarez-Lora: Instituto Politécnico Nacional, Centro de Investigación en Computación, Mexico City, México
- Victor H. Ponce-Ponce: Instituto Politécnico Nacional, Centro de Investigación en Computación, Mexico City, México

41
Peres L, Rhodes O. Parallelization of Neural Processing on Neuromorphic Hardware. Front Neurosci 2022; 16:867027. [PMID: 35620669 PMCID: PMC9128596 DOI: 10.3389/fnins.2022.867027] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2022] [Accepted: 04/08/2022] [Indexed: 11/16/2022] Open
Abstract
Learning and development in real brains typically happen over long timescales, making long-term exploration of these features a significant research challenge. One way to address this problem is to use computational models to explore the brain, with Spiking Neural Networks a popular choice for capturing neuron and synapse dynamics. However, researchers require simulation tools and platforms able to execute simulations in real time or faster, to enable exploration of features such as long-term learning and neural pathologies over meaningful periods. This article presents novel multicore processing strategies on the SpiNNaker neuromorphic hardware, addressing parallelization of Spiking Neural Network operations by allocating dedicated computational units to specific tasks (such as neural and synaptic processing) to optimize performance. The work advances previous real-time simulations of a cortical microcircuit model, parameterizing the load balancing between computational units in order to explore trade-offs between computational complexity and speed, to provide the best fit for a given application. By exploiting the flexibility of the SpiNNaker neuromorphic platform, an up to 9× increase in the throughput of neural operations is demonstrated when running biologically representative Spiking Neural Networks.
42
Moradi K, Aldarraji Z, Luthra M, Madison GP, Ascoli GA. Normalized unitary synaptic signaling of the hippocampus and entorhinal cortex predicted by deep learning of experimental recordings. Commun Biol 2022; 5:418. [PMID: 35513471 PMCID: PMC9072429 DOI: 10.1038/s42003-022-03329-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/18/2021] [Accepted: 03/30/2022] [Indexed: 11/21/2022] Open
Abstract
Biologically realistic computer simulations of neuronal circuits require systematic data-driven modeling of neuron type-specific synaptic activity. However, limited experimental yield, heterogeneous recording conditions, and ambiguous neuronal identification have so far prevented the consistent characterization of synaptic signals for all connections of any neural system. We introduce a strategy to overcome these challenges and report a comprehensive synaptic quantification among all known neuron types of the hippocampal-entorhinal network. First, we reconstructed >2600 synaptic traces from ∼1200 publications into a unified computational representation of synaptic dynamics. We then trained a deep learning architecture with the resulting parameters, each annotated with detailed metadata such as recording method, solutions, and temperature. The model learned to predict the synaptic properties of all 3,120 circuit connections in arbitrary conditions with accuracy approaching the intrinsic experimental variability. Analysis of data normalized and completed with the deep learning model revealed that synaptic signals are controlled by few latent variables associated with specific molecular markers and interrelating conductance, decay time constant, and short-term plasticity. We freely release the tools and full dataset of unitary synaptic values in 32 covariate settings. Normalized synaptic data can be used in brain simulations, and to predict and test experimental hypotheses. A deep learning model trained on roughly 2,600 synaptic traces from hippocampal electrophysiology datasets demonstrates how specific covariates influence synaptic signals.
Affiliation(s)
- Keivan Moradi: Interdisciplinary Neuroscience Program and Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, USA; Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, CA, USA
- Zainab Aldarraji: Bioengineering Department and Volgenau School of Engineering, George Mason University, Fairfax, VA, USA
- Megha Luthra: Bioengineering Department and Volgenau School of Engineering, George Mason University, Fairfax, VA, USA
- Grey P Madison: Chemistry and Biochemistry Department, College of Science, George Mason University, Fairfax, VA, USA
- Giorgio A Ascoli: Interdisciplinary Neuroscience Program and Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, USA; Bioengineering Department and Volgenau School of Engineering, George Mason University, Fairfax, VA, USA

43
Vignoud G, Robert P. Spontaneous dynamics of synaptic weights in stochastic models with pair-based spike-timing-dependent plasticity. Phys Rev E 2022; 105:054405. [PMID: 35706237 DOI: 10.1103/physreve.105.054405] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2021] [Accepted: 03/31/2022] [Indexed: 06/15/2023]
Abstract
We investigate spike-timing-dependent plasticity (STDP) in the case of a synapse connecting two neuronal cells. We develop a theoretical analysis of several STDP rules using Markovian theory. In this context there are two different timescales: fast neuronal activity and slower synaptic weight updates. Exploiting this timescale separation, we derive the long-time limits of a single synaptic weight subject to STDP. We show that the pairing model of presynaptic and postsynaptic spikes controls the synaptic weight dynamics for small external input on an excitatory synapse. This result implies in particular that mean-field analysis of plasticity may miss some important properties of STDP. Anti-Hebbian STDP favors the emergence of a stable synaptic weight. In the case of an inhibitory synapse the pairing schemes matter less, and we observe convergence of the synaptic weight to a nonnull value only for Hebbian STDP. We extensively study different asymptotic regimes for STDP rules, raising interesting questions for future work on adaptive neuronal networks and, more generally, on adaptive systems.
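The role of the pairing scheme can be made concrete with a toy comparison of two common pair-based STDP rules. This is a sketch under standard textbook conventions, not the stochastic model analyzed in the paper; all parameter values are illustrative.

```python
import math

# Toy comparison of two standard pair-based pairing schemes:
# "all-to-all" sums the STDP kernel over every pre/post spike pair,
# while "nearest-neighbor" pairs each postsynaptic spike only with
# the most recent earlier presynaptic spike. Constants are illustrative.

A_P, A_M, TAU = 1.0, 1.0, 20.0

def kernel(dt):
    """Hebbian STDP kernel for dt = t_post - t_pre."""
    return A_P * math.exp(-dt / TAU) if dt >= 0 else -A_M * math.exp(dt / TAU)

def all_to_all(pre, post):
    """Sum the kernel over every presynaptic/postsynaptic spike pair."""
    return sum(kernel(tp - tq) for tq in pre for tp in post)

def nearest_neighbor(pre, post):
    """Pair each postsynaptic spike only with the latest earlier pre spike."""
    total = 0.0
    for tp in post:
        earlier = [tq for tq in pre if tq <= tp]
        if earlier:
            total += kernel(tp - max(earlier))
    return total

# With pre spikes at 0 and 8 ms and a post spike at 10 ms, the two
# schemes already yield different total weight changes.
dw_all = all_to_all([0.0, 8.0], [10.0])
dw_nn = nearest_neighbor([0.0, 8.0], [10.0])
```

Even in this tiny example the choice of pairing scheme changes the total weight update, which is the kind of effect the abstract says a mean-field analysis can miss.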
Affiliation(s)
- Gaëtan Vignoud: INRIA Paris, 2 rue Simone Iff, 75589 Paris Cedex 12, France; Center for Interdisciplinary Research in Biology (CIRB), Collège de France (CNRS UMR 7241, INSERM U1050), 11 Place Marcelin Berthelot, 75005 Paris, France

44
Hodassman S, Vardi R, Tugendhaft Y, Goldental A, Kanter I. Efficient dendritic learning as an alternative to synaptic plasticity hypothesis. Sci Rep 2022; 12:6571. [PMID: 35484180 PMCID: PMC9051213 DOI: 10.1038/s41598-022-10466-8] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2022] [Accepted: 04/08/2022] [Indexed: 11/09/2022] Open
Abstract
Synaptic plasticity is a long-standing core hypothesis of brain learning that posits local adaptation between two connected neurons and forms the foundation of machine learning. The main complexity of synaptic plasticity is that synapses and dendrites connect neurons in series, and existing experiments cannot pinpoint where the significant adaptation is imprinted. We showed efficient backpropagation and Hebbian learning on dendritic trees, inspired by experimental evidence, for sub-dendritic adaptation and its nonlinear amplification. It achieves success rates approaching unity for handwritten-digit recognition, indicating that deep learning can be realized even by a single dendrite or neuron. Additionally, dendritic amplification practically generates a number of input crosses, i.e., higher-order interactions, that grows exponentially with the number of inputs, which enhances success rates. However, direct implementation of a large number of these cross weights and their independent, exhaustive manipulation is beyond existing and anticipated computational power. Hence, a new type of nonlinear adaptive dendritic hardware for imitating dendritic learning and estimating the computational capability of the brain must be built.
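The exponential growth of "input crosses" mentioned above is easy to illustrate: for n inputs there are 2^n - n - 1 multiplicative interactions of order two or higher. The helper below is a toy enumeration of ours, not the paper's implementation.

```python
from itertools import combinations

# Toy enumeration (ours, not the paper's code) of "input crosses":
# multiplicative interactions among distinct inputs. Their count,
# 2**n - n - 1 for n inputs, grows exponentially with n.

def input_crosses(x, max_order):
    """Return the product of every subset of 2..max_order distinct inputs."""
    feats = []
    for order in range(2, max_order + 1):
        for idx in combinations(range(len(x)), order):
            prod = 1.0
            for i in idx:
                prod *= x[i]
            feats.append(prod)
    return feats

# Three inputs yield 2**3 - 3 - 1 = 4 crosses: three pairs and one triple.
crosses = input_crosses([1.0, 2.0, 3.0], max_order=3)
```

Already at n = 20 inputs this enumeration exceeds a million cross terms, which is the scaling argument behind the abstract's point that manipulating every cross weight independently is computationally infeasible.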
Affiliation(s)
- Shiri Hodassman: Department of Physics, Bar-Ilan University, 52900 Ramat-Gan, Israel
- Roni Vardi: Gonda Interdisciplinary Brain Research Center, Bar-Ilan University, 52900 Ramat-Gan, Israel
- Yael Tugendhaft: Department of Physics, Bar-Ilan University, 52900 Ramat-Gan, Israel
- Amir Goldental: Department of Physics, Bar-Ilan University, 52900 Ramat-Gan, Israel
- Ido Kanter: Department of Physics, Bar-Ilan University, 52900 Ramat-Gan, Israel; Gonda Interdisciplinary Brain Research Center, Bar-Ilan University, 52900 Ramat-Gan, Israel

45
Bio-plausible digital implementation of a reward modulated STDP synapse. Neural Comput Appl 2022. [DOI: 10.1007/s00521-022-07220-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
Abstract
Reward-modulated Spike-Timing-Dependent Plasticity (R-STDP) is a learning method for Spiking Neural Networks (SNNs) that makes use of an external learning signal to modulate the synaptic plasticity produced by Spike-Timing-Dependent Plasticity (STDP). By combining the advantages of reinforcement learning with the biological plausibility of STDP, online learning with SNNs can be applied in real-world scenarios. This paper presents a fully digital architecture, implemented on a Field-Programmable Gate Array (FPGA), that includes the R-STDP learning mechanism in an SNN. The hardware results obtained are comparable to software simulation results from the Brian2 simulator: the maximum error is 0.083 when 14-bit fixed-point precision is used in real time. The presented architecture achieves an accuracy of 95% when tested on a mobile-robotics obstacle-avoidance problem with minimal resource usage.
46
Phillips RS, Rubin JE. Putting the theory into 'burstlet theory' with a biophysical model of burstlets and bursts in the respiratory preBötzinger complex. eLife 2022; 11:75713. [PMID: 35380537 PMCID: PMC9023056 DOI: 10.7554/elife.75713] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2021] [Accepted: 04/04/2022] [Indexed: 11/17/2022] Open
Abstract
Inspiratory breathing rhythms arise from synchronized neuronal activity in a bilaterally distributed brainstem structure known as the preBötzinger complex (preBötC). In in vitro slice preparations containing the preBötC, extracellular potassium must be elevated above physiological levels (to 7–9 mM) to observe regular rhythmic respiratory motor output in the hypoglossal nerve to which the preBötC projects. Reexamination of how extracellular K+ affects preBötC neuronal activity has revealed that low-amplitude oscillations persist at physiological levels. These oscillatory events are subthreshold from the standpoint of transmission to motor output and are dubbed burstlets. Burstlets arise from synchronized neural activity in a rhythmogenic neuronal subpopulation within the preBötC that in some instances may fail to recruit the larger network events, or bursts, required to generate motor output. The fraction of subthreshold preBötC oscillatory events (burstlet fraction) decreases sigmoidally with increasing extracellular potassium. These observations underlie the burstlet theory of respiratory rhythm generation. Experimental and computational studies have suggested that recruitment of the non-rhythmogenic component of the preBötC population requires intracellular Ca2+ dynamics and activation of a calcium-activated nonselective cationic current. In this computational study, we show how intracellular calcium dynamics driven by synaptically triggered Ca2+ influx as well as Ca2+ release/uptake by the endoplasmic reticulum in conjunction with a calcium-activated nonselective cationic current can reproduce and offer an explanation for many of the key properties associated with the burstlet theory of respiratory rhythm generation. Altogether, our modeling work provides a mechanistic basis that can unify a wide range of experimental findings on rhythm generation and motor output recruitment in the preBötC.
47
Adaptive erasure of spurious sequences in sensory cortical circuits. Neuron 2022; 110:1857-1868.e5. [PMID: 35358415 PMCID: PMC9616807 DOI: 10.1016/j.neuron.2022.03.006] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2021] [Revised: 11/12/2021] [Accepted: 03/07/2022] [Indexed: 12/02/2022]
Abstract
Sequential activity reflecting previously experienced temporal sequences is considered a hallmark of learning across cortical areas. However, it is unknown how cortical circuits avoid the converse problem: producing spurious sequences that do not reflect sequences in their inputs. We develop methods to quantify and study sequentiality in neural responses. We show that recurrent circuit responses generally include spurious sequences, which are specifically prevented in circuits that obey two widely known features of cortical microcircuit organization: Dale's law and Hebbian connectivity. In particular, spike-timing-dependent plasticity in excitation-inhibition networks leads to an adaptive erasure of spurious sequences. We tested our theory in multielectrode recordings from the visual cortex of awake ferrets. Although responses to natural stimuli were largely non-sequential, responses to artificial stimuli initially included spurious sequences, which diminished over extended exposure. These results reveal an unexpected role for Hebbian experience-dependent plasticity and Dale's law in sensory cortical circuits.
- Recurrent circuits generate spurious sequences without sequential inputs
- A principled measure of total sequentiality in population responses is developed
- Theory predicts that Hebbian plasticity should abolish spurious sequences
- Spurious sequences in the visual cortex diminish with experience
48
Perez S, Cui Y, Vignoud G, Perrin E, Mendes A, Zheng Z, Touboul J, Venance L. Striatum expresses region-specific plasticity consistent with distinct memory abilities. Cell Rep 2022; 38:110521. [PMID: 35294877 DOI: 10.1016/j.celrep.2022.110521] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/07/2021] [Revised: 12/23/2021] [Accepted: 02/21/2022] [Indexed: 11/24/2022] Open
Abstract
The striatum mediates two learning modalities: goal-directed behavior in the dorsomedial striatum (DMS) and habits in the dorsolateral striatum (DLS). The synaptic bases of these forms of learning remain elusive. Indeed, while ample research has described DLS plasticity, little is known about DMS plasticity and its involvement in procedural learning. Here, we find symmetric and asymmetric anti-Hebbian spike-timing-dependent plasticity (STDP) in the DMS and DLS, respectively, with opposite plasticity dominance upon increasing corticostriatal activity. During motor-skill learning, plasticity is engaged in DMS and striatonigral DLS neurons only during early learning stages, whereas striatopallidal DLS neurons are mobilized only during late phases. Using a mathematical modeling approach, we find that symmetric anti-Hebbian STDP favors memory flexibility, while asymmetric anti-Hebbian STDP favors memory maintenance, consistent with the memory processes at play in procedural learning.
Affiliation(s)
- Sylvie Perez: Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France
- Yihui Cui: Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France; Department of Neurobiology, Department of Neurology of Sir Run Run Shaw Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China
- Gaëtan Vignoud: Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France; MAMBA (Modelling and Analysis for Medical and Biological Applications), Inria Paris, LJLL (UMR-7598), Laboratory Jacques-Louis Lions, Paris, France
- Elodie Perrin: Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France
- Alexandre Mendes: Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France
- Zhiwei Zheng: Department of Neurobiology, Department of Neurology of Sir Run Run Shaw Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China
- Jonathan Touboul: Department of Mathematics and Volen National Center for Complex Systems, Brandeis University, Waltham, MA, USA
- Laurent Venance: Center for Interdisciplinary Research in Biology (CIRB), Collège de France, CNRS, INSERM, Université PSL, Paris, France

49
Pronold J, Jordan J, Wylie BJN, Kitayama I, Diesmann M, Kunkel S. Routing Brain Traffic Through the Von Neumann Bottleneck: Parallel Sorting and Refactoring. Front Neuroinform 2022; 15:785068. [PMID: 35300490 PMCID: PMC8921864 DOI: 10.3389/fninf.2021.785068] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2021] [Accepted: 12/24/2021] [Indexed: 11/26/2022] Open
Abstract
Generic simulation code for spiking neuronal networks spends the major part of its time in the phase where spikes have arrived at a compute node and need to be delivered to their target neurons. These spikes were emitted over the last interval between communication steps by source neurons distributed across many compute nodes and are inherently irregular and unsorted with respect to their targets. For finding those targets, the spikes need to be dispatched to a three-dimensional data structure, with decisions on target thread and synapse type to be made on the way. With growing network size, a compute node receives spikes from an increasing number of different source neurons until, in the limit, each synapse on the compute node has a unique source. Here, we show analytically how this sparsity emerges over the practically relevant range of network sizes from a hundred thousand to a billion neurons. By profiling a production code we investigate opportunities for algorithmic changes to avoid indirections and branching. Every thread hosts an equal share of the neurons on a compute node. In the original algorithm, all threads search through all spikes to pick out the relevant ones. With increasing network size, the fraction of hits remains invariant but the absolute number of rejections grows. Our new alternative algorithm equally divides the spikes among the threads and immediately sorts them in parallel according to target thread and synapse type. After this, every thread completes delivery solely of the section of spikes for its own neurons. Independent of the number of threads, all spikes are looked at only twice. The new algorithm halves the number of instructions in spike delivery, which leads to a reduction of simulation time of up to 40%. Thus, spike delivery is a fully parallelizable process with a single synchronization point and is thereby well suited for many-core systems.
Our analysis indicates that further progress requires a reduction of the latency that the instructions experience in accessing memory. The study provides the foundation for the exploration of methods of latency hiding like software pipelining and software-induced prefetching.
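The two-pass delivery scheme described above (sort once by target thread and synapse type, then let each thread deliver only its own section) can be sketched as follows. This is a serial Python illustration of the idea, not the production C++ code; the round-robin neuron-to-thread mapping and all names are our assumptions.

```python
from collections import defaultdict

# Serial sketch of the two-pass delivery scheme: pass 1 buckets every
# spike by (target thread, synapse type); pass 2 lets each thread scan
# only its own bucket, so each spike is looked at exactly twice.

def sort_spikes(spikes, n_threads):
    """Pass 1: each spike is looked at once and placed in its bucket."""
    buckets = {t: defaultdict(list) for t in range(n_threads)}
    for target_neuron, synapse_type, weight in spikes:
        thread = target_neuron % n_threads   # assumed neuron-to-thread mapping
        buckets[thread][synapse_type].append((target_neuron, weight))
    return buckets

def deliver(buckets, thread):
    """Pass 2: a thread touches only the spikes destined for its own neurons."""
    return [s for per_type in buckets[thread].values() for s in per_type]

spikes = [(0, "exc", 1.0), (1, "inh", -0.5), (2, "exc", 0.3), (3, "exc", 0.7)]
buckets = sort_spikes(spikes, n_threads=2)
# Thread 0 now sees only neurons 0 and 2; thread 1 only neurons 1 and 3.
```

In the original algorithm every thread would instead scan all four spikes and reject the ones not targeting its neurons; here the rejections are replaced by a single parallelizable bucketing pass.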
Collapse
Affiliation(s)
- Jari Pronold
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Jakob Jordan
- Department of Physiology, University of Bern, Bern, Switzerland
- Brian J. N. Wylie
- Jülich Supercomputing Centre, Jülich Research Centre, Jülich, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Susanne Kunkel
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
50
Spiliotis K, Starke J, Franz D, Richter A, Köhling R. Deep brain stimulation for movement disorder treatment: exploring frequency-dependent efficacy in a computational network model. Biological Cybernetics 2022; 116:93-116. [PMID: 34894291] [PMCID: PMC8866393] [DOI: 10.1007/s00422-021-00909-2] [Received: 11/20/2020] [Accepted: 10/31/2021] [Indexed: 06/14/2023]
Abstract
A large-scale computational model of the basal ganglia network and thalamus is proposed to describe movement disorders and the treatment effects of deep brain stimulation (DBS). The model of this complex network considers three areas of the basal ganglia region: the subthalamic nucleus (STN) as the target area of DBS, the globus pallidus, both pars externa and pars interna (GPe-GPi), and the thalamus. Parkinsonian conditions are simulated by assuming reduced dopaminergic input and correspondingly pronounced inhibitory or disinhibited projections to GPe and GPi. Macroscopic quantities are derived which correlate closely with thalamic responses and hence motor programme fidelity. It can be demonstrated that, depending on the level of striatal projections to the GPe and GPi, the dynamics of these macroscopic quantities (synchronisation index, mean synaptic activity and response efficacy) switch from normal to Parkinsonian conditions. Simulating DBS of the STN affects the dynamics of the entire network, increasing thalamic activity to levels close to normal, while differing from both normal and Parkinsonian dynamics. Using the mentioned macroscopic quantities, the model proposes optimal DBS frequency ranges above 130 Hz.
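A synchronisation index of the kind mentioned above is commonly computed as the Kuramoto order parameter of the population's firing phases. The paper's precise definition may differ; the following is a minimal sketch of the standard measure, assuming a vector of instantaneous phases is available:

```python
import numpy as np

def synchronisation_index(phases):
    """Kuramoto order parameter r = |mean(exp(i * phi))|.

    r is close to 1 when the population fires in phase (Parkinsonian-like
    synchrony in such models) and close to 0 for desynchronised activity.
    """
    return float(abs(np.mean(np.exp(1j * np.asarray(phases)))))
```

For example, identical phases give r = 1, while phases spread uniformly over the circle give r near 0.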
Affiliation(s)
- Jens Starke
- Institute of Mathematics, University of Rostock, 18057 Rostock, Germany
- Denise Franz
- Oscar-Langendorff-Institute of Physiology, Rostock University Medical Center, Rostock, Germany
- Angelika Richter
- Institute of Pharmacology, Pharmacy and Toxicology, Faculty of Veterinary Medicine, University of Leipzig, Leipzig, Germany
- Rüdiger Köhling
- Oscar-Langendorff-Institute of Physiology, Rostock University Medical Center, Rostock, Germany