1
Morabito A, Zerlaut Y, Dhanasobhon D, Berthaux E, Tzilivaki A, Moneron G, Cathala L, Poirazi P, Bacci A, DiGregorio D, Lourenço J, Rebola N. A dendritic substrate for temporal diversity of cortical inhibition. bioRxiv 2024:2024.07.09.602783. [PMID: 39026855] [PMCID: PMC11257522] [DOI: 10.1101/2024.07.09.602783]
Abstract
In the mammalian neocortex, GABAergic interneurons (INs) inhibit cortical networks in profoundly different ways. The extent to which this depends on how different INs process excitatory signals along their dendrites is poorly understood. Here, we reveal that the functional specialization of two major populations of cortical INs is determined by the unique association of different dendritic integration modes with distinct synaptic organization motifs. We found that somatostatin (SST)-INs exhibit NMDAR-dependent dendritic integration and uniform synapse density along the dendritic tree. In contrast, dendrites of parvalbumin (PV)-INs exhibit passive synaptic integration coupled with proximally enriched synaptic distributions. Theoretical analysis shows that these two dendritic configurations result in different strategies to optimize synaptic efficacy in thin dendritic structures. Yet, the two configurations lead to distinct temporal engagement of each IN during network activity. We confirmed these predictions with in vivo recordings of IN activity in the visual cortex of awake mice, revealing a rapid and linear recruitment of PV-INs as opposed to a long-lasting integrative activation of SST-INs. Our work reveals the existence of distinct dendritic strategies that confer distinct temporal representations on the two major classes of neocortical INs and thus shape the dynamics of inhibition.
2
Johnsen KA, Cruzado NA, Menard ZC, Willats AA, Charles AS, Markowitz JE, Rozell CJ. Bridging model and experiment in systems neuroscience with Cleo: the Closed-Loop, Electrophysiology, and Optophysiology simulation testbed. bioRxiv 2024:2023.01.27.525963. [PMID: 39026717] [PMCID: PMC11257437] [DOI: 10.1101/2023.01.27.525963]
Abstract
Systems neuroscience has experienced an explosion of new tools for reading and writing neural activity, enabling exciting new experiments such as all-optical or closed-loop control that effect powerful causal interventions. At the same time, improved computational models are capable of reproducing behavior and neural activity with increasing fidelity. Unfortunately, these advances have drastically increased the complexity of integrating different lines of research, resulting in suboptimal experiments, missed opportunities, and untapped potential. Experiment simulation can help bridge this gap, allowing model and experiment to better inform each other by providing a low-cost testbed for experiment design, model validation, and methods engineering. Specifically, this can be achieved by incorporating the simulation of the experimental interface into our models, but no existing tool integrates optogenetics, two-photon calcium imaging, electrode recording, and flexible closed-loop processing with neural population simulations. To address this need, we have developed Cleo: the Closed-Loop, Electrophysiology, and Optophysiology experiment simulation testbed. Cleo is a Python package enabling injection of recording and stimulation devices, as well as closed-loop control with realistic latency, into a Brian spiking neural network model. It is the only publicly available tool currently supporting two-photon and multi-opsin/wavelength optogenetics. To facilitate adoption and extension by the community, Cleo is open-source, modular, tested, and documented, and can export results to various data formats. Here we describe the design and features of Cleo, validate output of individual components and integrated experiments, and demonstrate its utility for advancing optogenetic techniques in prospective experiments using previously published systems neuroscience models.
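Cleo's actual API is not reproduced here, but the closed-loop-with-latency idea it simulates can be sketched in a few lines. The toy below is entirely invented for illustration (names, dynamics, and constants are all assumptions, not Cleo code): a proportional controller drives a scalar "firing rate" toward a target, with its commands reaching the plant only after a fixed feedback latency, as in a real closed-loop experiment.

```python
from collections import deque

def run_closed_loop(latency_steps, target=1.0, n_steps=200, gain=0.5):
    """Proportional controller driving a toy 'firing rate' toward a target,
    with each command reaching the plant only after a fixed feedback latency
    (all dynamics and constants invented for illustration)."""
    rate, stim = 0.0, 0.0
    pending = deque([0.0] * latency_steps)  # commands still in flight
    history = []
    for _ in range(n_steps):
        rate += 0.1 * (stim - rate)         # plant: leaky rate dynamics
        command = gain * (target - rate)    # controller observes the rate...
        pending.append(command)             # ...but its command arrives late
        stim = pending.popleft()
        history.append(rate)
    return history

fast = run_closed_loop(latency_steps=1)
slow = run_closed_loop(latency_steps=20)
```

With these gains the loop gain stays below one at all frequencies, so both runs settle at the same steady state; the longer latency only delays when feedback begins to act, which is exactly the kind of effect a simulated testbed lets one probe before running the real experiment.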
3
Białas M, Mirończuk MM, Mańdziuk J. Leveraging spiking neural networks for topic modeling. Neural Netw 2024;178:106494. [PMID: 38972130] [DOI: 10.1016/j.neunet.2024.106494]
Abstract
This article investigates the application of spiking neural networks (SNNs) to the problem of topic modeling (TM): the identification of significant groups of words that represent human-understandable topics in large sets of documents. Our research is based on the hypothesis that an SNN that implements the Hebbian learning paradigm is capable of becoming specialized in the detection of statistically significant word patterns in the presence of adequately tailored sequential input. To support this hypothesis, we propose a novel spiking topic model (STM) that transforms text into a sequence of spikes and uses that sequence to train single-layer SNNs. In STM, each SNN neuron represents one topic, and each of the neuron's weights corresponds to one word. STM synaptic connections are modified according to spike-timing-dependent plasticity; after training, the neurons' strongest weights are interpreted as the words that represent topics. We compare the performance of STM with four other TM methods, Latent Dirichlet Allocation (LDA), Biterm Topic Model (BTM), Embedding Topic Model (ETM), and BERTopic, on three datasets: 20Newsgroups, BBC News, and AG News. The results demonstrate that STM can discover high-quality topics and successfully compete with the comparison methods. This sheds new light on the possibility of adapting SNN models to unsupervised natural language processing.
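The STM pipeline itself (spike encoding, exact STDP kernel, dataset handling) is not reproduced here. As a hedged illustration of the core idea, one neuron per topic, Hebbian potentiation of co-active word weights, and topic read-out from the strongest weights, the following deterministic toy uses an invented two-topic corpus:

```python
import numpy as np

# Invented toy corpus: two latent "topics" over a six-word vocabulary
docs = [[0, 1, 2], [3, 4, 5]] * 100
vocab_size, n_neurons, lr = 6, 2, 0.05

W = np.full((n_neurons, vocab_size), 0.5)  # one row of word weights per topic neuron
W[0, 0] += 0.01                            # tiny symmetry-breaking bias

for doc in docs:
    x = np.zeros(vocab_size)
    x[doc] = 1.0                           # words active in this "document"
    winner = int(np.argmax(W @ x))         # winner-take-all: one topic neuron fires
    # Hebbian potentiation of co-active words, passive decay of the rest
    W[winner] += lr * (x - 0.5 * W[winner])
    W[winner] = np.clip(W[winner], 0.0, 1.0)

# Each neuron's topic is read out as its three strongest word weights
topics = [sorted(np.argsort(-W[k])[:3].tolist()) for k in range(n_neurons)]
print(topics)
```

The winner-take-all update makes each neuron claim one word cluster, mirroring the "strongest weights = topic words" read-out described in the abstract; the real STM replaces this presence vector with a spike sequence and the Hebbian step with an STDP rule.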
Affiliation(s)
- Marcin Białas
- National Information Processing Institute, al. Niepodległości 188b, 00-608, Warsaw, Poland.
- Jacek Mańdziuk
- Faculty of Mathematics and Information Science, Warsaw University of Technology, Warsaw, Poland.
4
Chini M, Hnida M, Kostka JK, Chen YN, Hanganu-Opatz IL. Preconfigured architecture of the developing mouse brain. Cell Rep 2024;43:114267. [PMID: 38795344] [DOI: 10.1016/j.celrep.2024.114267]
Abstract
In the adult brain, structural and functional parameters, such as synaptic sizes and neuronal firing rates, follow right-skewed and heavy-tailed distributions. While this organization is thought to have significant implications, its development is still largely unknown. Here, we address this knowledge gap by investigating a large-scale dataset recorded from the prefrontal cortex and the olfactory bulb of mice aged 4-60 postnatal days. We show that firing rates and spike train interactions have a largely stable distribution shape throughout the first 60 postnatal days and that the prefrontal cortex displays a functional small-world architecture. Moreover, early brain activity exhibits an oligarchical organization, where high-firing neurons have hub-like properties. In a neural network model, we show that analogously right-skewed and heavy-tailed synaptic parameters are instrumental to consistently recapitulate the experimental data. Thus, functional and structural parameters in the developing brain already follow wide, heavy-tailed distributions, suggesting that this organization is preconfigured rather than experience dependent.
Affiliation(s)
- Mattia Chini
- Institute of Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany.
- Marilena Hnida
- Institute of Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Johanna K Kostka
- Institute of Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Yu-Nan Chen
- Institute of Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Ileana L Hanganu-Opatz
- Institute of Developmental Neurophysiology, Center for Molecular Neurobiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
5
Beaubois R, Cheslet J, Duenki T, De Venuto G, Carè M, Khoyratee F, Chiappalone M, Branchereau P, Ikeuchi Y, Levi T. BiœmuS: A new tool for neurological disorders studies through real-time emulation and hybridization using biomimetic Spiking Neural Network. Nat Commun 2024;15:5142. [PMID: 38902236] [PMCID: PMC11190274] [DOI: 10.1038/s41467-024-48905-x]
Abstract
Characterization and modeling of biological neural networks has emerged as a field driving significant advancements in our understanding of brain function and related pathologies. As of today, pharmacological treatments for neurological disorders remain limited, pushing the exploration of promising alternative approaches such as electroceuticals. Recent research in bioelectronics and neuromorphic engineering has fostered the development of a new generation of neuroprostheses for brain repair. However, achieving their full potential necessitates a deeper understanding of biohybrid interactions. In this study, we present a novel biomimetic, cost-effective and user-friendly neural network platform capable of real-time emulation for biohybrid experiments. Our system facilitates the investigation and replication of biophysically detailed neural network dynamics while prioritizing cost-efficiency, flexibility and ease of use. We showcase the feasibility of conducting biohybrid experiments using standard biophysical interfaces and a variety of biological cells, as well as real-time emulation of diverse network configurations. We envision our system as a crucial step towards the development of neuromorphic-based neuroprostheses for bioelectrical therapeutics, enabling seamless communication with biological networks on a comparable timescale. Its embedded real-time functionality enhances practicality and accessibility, amplifying its potential for real-world applications in biohybrid experiments.
Affiliation(s)
- Romain Beaubois
- IMS, CNRS UMR5218, Bordeaux INP, University of Bordeaux, Talence, France
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
- LIMMS, CNRS-Institute of Industrial Science, UMI 2820, The University of Tokyo, Tokyo, Japan
- Jérémy Cheslet
- IMS, CNRS UMR5218, Bordeaux INP, University of Bordeaux, Talence, France
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
- LIMMS, CNRS-Institute of Industrial Science, UMI 2820, The University of Tokyo, Tokyo, Japan
- Tomoya Duenki
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
- LIMMS, CNRS-Institute of Industrial Science, UMI 2820, The University of Tokyo, Tokyo, Japan
- Department of Chemistry and Biotechnology, Graduate School of Engineering, The University of Tokyo, Tokyo, Japan
- Institute for AI and Beyond, The University of Tokyo, Tokyo, Japan
- Marta Carè
- DIBRIS, University of Genova, Genova, Italy
- IRCCS Ospedale Policlinico San Martino, Genova, Italy
- Rehab Technologies, Istituto Italiano di Tecnologia, Genova, Italy
- Farad Khoyratee
- IMS, CNRS UMR5218, Bordeaux INP, University of Bordeaux, Talence, France
- Michela Chiappalone
- DIBRIS, University of Genova, Genova, Italy
- IRCCS Ospedale Policlinico San Martino, Genova, Italy
- Rehab Technologies, Istituto Italiano di Tecnologia, Genova, Italy
- Yoshiho Ikeuchi
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
- LIMMS, CNRS-Institute of Industrial Science, UMI 2820, The University of Tokyo, Tokyo, Japan
- Institute for AI and Beyond, The University of Tokyo, Tokyo, Japan
- Timothée Levi
- IMS, CNRS UMR5218, Bordeaux INP, University of Bordeaux, Talence, France.
6
Wei J, Li L, Zhang J, Shi E, Yang J, Liu X. Computational Modeling of the Prefrontal-Cingulate Cortex to Investigate the Role of Coupling Relationships for Balancing Emotion and Cognition. Neurosci Bull 2024. [PMID: 38869704] [DOI: 10.1007/s12264-024-01246-7]
Abstract
Within the prefrontal-cingulate cortex, abnormalities in coupling between neuronal networks can disturb emotion-cognition interactions, contributing to the development of mental disorders such as depression. Despite this understanding, the neural circuit mechanisms underlying this phenomenon remain elusive. In this study, we present a biophysical computational model encompassing three crucial regions: the dorsolateral prefrontal cortex, the subgenual anterior cingulate cortex, and the ventromedial prefrontal cortex. The objective is to investigate the role of coupling relationships within the prefrontal-cingulate cortex networks in balancing emotional and cognitive processes. The numerical results confirm that coupling weights play a crucial role in the balance of emotional-cognitive networks. Furthermore, our model predicts a pathogenic mechanism of depression arising from abnormalities in the subgenual cortex and shows that network functionality can be restored through intervention in the dorsolateral prefrontal cortex. This study uses computational modeling to provide mechanistic insight relevant to the diagnosis and treatment of depression.
Affiliation(s)
- Jinzhao Wei
- Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, 071000, China
- College of Electronic Information Engineering, Hebei University, Baoding, 071000, China
- Licong Li
- Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, 071000, China.
- College of Electronic Information Engineering, Hebei University, Baoding, 071000, China.
- Jiayi Zhang
- Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, 071000, China
- College of Electronic Information Engineering, Hebei University, Baoding, 071000, China
- Erdong Shi
- Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, 071000, China
- College of Electronic Information Engineering, Hebei University, Baoding, 071000, China
- Jianli Yang
- Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, 071000, China
- College of Electronic Information Engineering, Hebei University, Baoding, 071000, China
- Xiuling Liu
- Key Laboratory of Digital Medical Engineering of Hebei Province, Hebei University, Baoding, 071000, China.
- College of Electronic Information Engineering, Hebei University, Baoding, 071000, China.
7
de Brito Van Velze M, Dhanasobhon D, Martinez M, Morabito A, Berthaux E, Pinho CM, Zerlaut Y, Rebola N. Feedforward and disinhibitory circuits differentially control activity of cortical somatostatin interneurons during behavioral state transitions. Cell Rep 2024;43:114197. [PMID: 38733587] [DOI: 10.1016/j.celrep.2024.114197]
Abstract
Interneurons (INs), specifically those in disinhibitory circuits like somatostatin (SST) and vasoactive intestinal peptide (VIP)-INs, are strongly modulated by the behavioral context. Yet, the mechanisms by which these INs are recruited during active states and whether their activity is consistent across sensory cortices remain unclear. We now report that in mice, locomotor activity strongly recruits SST-INs in the primary somatosensory (S1) but not the visual (V1) cortex. This diverse engagement of SST-INs cannot be explained by differences in VIP-IN function but is absent in the presence of visual input, suggesting the involvement of feedforward sensory pathways. Accordingly, inactivating the somatosensory thalamus, but not decreasing VIP-IN activity, significantly reduces the modulation of SST-INs by locomotion. Model simulations suggest that the differences in SST-INs across behavioral states can be explained by varying ratios of VIP- and thalamus-driven activity. By integrating feedforward activity with neuromodulation, SST-INs are anticipated to be crucial for adapting sensory processing to behavioral states.
Affiliation(s)
- Marcel de Brito Van Velze
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France
- Dhanasak Dhanasobhon
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France
- Marie Martinez
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France
- Annunziato Morabito
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France
- Emmanuelle Berthaux
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France
- Cibele Martins Pinho
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France
- Yann Zerlaut
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France.
- Nelson Rebola
- ICM, Paris Brain Institute, Hôpital de la Pitié-Salpêtrière, Sorbonne Université, INSERM, CNRS, 75013 Paris, France.
8
Ni S, Harris B, Gong P. Distributed and dynamical communication: a mechanism for flexible cortico-cortical interactions and its functional roles in visual attention. Commun Biol 2024;7:550. [PMID: 38719883] [PMCID: PMC11078951] [DOI: 10.1038/s42003-024-06228-z]
Abstract
Perceptual and cognitive processing relies on flexible communication among cortical areas; however, the underlying neural mechanism remains unclear. Here we report a mechanism based on the realistic spatiotemporal dynamics of propagating wave patterns in neural population activity. Using a biophysically plausible, multiarea spiking neural circuit model, we demonstrate that these wave patterns, characterized by their rich and complex dynamics, can account for a wide variety of empirically observed neural processes. The coordinated interactions of these wave patterns give rise to distributed and dynamic communication (DDC) that enables flexible and rapid routing of neural activity across cortical areas. We elucidate how DDC unifies the previously proposed oscillation synchronization-based and subspace-based views of interareal communication, offering experimentally testable predictions that we validate through the analysis of Allen Institute Neuropixels data. Furthermore, we demonstrate that DDC can be effectively modulated during attention tasks through the interplay of neuromodulators and cortical feedback loops. This modulation process explains many neural effects of attention, underscoring the fundamental functional role of DDC in cognition.
Affiliation(s)
- Shencong Ni
- School of Physics, University of Sydney, Sydney, NSW, Australia
- Brendan Harris
- School of Physics, University of Sydney, Sydney, NSW, Australia
- Pulin Gong
- School of Physics, University of Sydney, Sydney, NSW, Australia.
9
Zeldenrust F, Calcini N, Yan X, Bijlsma A, Celikel T. The tuning of tuning: How adaptation influences single cell information transfer. PLoS Comput Biol 2024;20:e1012043. [PMID: 38739640] [PMCID: PMC11115315] [DOI: 10.1371/journal.pcbi.1012043]
Abstract
Sensory neurons reconstruct the world from action potentials (spikes) impinging on them. To effectively transfer information about the stimulus to the next processing level, a neuron needs to be able to adapt its working range to the properties of the stimulus. Here, we focus on the intrinsic neural properties that influence information transfer in cortical neurons and how tightly their properties need to be tuned to the stimulus statistics for them to be effective. We start by measuring the intrinsic information encoding properties of putative excitatory and inhibitory neurons in L2/3 of the mouse barrel cortex. Excitatory neurons show high thresholds and strong adaptation, making them fire sparsely and resulting in a strong compression of information, whereas inhibitory neurons that favour fast spiking transfer more information. Next, we turn to computational modelling and ask how two properties influence information transfer: 1) spike-frequency adaptation and 2) the shape of the IV-curve. We find that subthreshold (but not threshold) adaptation, the 'h-current', and a properly tuned leak conductance can increase the information transfer of a neuron, whereas threshold adaptation can increase its working range. Finally, we verify the effect of the IV-curve slope in our experimental recordings and show that excitatory neurons form a more heterogeneous population than inhibitory neurons. These previously unquantified relationships between intrinsic neural features and neural coding will aid computational, theoretical and systems neuroscientists in understanding how neuronal populations can alter their coding properties, such as through the impact of neuromodulators. Why the variability of intrinsic properties of excitatory neurons is larger than that of inhibitory ones is an exciting question for which future research is needed.
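The paper's full model is not reproduced here; a minimal sketch (all constants invented) of one of its themes, that spike-triggered adaptation makes a neuron fire more sparsely for the same drive, can be written with a leaky integrate-and-fire neuron carrying a slow adaptation current:

```python
def lif_spike_count(i_in, adapt_inc, t_max=500.0, dt=1.0,
                    tau_m=20.0, tau_a=100.0, v_th=1.0):
    """Leaky integrate-and-fire neuron with an optional spike-triggered
    adaptation current; a crude stand-in for the adaptation mechanisms
    discussed in the paper (all constants invented)."""
    v, a, n_spikes = 0.0, 0.0, 0
    for _ in range(int(t_max / dt)):
        v += dt / tau_m * (-v + i_in - a)  # membrane integration
        a += dt / tau_a * (-a)             # adaptation current decays
        if v >= v_th:
            v = 0.0                        # reset after a spike
            a += adapt_inc                 # spike-frequency adaptation kicks in
            n_spikes += 1
    return n_spikes

n_plain = lif_spike_count(1.5, adapt_inc=0.0)
n_adapt = lif_spike_count(1.5, adapt_inc=0.5)
```

The adapting neuron fires more sparsely for the same input, compressing its output much as the strongly adapting excitatory neurons recorded in the paper do.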
Affiliation(s)
- Fleur Zeldenrust
- Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, the Netherlands
- Niccolò Calcini
- Maastricht Centre for Systems Biology (MaCSBio), University of Maastricht, Maastricht, the Netherlands
- Xuan Yan
- Institute of Neuroscience, Chinese Academy of Sciences, Beijing, China
- Ate Bijlsma
- Department of Population Health Sciences / Department of Biology, Universiteit Utrecht, the Netherlands
- Tansu Celikel
- School of Psychology, Georgia Institute of Technology, Atlanta, GA, United States of America
10
Albesa-González A, Clopath C. Learning with filopodia and spines: Complementary strong and weak competition lead to specialized, graded, and protected receptive fields. PLoS Comput Biol 2024;20:e1012110. [PMID: 38743789] [PMCID: PMC11125506] [DOI: 10.1371/journal.pcbi.1012110]
Abstract
Filopodia are thin synaptic protrusions that have long been known to play an important role in early development. Recently, they have been found to be more abundant in the adult cortex than previously thought, and more plastic than spines (button-shaped mature synapses). Inspired by these findings, we introduce a new model of synaptic plasticity that jointly describes learning of filopodia and spines. The model assumes that filopodia exhibit strongly competitive learning dynamics, similar to additive spike-timing-dependent plasticity (STDP). At the same time, it proposes that, if filopodia undergo sufficient potentiation, they consolidate into spines. Spines follow weakly competitive learning, classically associated with multiplicative, soft-bounded models of STDP. This makes spines more stable and sensitive to the fine structure of input correlations. We show that our learning rule has a selectivity comparable to additive STDP and captures input correlations as well as multiplicative models of STDP. We also show how it can protect previously formed memories and perform synaptic consolidation. Overall, our results can be seen as a phenomenological description of how filopodia and spines could cooperate to overcome the individual difficulties faced by strong and weak competition mechanisms.
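The distinction between strong (additive) and weak (multiplicative) competition can be sketched numerically. The toy below is not the paper's model (correlations, rates, and event statistics are invented); it applies the two classic STDP weight-dependence rules to the same stream of potentiating and depressing events, so that additive updates drive weights to the bounds while multiplicative updates settle at graded interior values:

```python
import numpy as np

rng = np.random.default_rng(1)

def stdp_run(mode, n_steps=20000, n_syn=50, a_plus=0.01, a_minus=0.0105):
    """Each step, every synapse sees a potentiating (pre-before-post) event
    with probability set by its input correlation, otherwise a depressing one."""
    w = np.full(n_syn, 0.5)
    p_corr = np.linspace(0.3, 0.7, n_syn)      # graded input correlations
    for _ in range(n_steps):
        pot = rng.random(n_syn) < p_corr
        if mode == "additive":                 # hard-bounded, strongly competitive
            w += np.where(pot, a_plus, -a_minus)
        else:                                  # multiplicative: updates scale with w
            w += np.where(pot, a_plus * (1 - w), -a_minus * w)
        w = np.clip(w, 0.0, 1.0)
    return w

w_add = stdp_run("additive")        # mostly saturated near 0 or 1
w_mul = stdp_run("multiplicative")  # graded, tracks correlation strength
```

The bimodal additive outcome is the "specialized" regime the paper associates with filopodia, and the graded multiplicative outcome is the correlation-tracking regime it associates with spines.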
Affiliation(s)
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, United Kingdom
11
Ma D, Jin X, Sun S, Li Y, Wu X, Hu Y, Yang F, Tang H, Zhu X, Lin P, Pan G. Darwin3: a large-scale neuromorphic chip with a novel ISA and on-chip learning. Natl Sci Rev 2024;11:nwae102. [PMID: 38689713] [PMCID: PMC11060491] [DOI: 10.1093/nsr/nwae102]
Abstract
Spiking neural networks (SNNs) are gaining increasing attention for their biological plausibility and potential for improved computational efficiency. To match the rich spatiotemporal dynamics of SNNs, neuromorphic chips are highly desirable for executing SNNs directly in hardware-based neuron and synapse circuits. This paper presents a large-scale neuromorphic chip named Darwin3 with a novel instruction set architecture, which comprises 10 primary instructions and a few extended instructions. It supports flexible neuron model programming and local learning rule designs. The Darwin3 chip architecture is designed as a mesh of computing nodes with an innovative routing algorithm. We used a compression mechanism to represent synaptic connections, significantly reducing memory usage. The Darwin3 chip supports up to 2.35 million neurons, making it the largest of its kind in terms of neuron count. The experimental results showed that code density was improved by up to 28.3× in Darwin3, and that neuron core fan-in and fan-out were improved by up to 4096× and 3072× by connection compression compared to the physical memory depth. Our Darwin3 chip also provided memory savings between 6.8× and 200.8× when mapping convolutional spiking neural networks onto the chip, demonstrating state-of-the-art performance in accuracy and latency compared to other neuromorphic chips.
Affiliation(s)
- De Ma
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- The State Key Lab of Brain-Machine Intelligence, Zhejiang University, Hangzhou 310027, China
- MOE Frontier Science Center for Brain Science and Brain-machine Integration, Zhejiang University, Hangzhou 310027, China
- Xiaofei Jin
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- Shichun Sun
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- Yitao Li
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- The State Key Lab of Brain-Machine Intelligence, Zhejiang University, Hangzhou 310027, China
- Xundong Wu
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- Youneng Hu
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- Fangchao Yang
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- Huajin Tang
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- The State Key Lab of Brain-Machine Intelligence, Zhejiang University, Hangzhou 310027, China
- MOE Frontier Science Center for Brain Science and Brain-machine Integration, Zhejiang University, Hangzhou 310027, China
- Xiaolei Zhu
- College of Micro-Nano Electronics, Zhejiang University, Hangzhou 311200, China
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- Peng Lin
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- The State Key Lab of Brain-Machine Intelligence, Zhejiang University, Hangzhou 310027, China
- MOE Frontier Science Center for Brain Science and Brain-machine Integration, Zhejiang University, Hangzhou 310027, China
- Gang Pan
- College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
- Research Center for Intelligent Computing Hardware, Zhejiang Lab, Hangzhou 311121, China
- The State Key Lab of Brain-Machine Intelligence, Zhejiang University, Hangzhou 310027, China
- MOE Frontier Science Center for Brain Science and Brain-machine Integration, Zhejiang University, Hangzhou 310027, China
12
Fida AA, Mittal S, Khanday FA. Mott memristor based stochastic neurons for probabilistic computing. Nanotechnology 2024;35:295201. [PMID: 38593756] [DOI: 10.1088/1361-6528/ad3c4b]
Abstract
Many studies suggest that probabilistic spiking in biological neural systems is beneficial, as it aids learning and provides Bayesian inference-like dynamics. If appropriately utilised, noise and stochasticity in nanoscale devices can benefit neuromorphic systems. In this paper, we build a stochastic leaky integrate-and-fire (LIF) neuron, utilising a Mott memristor's inherent stochastic switching dynamics. We demonstrate that the developed LIF neuron is capable of reproducing biological neural dynamics. We leverage these characteristics of the proposed LIF neuron by integrating it into a population-coded spiking neural network and a spiking restricted Boltzmann machine (sRBM), thereby showcasing its ability to implement probabilistic learning and inference. The sRBM achieves a software-comparable accuracy of 87.13%. Unlike CMOS-based probabilistic neurons, our design does not require any external noise sources. The designed neurons are highly energy efficient and ultra-compact, requiring only three components: a resistor, a capacitor and a memristor device.
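The Mott device physics is not modeled here; as an invented software analogue of the idea, the sketch below replaces the LIF neuron's hard threshold with a sigmoidal escape probability, so firing near threshold is stochastic rather than deterministic:

```python
import numpy as np

rng = np.random.default_rng(42)

def stochastic_lif_rate(i_in, beta=4.0, n_steps=5000, dt=1.0,
                        tau_m=20.0, v_th=1.0):
    """LIF neuron whose firing is probabilistic near threshold: the escape
    probability is a sigmoid of the membrane potential, a software stand-in
    for stochastic threshold switching (all constants invented)."""
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt / tau_m * (-v + i_in)                      # leaky integration
        p_fire = 1.0 / (1.0 + np.exp(-beta * (v - v_th)))  # soft threshold
        if rng.random() < p_fire:
            v = 0.0                                        # reset on a spike
            spikes += 1
    return spikes / n_steps

# Firing probability grows smoothly with drive, even below the hard threshold
rates = [stochastic_lif_rate(i) for i in (0.6, 1.0, 1.4)]
```

A drive of 0.6 would never fire a deterministic LIF neuron with threshold 1.0; here it fires at a nonzero rate, which is the kind of graded, noisy response exploited for sampling-based probabilistic computing such as the sRBM.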
Affiliation(s)
- Aabid Amin Fida
- Electronics and Communication Engineering, Indian Institute of Technology, Roorkee, Uttarakhand, India
- Sparsh Mittal
- Electronics and Communication Engineering, Indian Institute of Technology, Roorkee, Uttarakhand, India
- Farooq Ahmad Khanday
- Electronics and Instrumentation Technology, University of Kashmir, Srinagar, J&K, India
13
Hong R, Zheng T, Marra V, Yang D, Liu JK. Multi-scale modelling of the epileptic brain: advantages of computational therapy exploration. J Neural Eng 2024; 21:021002. [PMID: 38621378 DOI: 10.1088/1741-2552/ad3eb4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2023] [Accepted: 04/15/2024] [Indexed: 04/17/2024]
Abstract
Objective: Epilepsy is a complex disease spanning multiple scales, from ion channels in neurons to neuronal circuits across the entire brain. Over the past decades, computational models have been used to describe the pathophysiological activity of the epileptic brain from different perspectives. Traditionally, each computational model can aid in optimizing therapeutic interventions, thereby providing a particular view for designing strategies to treat epilepsy. As a result, most studies focus on generating specific models of the epileptic brain that help us understand particular machinery of the pathological state. These specific models vary in complexity and biological accuracy, with system-level models often lacking biological detail. Approach: Here, we review various types of computational models of epilepsy and discuss their potential for different therapeutic approaches and scenarios, including drug discovery, surgical strategies, brain stimulation, and seizure prediction. We propose that an integrated approach with a unified modelling framework across multiple scales is needed to understand the epileptic brain. Our proposal is based on the recent increase in computational power, which has opened up the possibility of unifying these specific epileptic models into simulations with an unprecedented level of detail. Main results: A multi-scale epilepsy model can bridge the gap between biologically detailed models, used to address molecular and cellular questions, and abstract brain-wide models, which can account for complex neurological and behavioural observations. Significance: With these efforts, we move toward the next generation of epileptic brain models, capable of connecting cellular features such as ion channel properties with standard clinical measures such as seizure severity.
Affiliation(s)
- Rongqi Hong
- School of Computer Science, Centre for Human Brain Health, University of Birmingham, Birmingham, United Kingdom
- Tingting Zheng
- School of Computer Science, Centre for Human Brain Health, University of Birmingham, Birmingham, United Kingdom
- Dongping Yang
- Research Centre for Frontier Fundamental Studies, Zhejiang Lab, Hangzhou, People's Republic of China
- Jian K Liu
- School of Computer Science, Centre for Human Brain Health, University of Birmingham, Birmingham, United Kingdom
14
El Srouji L, Abdelghany M, Ambethkar HR, Lee YJ, Berkay On M, Yoo SJB. Perspective: an optoelectronic future for heterogeneous, dendritic computing. Front Neurosci 2024; 18:1394271. [PMID: 38699677 PMCID: PMC11064649 DOI: 10.3389/fnins.2024.1394271] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2024] [Accepted: 04/08/2024] [Indexed: 05/05/2024] Open
Abstract
With the increasing number of applications reliant on large neural network models, the pursuit of more suitable computing architectures is becoming increasingly relevant. Progress toward co-integrated silicon photonic and CMOS circuits provides new opportunities for computing architectures with high bandwidth optical networks and high-speed computing. In this paper, we discuss trends in neuromorphic computing architecture and outline an optoelectronic future for heterogeneous, dendritic neuromorphic computing.
Affiliation(s)
- S. J. Ben Yoo
- Department of Electrical and Computer Engineering, University of California, Davis, Davis, CA, United States
15
Miedema R, Strydis C. ExaFlexHH: an exascale-ready, flexible multi-FPGA library for biologically plausible brain simulations. Front Neuroinform 2024; 18:1330875. [PMID: 38680548 PMCID: PMC11045893 DOI: 10.3389/fninf.2024.1330875] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/31/2023] [Accepted: 02/05/2024] [Indexed: 05/01/2024] Open
Abstract
Introduction: In silico simulations are a powerful tool in modern neuroscience for enhancing our understanding of complex brain systems at various physiological levels. To model biologically realistic and detailed systems, an ideal simulation platform must possess: (1) high performance and performance scalability, (2) flexibility, and (3) ease of use for non-technical users. However, most existing platforms and libraries do not meet all three criteria, particularly for complex models such as the Hodgkin-Huxley (HH) model or for complex neuron-connectivity modeling such as gap junctions. Methods: This work introduces ExaFlexHH, an exascale-ready, flexible library for simulating HH models on multi-FPGA platforms. Utilizing FPGA-based Data-Flow Engines (DFEs) and the dataflow programming paradigm, ExaFlexHH addresses all three requirements. The library is also parameterizable and compliant with NeuroML, a prominent brain-description language in computational neuroscience. We demonstrate the performance scalability of the platform by implementing a highly demanding extended Hodgkin-Huxley (eHH) model of the inferior olive using ExaFlexHH. Results: Model simulation results show linear scalability for unconnected networks and near-linear scalability for networks with complex synaptic plasticity, with a 1.99× performance increase using two FPGAs compared to a single-FPGA simulation, and 7.96× when using eight FPGAs in a scalable ring topology. Notably, our results also reveal consistent performance efficiency in GFLOPS per watt, further facilitating exascale-ready computing speeds and pushing the boundaries of future brain-simulation platforms. Discussion: The ExaFlexHH library shows superior resource efficiency, quantified in FLOPS per hardware resource, benchmarked against other competitive FPGA-based brain-simulation implementations.
Affiliation(s)
- Rene Miedema
- Department of Neuroscience, Erasmus Medical Center, Rotterdam, Netherlands
- Christos Strydis
- Department of Neuroscience, Erasmus Medical Center, Rotterdam, Netherlands
- Quantum and Computer Engineering Department, Delft University of Technology, Delft, Netherlands
16
Fitz H, Hagoort P, Petersson KM. Neurobiological Causal Models of Language Processing. NEUROBIOLOGY OF LANGUAGE (CAMBRIDGE, MASS.) 2024; 5:225-247. [PMID: 38645618 PMCID: PMC11025648 DOI: 10.1162/nol_a_00133] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/29/2022] [Accepted: 12/18/2023] [Indexed: 04/23/2024]
Abstract
The language faculty is physically realized in the neurobiological infrastructure of the human brain. Despite significant efforts, an integrated understanding of this system remains a formidable challenge. What is missing from most theoretical accounts is a specification of the neural mechanisms that implement language function. Computational models that have been put forward generally lack an explicit neurobiological foundation. We propose a neurobiologically informed causal modeling approach that offers a framework for bridging this gap. A neurobiological causal model is a mechanistic description of language processing that is grounded in, and constrained by, the characteristics of the neurobiological substrate. It aims to model the generators of language behavior at the level of implementational causality. We describe key features and neurobiological components from which causal models can be built and provide guidelines on how to implement them in model simulations. We then outline how this approach can shed new light on the core computational machinery for language, the long-term storage of words in the mental lexicon, and combinatorial processing in sentence comprehension. In contrast to cognitive theories of behavior, causal models are formulated in the "machine language" of neurobiology, which is universal to human cognition. We argue that neurobiological causal modeling should be pursued in addition to existing approaches. Eventually, this approach will allow us to develop an explicit computational neurobiology of language.
Affiliation(s)
- Hartmut Fitz
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Peter Hagoort
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Karl Magnus Petersson
- Neurobiology of Language Department, Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Faculty of Medicine and Biomedical Sciences, University of Algarve, Faro, Portugal
17
Kuroki S, Mizuseki K. CA3 Circuit Model Compressing Sequential Information in Theta Oscillation and Replay. Neural Comput 2024; 36:501-548. [PMID: 38457750 DOI: 10.1162/neco_a_01641] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2023] [Accepted: 11/20/2023] [Indexed: 03/10/2024]
Abstract
The hippocampus plays a critical role in the compression and retrieval of sequential information. During wakefulness, it achieves this through theta phase precession and theta sequences. Subsequently, during periods of sleep or rest, the compressed information reactivates through sharp-wave ripple events, manifesting as memory replay. However, how these sequential neuronal activities are generated and how they store information about the external environment remain unknown. We developed a hippocampal cornu ammonis 3 (CA3) computational model based on anatomical and electrophysiological evidence from the biological CA3 circuit to address these questions. The model comprises theta rhythm inhibition, place input, and CA3-CA3 plastic recurrent connection. The model can compress the sequence of the external inputs, reproduce theta phase precession and replay, learn additional sequences, and reorganize previously learned sequences. A gradual increase in synaptic inputs, controlled by interactions between theta-paced inhibition and place inputs, explained the mechanism of sequence acquisition. This model highlights the crucial role of plasticity in the CA3 recurrent connection and theta oscillational dynamics and hypothesizes how the CA3 circuit acquires, compresses, and replays sequential information.
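The acquisition mechanism the abstract proposes, place-field excitation ramping up against theta-paced inhibition, can be illustrated with a toy calculation. This is not the authors' CA3 model; every parameter below is hypothetical. As the excitatory ramp grows, the first threshold crossing in each theta cycle occurs at an earlier phase, which is the signature of theta phase precession:

```python
import math

def precession_phases(n_cycles=8, steps_per_cycle=360):
    """Phase (in degrees) at which ramping excitation first exceeds
    theta-paced inhibition in each theta cycle."""
    phases = []
    total = n_cycles * steps_per_cycle
    for c in range(n_cycles):
        fired_phase = None
        for s in range(steps_per_cycle):
            t = c * steps_per_cycle + s
            # Sinusoidal inhibition, maximal at phase 0 of each cycle
            theta_inh = 0.5 * (1.0 + math.cos(2 * math.pi * s / steps_per_cycle))
            place_exc = t / total  # place-field input ramping from 0 to 1
            if place_exc > theta_inh:
                fired_phase = s    # first threshold crossing this cycle
                break
        phases.append(fired_phase)
    return phases

phases = precession_phases()
```

`phases` holds one crossing phase per cycle; it decreases monotonically across cycles, i.e. firing advances to earlier theta phases as the animal traverses the place field.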
Affiliation(s)
- Satoshi Kuroki
- Department of Physiology, Graduate School of Medicine, Osaka Metropolitan University, Osaka, 545-8585, Japan
- Kenji Mizuseki
- Department of Physiology, Graduate School of Medicine, Osaka Metropolitan University, Osaka, 545-8585, Japan
18
Venkadesh S, Shaikh A, Shakeri H, Barreto E, Van Horn JD. Biophysical modulation and robustness of itinerant complexity in neuronal networks. FRONTIERS IN NETWORK PHYSIOLOGY 2024; 4:1302499. [PMID: 38516614 PMCID: PMC10954887 DOI: 10.3389/fnetp.2024.1302499] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/26/2023] [Accepted: 02/26/2024] [Indexed: 03/23/2024]
Abstract
Transient synchronization of bursting activity in neuronal networks, which occurs in patterns of metastable itinerant phase relationships between neurons, is a notable feature of network dynamics observed in vivo. However, the mechanisms that contribute to this dynamical complexity in neuronal circuits are not well understood. Local circuits in cortical regions consist of populations of neurons with diverse intrinsic oscillatory features. In this study, we numerically show that the phenomenon of transient synchronization, also referred to as metastability, can emerge in an inhibitory neuronal population when the neurons' intrinsic fast-spiking dynamics are appropriately modulated by slower inputs from an excitatory neuronal population. Using a compact model of a mesoscopic-scale network consisting of excitatory pyramidal and inhibitory fast-spiking neurons, our work demonstrates a relationship between the frequency of pyramidal population oscillations and the features of emergent metastability in the inhibitory population. In addition, we introduce a method to characterize collective transitions in metastable networks. Finally, we discuss potential applications of this study in mechanistically understanding cortical network dynamics.
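The central quantity here, how strongly a population's phases align, is commonly tracked with the Kuramoto order parameter. The sketch below is a generic phase-oscillator illustration of coupling-dependent collective synchrony, not the authors' conductance-based excitatory-inhibitory model; the oscillator count, frequency spread, and coupling values are arbitrary.

```python
import cmath
import math

def order_parameter(phases):
    """Kuramoto order parameter R in [0, 1]; R = 1 means full synchrony."""
    return abs(sum(cmath.exp(1j * p) for p in phases)) / len(phases)

def simulate(coupling, n=20, steps=4000, dt=0.01):
    """Euler-integrated Kuramoto network with a fixed spread of natural
    frequencies; returns the order parameter at the final step."""
    freqs = [1.0 + 0.05 * (i - n / 2) for i in range(n)]
    phases = [2 * math.pi * i / n for i in range(n)]  # start desynchronized
    for _ in range(steps):
        r = sum(cmath.exp(1j * p) for p in phases) / n  # mean field
        phases = [p + dt * (w + coupling * abs(r) *
                            math.sin(cmath.phase(r) - p))
                  for p, w in zip(phases, freqs)]
    return order_parameter(phases)

# Weak coupling leaves the population incoherent; strong coupling locks it.
weak, strong = simulate(0.1), simulate(3.0)
```

Characterizing *transient* synchrony, as in the paper, amounts to tracking how such an order parameter rises and falls over time rather than its final value.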
Affiliation(s)
- Siva Venkadesh
- Department of Psychology, University of Virginia, Charlottesville, VA, United States
- Asmir Shaikh
- Department of Computer Science, University of Virginia, Charlottesville, VA, United States
- Heman Shakeri
- School of Data Science, University of Virginia, Charlottesville, VA, United States
- Biomedical Engineering, University of Virginia, Charlottesville, VA, United States
- Ernest Barreto
- Department of Physics and Astronomy and the Interdisciplinary Program in Neuroscience, George Mason University, Fairfax, VA, United States
- John Darrell Van Horn
- Department of Psychology, University of Virginia, Charlottesville, VA, United States
- School of Data Science, University of Virginia, Charlottesville, VA, United States
19
Arreguit J, Ramalingasetty ST, Ijspeert A. FARMS: Framework for Animal and Robot Modeling and Simulation. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2024:2023.09.25.559130. [PMID: 38293071 PMCID: PMC10827226 DOI: 10.1101/2023.09.25.559130] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/01/2024]
Abstract
The study of animal locomotion and neuromechanical control offers valuable insights for advancing research in neuroscience, biomechanics, and robotics. We have developed FARMS (Framework for Animal and Robot Modeling and Simulation), an open-source, interdisciplinary framework designed to facilitate access to neuromechanical simulations for the modeling, simulation, and analysis of animal locomotion and bio-inspired robotic systems. By providing an accessible and user-friendly platform, FARMS aims to lower the barriers for researchers to explore the complex interactions between the nervous system, musculoskeletal structures, and their environment. Integrating the MuJoCo physics engine in a modular manner, FARMS enables realistic simulations and fosters collaboration among neuroscientists, biologists, and roboticists. FARMS has already been extensively used to study locomotion in animals such as mice, Drosophila, fish, salamanders, and centipedes, serving as a platform to investigate the role of central pattern generators and sensory feedback. This article provides an overview of the FARMS framework, discusses its interdisciplinary approach, showcases its versatility through specific case studies, and highlights its effectiveness in advancing our understanding of locomotion. In particular, we show how we used FARMS to study amphibious locomotion, presenting experimental demonstrations across morphologies and environments based on neural controllers with central pattern generators and sensory feedback circuit models. Overall, the goal of FARMS is to contribute to a deeper understanding of animal locomotion, to support the development of innovative bio-inspired robotic systems, and to promote accessibility in neuromechanical research.
Affiliation(s)
- Jonathan Arreguit
- BioRob, School of Engineering, Institute of Bioengineering, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Shravan Tata Ramalingasetty
- BioRob, School of Engineering, Institute of Bioengineering, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Department of Neurobiology and Anatomy, College of Medicine, Drexel University, Philadelphia, USA
- Auke Ijspeert
- BioRob, School of Engineering, Institute of Bioengineering, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
20
Aguirre F, Sebastian A, Le Gallo M, Song W, Wang T, Yang JJ, Lu W, Chang MF, Ielmini D, Yang Y, Mehonic A, Kenyon A, Villena MA, Roldán JB, Wu Y, Hsu HH, Raghavan N, Suñé J, Miranda E, Eltawil A, Setti G, Smagulova K, Salama KN, Krestinskaya O, Yan X, Ang KW, Jain S, Li S, Alharbi O, Pazos S, Lanza M. Hardware implementation of memristor-based artificial neural networks. Nat Commun 2024; 15:1974. [PMID: 38438350 PMCID: PMC10912231 DOI: 10.1038/s41467-024-45670-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/08/2023] [Accepted: 02/01/2024] [Indexed: 03/06/2024] Open
Abstract
Artificial Intelligence (AI) is currently experiencing a boom driven by deep learning (DL) techniques, which rely on networks of connected simple computing units operating in parallel. The low communication bandwidth between memory and processing units in conventional von Neumann machines does not support the requirements of emerging applications that rely extensively on large sets of data. More recent computing paradigms, such as high parallelization and near-memory computing, help alleviate the data-communication bottleneck to some extent, but paradigm-shifting concepts are required. Memristors, a novel beyond-CMOS (complementary metal-oxide-semiconductor) technology, are a promising choice for memory devices due to their unique intrinsic device-level properties, enabling both storage and computation with a small, massively parallel footprint at low power. Theoretically, this translates directly into a major boost in energy efficiency and computational throughput, but various practical challenges remain. In this work we review the latest efforts toward hardware-based memristive artificial neural networks (ANNs), describing in detail the working principles of each block and the different design alternatives with their respective advantages and disadvantages, as well as the tools required for accurate estimation of performance metrics. Ultimately, we aim to provide a comprehensive protocol on the materials and methods involved in memristive neural networks, both for those starting out in this field and for experts looking for a holistic approach.
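The core operation that memristive hardware accelerates is the analog vector-matrix multiply: weights are stored as device conductances, row voltages encode the input vector, and each column wire sums currents by Ohm's and Kirchhoff's laws. A minimal numerical sketch follows, assuming idealized devices with illustrative conductance values and ignoring non-idealities such as wire resistance or conductance drift:

```python
def crossbar_vmm(voltages, conductances):
    """Analog vector-matrix multiply in a memristive crossbar: the output
    current on each column is the sum over rows of V_row * G[row][col]."""
    n_rows, n_cols = len(conductances), len(conductances[0])
    assert len(voltages) == n_rows
    return [sum(voltages[r] * conductances[r][c] for r in range(n_rows))
            for c in range(n_cols)]

# Weights stored as device conductances (siemens, illustrative values)
G = [[1e-6, 2e-6],
     [3e-6, 4e-6]]
V = [0.2, 0.1]            # read voltages applied to the rows
I = crossbar_vmm(V, G)    # column currents realize I = V . G in one step
```

The attraction is that the multiply-accumulate happens in the analog domain, in place, in O(1) time per layer; the engineering challenges reviewed in the paper concern making this ideal picture hold on real devices.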
Affiliation(s)
- Fernando Aguirre
- Physical Science and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Departament d'Enginyeria Electrònica, Universitat Autònoma de Barcelona (UAB), 08193, Barcelona, Spain
- Wenhao Song
- Department of Electrical and Computer Engineering, University of Southern California (USC), Los Angeles, CA, 90089, USA
- Tong Wang
- Department of Electrical and Computer Engineering, University of Southern California (USC), Los Angeles, CA, 90089, USA
- J Joshua Yang
- Department of Electrical and Computer Engineering, University of Southern California (USC), Los Angeles, CA, 90089, USA
- Wei Lu
- Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI, 48109, USA
- Meng-Fan Chang
- Department of Electrical Engineering, National Tsing Hua University, Hsinchu, 30013, Taiwan
- Daniele Ielmini
- Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano and IUNET, Piazza L. da Vinci 32, 20133, Milano, Italy
- Yuchao Yang
- School of Electronic and Computer Engineering, Peking University, Shenzhen, China
- Adnan Mehonic
- Department of Electronic and Electrical Engineering, University College London (UCL), Torrington Place, WC1E 7JE, London, UK
- Anthony Kenyon
- Department of Electronic and Electrical Engineering, University College London (UCL), Torrington Place, WC1E 7JE, London, UK
- Marco A Villena
- Physical Science and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Juan B Roldán
- Departamento de Electrónica y Tecnología de Computadores, Facultad de Ciencias, Universidad de Granada, Avenida Fuentenueva s/n, 18071, Granada, Spain
- Yuting Wu
- Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI, 48109, USA
- Hung-Hsi Hsu
- Department of Electrical Engineering, National Tsing Hua University, Hsinchu, 30013, Taiwan
- Nagarajan Raghavan
- Engineering Product Development (EPD) Pillar, Singapore University of Technology & Design, 8 Somapah Road, 487372, Singapore, Singapore
- Jordi Suñé
- Departament d'Enginyeria Electrònica, Universitat Autònoma de Barcelona (UAB), 08193, Barcelona, Spain
- Enrique Miranda
- Departament d'Enginyeria Electrònica, Universitat Autònoma de Barcelona (UAB), 08193, Barcelona, Spain
- Ahmed Eltawil
- Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Gianluca Setti
- Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Kamilya Smagulova
- Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Khaled N Salama
- Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Olga Krestinskaya
- Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Xiaobing Yan
- Key Laboratory of Brain-Like Neuromorphic Devices and Systems of Hebei Province, Hebei University, Baoding, 071002, China
- Kah-Wee Ang
- Department of Electrical and Computer Engineering, College of Design and Engineering, National University of Singapore (NUS), Singapore, Singapore
- Samarth Jain
- Department of Electrical and Computer Engineering, College of Design and Engineering, National University of Singapore (NUS), Singapore, Singapore
- Sifan Li
- Department of Electrical and Computer Engineering, College of Design and Engineering, National University of Singapore (NUS), Singapore, Singapore
- Osamah Alharbi
- Physical Science and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Sebastian Pazos
- Physical Science and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
- Mario Lanza
- Physical Science and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia
21
Baravalle R, Canavier CC. Synchrony in Networks of Type 2 Interneurons Is More Robust to Noise with Hyperpolarizing Inhibition Compared to Shunting Inhibition in Both the Stochastic Population Oscillator and the Coupled Oscillator Regimes. eNeuro 2024; 11:ENEURO.0399-23.2024. [PMID: 38471777 PMCID: PMC10972736 DOI: 10.1523/eneuro.0399-23.2024] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2023] [Revised: 02/12/2024] [Accepted: 02/28/2024] [Indexed: 03/14/2024] Open
Abstract
Synchronization in the gamma band (25-150 Hz) is mediated by PV+ inhibitory interneurons, and evidence is accumulating for the essential role of gamma oscillations in cognition. Oscillations can arise in inhibitory networks via synaptic interactions between individual oscillatory neurons (mean-driven) or via strong recurrent inhibition that destabilizes the stationary background firing rate in the fluctuation-driven balanced state, causing an oscillation in the population firing rate. Previous theoretical work focused on model neurons with Hodgkin's Type 1 excitability (integrators) connected by current-based synapses. Here we show that networks composed of simple Type 2 oscillators (resonators) exhibit a supercritical Hopf bifurcation between synchrony and asynchrony and a gradual transition, via cycle skipping, from coupled oscillators to a stochastic population oscillator (SPO), as previously shown for Type 1. We extended our analysis to homogeneous networks with conductance-based rather than current-based synapses and found that networks with hyperpolarizing inhibitory synapses were more robust to noise than those with shunting synapses, in both the coupled-oscillator and SPO regimes. Assuming that reversal potentials are uniformly distributed between shunting and hyperpolarized values, as observed in one experimental study, converting all synapses to purely hyperpolarizing favored synchrony in all cases, whereas conversion to purely shunting synapses made synchrony less robust except at very high conductance strengths. In mature neurons the synaptic reversal potential is controlled by chloride cotransporters that regulate the intracellular concentrations of chloride and bicarbonate ions, suggesting these transporters as a potential therapeutic target for enhancing gamma synchrony and cognition.
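The hyperpolarizing-versus-shunting distinction comes down to where the synaptic reversal potential sits relative to the membrane potential, since a conductance-based synapse passes current I = g·(V − E_rev). A minimal numerical sketch, with illustrative values rather than the paper's parameters:

```python
def synaptic_current(v, g, e_rev):
    """Conductance-based synaptic current in pA (positive = outward,
    i.e. inhibitory), given voltage v (mV), conductance g (nS),
    and reversal potential e_rev (mV): I = g * (v - e_rev)."""
    return g * (v - e_rev)

v_rest = -65.0  # resting potential, mV (illustrative)
g = 2.0         # synaptic conductance, nS (illustrative)

# Hyperpolarizing synapse: reversal well below rest -> outward current
# even at rest, actively pulling the membrane down.
i_hyp = synaptic_current(v_rest, g, -75.0)

# Shunting synapse: reversal at rest -> no net current at rest; it
# inhibits only by raising total conductance (a divisive effect).
i_shunt = synaptic_current(v_rest, g, -65.0)
```

Both synapse types pass outward (inhibitory) current once the cell depolarizes above their reversal potential, but only the hyperpolarizing synapse injects inhibitory current at rest, which is the asymmetry underlying their different noise robustness in the study.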
Affiliation(s)
- Roman Baravalle
- Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center-New Orleans, New Orleans, Louisiana 70112
- Carmen C Canavier
- Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center-New Orleans, New Orleans, Louisiana 70112
22
Chou GM, Bush NE, Phillips RS, Baertsch NA, Harris KD. Modeling Effects of Variable preBötzinger Complex Network Topology and Cellular Properties on Opioid-Induced Respiratory Depression and Recovery. eNeuro 2024; 11:ENEURO.0284-23.2023. [PMID: 38253582 PMCID: PMC10921262 DOI: 10.1523/eneuro.0284-23.2023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2023] [Revised: 09/22/2023] [Accepted: 11/02/2023] [Indexed: 01/24/2024] Open
Abstract
The preBötzinger complex (preBötC), located in the medulla, is the essential rhythm-generating neural network for breathing. The actions of opioids on this network impair its ability to generate robust, rhythmic output, contributing to life-threatening opioid-induced respiratory depression (OIRD). The occurrence of OIRD varies across individuals and internal and external states, increasing the risk of opioid use, yet the mechanisms of this variability are largely unknown. In this study, we utilize a computational model of the preBötC to perform several in silico experiments exploring how differences in network topology and the intrinsic properties of preBötC neurons influence the sensitivity of the network rhythm to opioids. We find that rhythms produced by preBötC networks in silico exhibit variable responses to simulated opioids, similar to the preBötC network in vitro. This variability is primarily due to random differences in network topology and can be manipulated by imposed changes in network connectivity and intrinsic neuronal properties. Our results identify features of the preBötC network that may regulate its susceptibility to opioids.
Affiliation(s)
- Grant M Chou
- Department of Computer Science, Western Washington University, Bellingham, Washington 98225
- Nicholas E Bush
- Seattle Children's Research Institute, Center for Integrative Brain Research, Seattle, Washington 98101
- Ryan S Phillips
- Seattle Children's Research Institute, Center for Integrative Brain Research, Seattle, Washington 98101
- Nathan A Baertsch
- Seattle Children's Research Institute, Center for Integrative Brain Research, Seattle, Washington 98101
- Department of Pediatrics, University of Washington, Seattle, Washington 98195
- Department of Physiology and Biophysics, University of Washington, Seattle, Washington 98195
- Kameron Decker Harris
- Department of Computer Science, Western Washington University, Bellingham, Washington 98225
23
Zhang X, Dou Z, Kim SH, Upadhyay G, Havert D, Kang S, Kazemi K, Huang K, Aydin O, Huang R, Rahman S, Ellis‐Mohr A, Noblet HA, Lim KH, Chung HJ, Gritton HJ, Saif MTA, Kong HJ, Beggs JM, Gazzola M. Mind In Vitro Platforms: Versatile, Scalable, Robust, and Open Solutions to Interfacing with Living Neurons. ADVANCED SCIENCE (WEINHEIM, BADEN-WURTTEMBERG, GERMANY) 2024; 11:e2306826. [PMID: 38161217 PMCID: PMC10953569 DOI: 10.1002/advs.202306826] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/18/2023] [Revised: 12/12/2023] [Indexed: 01/03/2024]
Abstract
Motivated by the unexplored potential of in vitro neural systems for computing and by the corresponding need of versatile, scalable interfaces for multimodal interaction, an accurate, modular, fully customizable, and portable recording/stimulation solution that can be easily fabricated, robustly operated, and broadly disseminated is presented. This approach entails a reconfigurable platform that works across multiple industry standards and that enables a complete signal chain, from neural substrates sampled through micro-electrode arrays (MEAs) to data acquisition, downstream analysis, and cloud storage. Built-in modularity supports the seamless integration of electrical/optical stimulation and fluidic interfaces. Custom MEA fabrication leverages maskless photolithography, favoring the rapid prototyping of a variety of configurations, spatial topologies, and constitutive materials. Through a dedicated analysis and management software suite, the utility and robustness of this system are demonstrated across neural cultures and applications, including embryonic stem cell-derived and primary neurons, organotypic brain slices, 3D engineered tissue mimics, concurrent calcium imaging, and long-term recording. Overall, this technology, termed "mind in vitro" to underscore the computing inspiration, provides an end-to-end solution that can be widely deployed due to its affordable (>10× cost reduction) and open-source nature, catering to the expanding needs of both conventional and unconventional electrophysiology.
Affiliation(s)
- Xiaotian Zhang: Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Zhi Dou: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Seung Hyun Kim: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Gaurav Upadhyay: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Daniel Havert: Department of Physics, Indiana University Bloomington, Bloomington, IN 47405, USA
- Sehong Kang: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Kimia Kazemi: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Kai-Yu Huang: Department of Chemical and Biomolecular Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Onur Aydin: Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Raymond Huang: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Saeedur Rahman: Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Austin Ellis-Mohr: Department of Electrical and Computer Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Hayden A. Noblet: Molecular and Integrative Physiology and Neuroscience Program, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Ki H. Lim: Molecular and Integrative Physiology, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Hee Jung Chung: Carl R. Woese Institute for Genomic Biology; Molecular and Integrative Physiology; Neuroscience Program; and Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Howard J. Gritton: Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA; Department of Comparative Biosciences, University of Illinois at Urbana–Champaign, Urbana, IL 61802, USA
- M. Taher A. Saif: Carl R. Woese Institute for Genomic Biology and Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- Hyun Joon Kong: Carl R. Woese Institute for Genomic Biology and Department of Chemical and Biomolecular Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
- John M. Beggs: Department of Physics, Indiana University Bloomington, Bloomington, IN 47405, USA
- Mattia Gazzola: Carl R. Woese Institute for Genomic Biology and Department of Mechanical Science and Engineering, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA
24
Vieth M, Rahimi A, Gorgan Mohammadi A, Triesch J, Ganjtabesh M. Accelerating spiking neural network simulations with PymoNNto and PymoNNtorch. Front Neuroinform 2024; 18:1331220. PMID: 38444756; PMCID: PMC10913591; DOI: 10.3389/fninf.2024.1331220.
Abstract
Spiking neural network simulations are a central tool in Computational Neuroscience, Artificial Intelligence, and Neuromorphic Engineering research. A broad range of simulators and software frameworks for such simulations exist with different target application areas. Among these, PymoNNto is a recent Python-based toolbox for spiking neural network simulations that emphasizes the embedding of custom code in a modular and flexible way. While PymoNNto already supports GPU implementations, its backend relies on NumPy operations. Here we introduce PymoNNtorch, which is natively implemented with PyTorch while retaining PymoNNto's modular design. Furthermore, we demonstrate how changes to the implementations of common network operations in combination with PymoNNtorch's native GPU support can offer speed-up over conventional simulators like NEST, ANNarchy, and Brian 2 in certain situations. Overall, we show how PymoNNto's modular and flexible design in combination with PymoNNtorch's GPU acceleration and optimized indexing operations facilitate research and development of spiking neural networks in the Python programming language.
Affiliation(s)
- Marius Vieth: Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- Ali Rahimi: Department of Mathematics, Statistics, and Computer Science, College of Science, University of Tehran, Tehran, Iran
- Ashena Gorgan Mohammadi: Department of Mathematics, Statistics, and Computer Science, College of Science, University of Tehran, Tehran, Iran
- Jochen Triesch: Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- Mohammad Ganjtabesh: Department of Mathematics, Statistics, and Computer Science, College of Science, University of Tehran, Tehran, Iran
25
Islam R, Majurski P, Kwon J, Sharma A, Tummala SRSK. Benchmarking Artificial Neural Network Architectures for High-Performance Spiking Neural Networks. Sensors (Basel) 2024; 24:1329. PMID: 38400487; PMCID: PMC10892219; DOI: 10.3390/s24041329.
Abstract
Organizations managing high-performance computing systems face a multitude of challenges, including overarching concerns such as overall energy consumption, microprocessor clock frequency limitations, and the escalating costs associated with chip production. Evidently, processor speeds have plateaued over the last decade, persisting within the range of 2 GHz to 5 GHz. Scholars assert that brain-inspired computing holds substantial promise for mitigating these challenges. The spiking neural network (SNN) particularly stands out for its commendable power efficiency when juxtaposed with conventional design paradigms. Nevertheless, our scrutiny has brought to light several pivotal challenges impeding the seamless implementation of large-scale neural networks (NNs) on silicon. These challenges encompass the absence of automated tools, the need for multifaceted domain expertise, and the inadequacy of existing algorithms to efficiently partition and place extensive SNN computations onto hardware infrastructure. In this paper, we posit the development of an automated tool flow capable of transmuting any NN into an SNN. This undertaking involves the creation of a novel graph-partitioning algorithm designed to strategically place SNNs on a network-on-chip (NoC), thereby paving the way for future energy-efficient and high-performance computing paradigms. The presented methodology showcases its effectiveness by successfully transforming ANN architectures into SNNs with a marginal average error penalty of merely 2.65%. The proposed graph-partitioning algorithm enables a 14.22% decrease in inter-synaptic communication and an 87.58% reduction in intra-synaptic communication, on average, underscoring the effectiveness of the proposed algorithm in optimizing NN communication pathways. Compared to a baseline graph-partitioning algorithm, the proposed approach exhibits an average decrease of 79.74% in latency and a 14.67% reduction in energy consumption. Using existing NoC tools, the energy-latency product of SNN architectures is, on average, 82.71% lower than that of the baseline architectures.
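The placement problem the abstract describes can be illustrated with a minimal greedy partitioner: neurons (graph nodes) are assigned to balanced NoC tiles so that heavily weighted synapses stay on-tile and inter-tile traffic is minimized. This is a hedged sketch of the general idea only, not the authors' algorithm; the toy graph, tile capacity rule, and greedy cost are all illustrative assumptions.

```python
def greedy_partition(edges, n_nodes, n_parts):
    """Greedily assign each node (neuron) to a balanced partition
    (NoC tile), preferring the tile that already holds the most
    synaptic weight connected to the node."""
    adj = [{} for _ in range(n_nodes)]
    for u, v, w in edges:
        adj[u][v] = adj[u].get(v, 0) + w
        adj[v][u] = adj[v].get(u, 0) + w
    cap = -(-n_nodes // n_parts)            # ceil division: tile capacity
    assign, load = [None] * n_nodes, [0] * n_parts
    for node in range(n_nodes):
        open_parts = [p for p in range(n_parts) if load[p] < cap]
        best = max(open_parts,
                   key=lambda p: sum(w for nb, w in adj[node].items()
                                     if assign[nb] == p))
        assign[node] = best
        load[best] += 1
    return assign

def cut_weight(edges, assign):
    """Synaptic weight crossing tile boundaries (inter-tile traffic)."""
    return sum(w for u, v, w in edges if assign[u] != assign[v])

# toy SNN graph: two tightly coupled clusters joined by one weak synapse
edges = [(0, 1, 5), (1, 2, 5), (0, 2, 5),
         (3, 4, 5), (4, 5, 5), (3, 5, 5), (2, 3, 1)]
assign = greedy_partition(edges, n_nodes=6, n_parts=2)
```

On this toy graph the greedy pass keeps each cluster on one tile, so only the single weak bridging synapse contributes to inter-tile communication.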
Affiliation(s)
- Riadul Islam: Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
- Patrick Majurski: Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
- Jun Kwon: Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
- Anurag Sharma: Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
- Sri Ranga Sai Krishna Tummala: Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
26
Vardalakis N, Aussel A, Rougier NP, Wagner FB. A dynamical computational model of theta generation in hippocampal circuits to study theta-gamma oscillations during neurostimulation. eLife 2024; 12:RP87356. PMID: 38354040; PMCID: PMC10942594; DOI: 10.7554/elife.87356.
Abstract
Neurostimulation of the hippocampal formation has shown promising results for modulating memory but the underlying mechanisms remain unclear. In particular, the effects on hippocampal theta-nested gamma oscillations and theta phase reset, which are both crucial for memory processes, are unknown. Moreover, these effects cannot be investigated using current computational models, which consider theta oscillations with a fixed amplitude and phase velocity. Here, we developed a novel computational model that includes the medial septum, represented as a set of abstract Kuramoto oscillators producing a dynamical theta rhythm with phase reset, and the hippocampal formation, composed of biophysically realistic neurons and able to generate theta-nested gamma oscillations under theta drive. We showed that, for theta inputs just below the threshold to induce self-sustained theta-nested gamma oscillations, a single stimulation pulse could switch the network behavior from non-oscillatory to a state producing sustained oscillations. Next, we demonstrated that, for a weaker theta input, pulse train stimulation at the theta frequency could transiently restore seemingly physiological oscillations. Importantly, the presence of phase reset influenced whether these two effects depended on the phase at which stimulation onset was delivered, which has practical implications for designing neurostimulation protocols that are triggered by the phase of ongoing theta oscillations. This novel model opens new avenues for studying the effects of neurostimulation on the hippocampal formation. Furthermore, our hybrid approach that combines different levels of abstraction could be extended in future work to other neural circuits that produce dynamical brain rhythms.
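The abstract's medial septum component, a population of Kuramoto oscillators producing a dynamical theta rhythm with phase reset, can be sketched in a few lines. This is not the published model code; the oscillator count, coupling strength, frequency spread, and reset rule below are illustrative assumptions.

```python
import math
import random

def simulate_kuramoto(n=50, k=5.0, f_theta=8.0, dt=1e-3, steps=1200,
                      reset_at=None, seed=0):
    """Kuramoto oscillators with natural frequencies around an ~8 Hz
    theta rhythm. Returns the order parameter r(t) in [0, 1]; r -> 1
    means phase-locked oscillators. If reset_at is given, all phases
    are reset at that step, mimicking a stimulation pulse."""
    rng = random.Random(seed)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    omega = [2.0 * math.pi * rng.gauss(f_theta, 0.5) for _ in range(n)]
    r_trace = []
    for step in range(steps):
        if step == reset_at:                      # pulse-induced phase reset
            theta = [0.0] * n
        cx = sum(math.cos(t) for t in theta) / n  # mean-field components
        sx = sum(math.sin(t) for t in theta) / n
        r, psi = math.hypot(cx, sx), math.atan2(sx, cx)
        r_trace.append(r)
        # standard mean-field Kuramoto update: dtheta = omega + K*r*sin(psi - theta)
        theta = [t + dt * (w + k * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    return r_trace

# a pulse at step 1000 abruptly synchronizes the population
r = simulate_kuramoto(reset_at=1000)
```

The abrupt jump of the order parameter at the reset step is the mechanism that makes stimulation efficacy phase-dependent in the full model.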
Affiliation(s)
- Nikolaos Vardalakis: University of Bordeaux, CNRS, IMN, Bordeaux, France; University of Bordeaux, INRIA, IMN, Bordeaux, France
- Amélie Aussel: University of Bordeaux, CNRS, IMN, Bordeaux, France; University of Bordeaux, INRIA, IMN, Bordeaux, France; University of Bordeaux, CNRS, Bordeaux INP, Talence, France
- Nicolas P Rougier: University of Bordeaux, CNRS, IMN, Bordeaux, France; University of Bordeaux, INRIA, IMN, Bordeaux, France; University of Bordeaux, CNRS, Bordeaux INP, Talence, France
27
Kusch L, Diaz-Pier S, Klijn W, Sontheimer K, Bernard C, Morrison A, Jirsa V. Multiscale co-simulation design pattern for neuroscience applications. Front Neuroinform 2024; 18:1156683. PMID: 38410682; PMCID: PMC10895016; DOI: 10.3389/fninf.2024.1156683.
Abstract
Integration of information across heterogeneous sources creates added scientific value. Interoperability of data, tools, and models is, however, difficult to accomplish across spatial and temporal scales. Here we introduce the toolbox Parallel Co-Simulation, which enables the interoperation of simulators operating at different scales. We provide a software science co-design pattern and illustrate its functioning with a neuroscience example, in which individual regions of interest are simulated at the cellular level, allowing us to study detailed mechanisms, while the remaining network is efficiently simulated at the population level. A workflow is illustrated for the use case of The Virtual Brain and NEST, in which the CA1 region of the cellular-level hippocampus of the mouse is embedded into a full brain network involving micro and macro electrode recordings. This new tool allows integrating knowledge across scales in the same simulation framework and validating it against multiscale experiments, thereby greatly widening the explanatory power of computational models.
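The co-simulation pattern, two simulators at different scales advanced in lockstep windows and coupled through a translator at synchronization points, can be sketched with toy stand-ins for the two scales. This is a schematic of the design pattern only, not the TVB–NEST interface; both simulator classes and all parameters are invented for illustration.

```python
class PopulationSim:
    """Coarse scale: a single population-rate variable relaxing
    toward its input drive."""
    def __init__(self, tau=10.0):
        self.rate, self.tau = 0.0, tau

    def advance(self, dt, drive):
        self.rate += dt / self.tau * (drive - self.rate)

class CellSim:
    """Fine scale: leaky integrate-and-fire cells; counts spikes."""
    def __init__(self, n=100, v_th=1.0, tau=20.0):
        self.v = [i / n for i in range(n)]   # staggered initial voltages
        self.v_th, self.tau = v_th, tau
        self.spikes = 0

    def advance(self, dt, current):
        for i, v in enumerate(self.v):
            v += dt / self.tau * (current - v)
            if v >= self.v_th:               # spike and reset
                v = 0.0
                self.spikes += 1
            self.v[i] = v

def cosimulate(t_sync=5.0, dt=0.1, n_sync=20):
    """Advance both simulators in lockstep windows of length t_sync;
    at each synchronization point a 'translator' converts spikes to a
    population rate and feeds the population state back as a current."""
    pop, cells = PopulationSim(), CellSim()
    drive_to_pop, current_to_cells = 0.0, 2.0
    for _ in range(n_sync):
        cells.spikes = 0
        for _ in range(int(t_sync / dt)):
            pop.advance(dt, drive_to_pop)
            cells.advance(dt, current_to_cells)
        drive_to_pop = cells.spikes / (len(cells.v) * t_sync)  # mean rate
        current_to_cells = 2.0 + pop.rate                      # feedback
    return pop.rate, cells.spikes

rate, last_window_spikes = cosimulate()
```

The key structural point is that data crosses scales only at the synchronization boundaries, which is what lets each simulator run with its own internal timestep.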
Affiliation(s)
- Lionel Kusch: Institut de Neurosciences des Systèmes (INS), UMR1106, Aix-Marseille Université, Marseilles, France
- Sandra Diaz-Pier: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
- Wouter Klijn: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
- Kim Sontheimer: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
- Christophe Bernard: Institut de Neurosciences des Systèmes (INS), UMR1106, Aix-Marseille Université, Marseilles, France
- Abigail Morrison: Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany; Forschungszentrum Jülich GmbH, IAS-6/INM-6, JARA, Jülich, Germany; Computer Science 3 - Software Engineering, RWTH Aachen University, Aachen, Germany
- Viktor Jirsa: Institut de Neurosciences des Systèmes (INS), UMR1106, Aix-Marseille Université, Marseilles, France
28
George TM, Rastogi M, de Cothi W, Clopath C, Stachenfeld K, Barry C. RatInABox, a toolkit for modelling locomotion and neuronal activity in continuous environments. eLife 2024; 13:e85274. PMID: 38334473; PMCID: PMC10857787; DOI: 10.7554/elife.85274.
Abstract
Generating synthetic locomotory and neural data is a useful yet cumbersome step commonly required to study theoretical models of the brain's role in spatial navigation. This process can be time consuming and, without a common framework, makes it difficult to reproduce or compare studies which each generate test data in different ways. In response, we present RatInABox, an open-source Python toolkit designed to model realistic rodent locomotion and generate synthetic neural data from spatially modulated cell types. This software provides users with (i) the ability to construct one- or two-dimensional environments with configurable barriers and visual cues, (ii) a physically realistic random motion model fitted to experimental data, (iii) rapid online calculation of neural data for many of the known self-location or velocity selective cell types in the hippocampal formation (including place cells, grid cells, boundary vector cells, head direction cells) and (iv) a framework for constructing custom cell types, multi-layer network models and data- or policy-controlled motion trajectories. The motion and neural models are spatially and temporally continuous as well as topographically sensitive to boundary conditions and walls. We demonstrate that out-of-the-box parameter settings replicate many aspects of rodent foraging behaviour such as velocity statistics and the tendency of rodents to over-explore walls. Numerous tutorial scripts are provided, including examples where RatInABox is used for decoding position from neural data or to solve a navigational reinforcement learning task. We hope this tool will significantly streamline computational research into the brain's role in navigation.
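The "physically realistic random motion model" of the abstract is, at its core, smooth velocity noise with reflective walls. A minimal stand-alone sketch of that idea, Ornstein–Uhlenbeck velocity components driving a 2-D agent in a unit box, is given below; it does not use the RatInABox package or its fitted parameter values, and the relaxation time and speed scale are illustrative assumptions.

```python
import math
import random

def ou_trajectory(steps=10000, dt=0.01, tau_v=0.7, sigma_v=0.12, seed=1):
    """2-D random foraging motion: each velocity component follows an
    Ornstein-Uhlenbeck process (zero mean, relaxation time tau_v,
    stationary standard deviation sigma_v). Walls of the unit box
    reflect the agent. Returns the list of (x, y) positions."""
    rng = random.Random(seed)
    x, y, vx, vy = 0.5, 0.5, 0.0, 0.0
    a = math.exp(-dt / tau_v)              # exact OU decay per step
    b = sigma_v * math.sqrt(1.0 - a * a)   # matching noise amplitude
    path = []
    for _ in range(steps):
        vx = a * vx + b * rng.gauss(0.0, 1.0)
        vy = a * vy + b * rng.gauss(0.0, 1.0)
        x, y = x + dt * vx, y + dt * vy
        if not 0.0 <= x <= 1.0:            # reflect off vertical walls
            vx, x = -vx, min(max(x, 0.0), 1.0)
        if not 0.0 <= y <= 1.0:            # reflect off horizontal walls
            vy, y = -vy, min(max(y, 0.0), 1.0)
        path.append((x, y))
    return path

path = ou_trajectory()
```

Because the velocity (not the position) is the noisy variable, the resulting trajectory is temporally continuous and smooth, matching the continuity property the abstract emphasizes.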
Affiliation(s)
- Tom M George: Sainsbury Wellcome Centre, University College London, London, United Kingdom
- Mehul Rastogi: Sainsbury Wellcome Centre, University College London, London, United Kingdom
- William de Cothi: Department of Cell and Developmental Biology, University College London, London, United Kingdom
- Claudia Clopath: Sainsbury Wellcome Centre, University College London, London, United Kingdom; Department of Bioengineering, Imperial College London, London, United Kingdom
- Caswell Barry: Department of Cell and Developmental Biology, University College London, London, United Kingdom
29
Barta T, Kostal L. Shared input and recurrency in neural networks for metabolically efficient information transmission. PLoS Comput Biol 2024; 20:e1011896. PMID: 38394341; PMCID: PMC10917264; DOI: 10.1371/journal.pcbi.1011896.
Abstract
Shared input to a population of neurons induces noise correlations, which can decrease the information carried by a population activity. Inhibitory feedback in recurrent neural networks can reduce the noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly. This inhibitory feedback decreases the gain of the population. Thus, depolarization of its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic costs, it is unclear whether the increased information transmission reliability provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information with metabolic cost constraints, we show that there is an optimal strength of recurrent connections in the network, which maximizes the value of mutual information-per-cost. For higher values of input correlation, the mutual information-per-cost is higher for recurrent networks with inhibitory feedback compared to feedforward networks without any inhibitory neurons. Our results, therefore, show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments and that decorrelation of the input by inhibitory feedback compensates for the associated increased metabolic costs.
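The starting point of the abstract, shared input inducing noise correlations, reduces to a simple construction: each neuron's input is a weighted sum of a shared and a private Gaussian term, giving a pairwise input correlation equal to the shared-variance fraction. The sketch below demonstrates only this premise, not the paper's LIF network or its mutual-information optimization; the correlation value and sample count are illustrative.

```python
import random

def shared_input_correlation(c=0.6, n_samples=100000, seed=2):
    """Two neurons receive I_k = sqrt(c)*xi_shared + sqrt(1-c)*xi_k,
    so the input correlation coefficient is c by construction; the
    sample estimate should converge to it."""
    rng = random.Random(seed)
    a, b = c ** 0.5, (1.0 - c) ** 0.5
    s1 = s2 = s12 = sq1 = sq2 = 0.0
    for _ in range(n_samples):
        shared = rng.gauss(0.0, 1.0)
        i1 = a * shared + b * rng.gauss(0.0, 1.0)
        i2 = a * shared + b * rng.gauss(0.0, 1.0)
        s1 += i1; s2 += i2; s12 += i1 * i2
        sq1 += i1 * i1; sq2 += i2 * i2
    n = float(n_samples)
    cov = s12 / n - (s1 / n) * (s2 / n)
    var1 = sq1 / n - (s1 / n) ** 2
    var2 = sq2 / n - (s2 / n) ** 2
    return cov / (var1 * var2) ** 0.5

rho = shared_input_correlation()
```

In the paper's setting, the role of recurrent inhibitory feedback is to reduce this correlation in the output spike trains, at a metabolic cost the analysis then trades off against the information gain.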
Affiliation(s)
- Tomas Barta: Laboratory of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic; Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology, Onna-son, Okinawa, Japan
- Lubomir Kostal: Laboratory of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic
30
Becker MP, Idiart MAP. Mean-field method for generic conductance-based integrate-and-fire neurons with finite timescales. Phys Rev E 2024; 109:024406. PMID: 38491595; DOI: 10.1103/physreve.109.024406.
Abstract
The construction of transfer functions in theoretical neuroscience plays an important role in determining the spiking rate behavior of neurons in networks. These functions can be obtained through various fitting methods, but the biological relevance of the parameters is not always clear. However, for stationary inputs, such functions can be obtained without the adjustment of free parameters by using mean-field methods. In this work, we expand current Fokker-Planck approaches to account for the concurrent influence of colored and multiplicative noise terms on generic conductance-based integrate-and-fire neurons. We reduce the resulting stochastic system through the application of the diffusion approximation to a one-dimensional Langevin equation. An effective Fokker-Planck equation is then constructed using Fox theory, which is solved numerically using a newly developed double integration procedure to obtain the transfer function and the membrane potential distribution. The solution is capable of reproducing the transfer function and the stationary voltage distribution of simulated neurons across a wide range of parameters. The method can also be easily extended to account for different sources of noise with various multiplicative terms, and, in principle, it can be applied to other classes of problems.
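For context on what a "transfer function from mean-field theory" means, the classical diffusion-approximation result for a plain current-based LIF neuron (the Siegert/Ricciardi formula) can be evaluated numerically in a few lines. This is the textbook baseline only, not the paper's effective Fokker-Planck treatment of colored and multiplicative noise; the membrane parameters below are illustrative.

```python
import math

def lif_transfer(mu, sigma, tau_m=0.02, v_reset=0.0, v_th=0.02,
                 tau_ref=0.002, n_grid=2000):
    """Stationary firing rate (Hz) of a current-based LIF neuron with
    mean input mu and noise amplitude sigma (volts), from the
    Siegert/Ricciardi diffusion-approximation integral:
        1/nu = tau_ref + tau_m*sqrt(pi) * Int exp(u^2)*(1+erf(u)) du,
    integrated from (v_reset-mu)/sigma to (v_th-mu)/sigma."""
    lo = (v_reset - mu) / sigma
    hi = (v_th - mu) / sigma
    du = (hi - lo) / n_grid
    total = 0.0
    for i in range(n_grid + 1):               # trapezoidal quadrature
        u = lo + i * du
        weight = 0.5 if i in (0, n_grid) else 1.0
        total += weight * math.exp(u * u) * (1.0 + math.erf(u))
    return 1.0 / (tau_ref + tau_m * math.sqrt(math.pi) * total * du)
```

The rate grows with the mean drive and, in the subthreshold regime, with the noise amplitude; the refractory period bounds it from above.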
Affiliation(s)
- Marcelo P Becker: Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Marco A P Idiart: Department of Physics, Institute of Physics, Federal University of Rio Grande do Sul, Porto Alegre, Brazil
31
Pochinok I, Stöber TM, Triesch J, Chini M, Hanganu-Opatz IL. A developmental increase of inhibition promotes the emergence of hippocampal ripples. Nat Commun 2024; 15:738. PMID: 38272901; PMCID: PMC10810866; DOI: 10.1038/s41467-024-44983-z.
Abstract
Sharp wave-ripples (SPW-Rs) are a hippocampal network phenomenon critical for memory consolidation and planning. SPW-Rs have been extensively studied in the adult brain, yet their developmental trajectory is poorly understood. While SPWs have been recorded in rodents shortly after birth, the time point and mechanisms of ripple emergence are still unclear. Here, we combine in vivo electrophysiology with optogenetics and chemogenetics in 4 to 12-day-old mice to address this knowledge gap. We show that ripples are robustly detected and induced by light stimulation of channelrhodopsin-2-transfected CA1 pyramidal neurons only from postnatal day 10 onwards. Leveraging a spiking neural network model, we mechanistically link the maturation of inhibition and ripple emergence. We corroborate these findings by reducing the ripple rate upon chemogenetic silencing of CA1 interneurons. Finally, we show that early SPW-Rs elicit a more robust prefrontal cortex response than SPWs lacking ripples. Thus, the development of inhibition promotes the emergence of ripples.
Affiliation(s)
- Irina Pochinok: Institute of Developmental Neurophysiology, Center for Molecular Neurobiology (ZMNH), Hamburg Center of Neuroscience (HCNS), University Medical Center Hamburg-Eppendorf, 20251 Hamburg, Germany
- Tristan M Stöber: Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany
- Jochen Triesch: Frankfurt Institute for Advanced Studies, 60438 Frankfurt am Main, Germany
- Mattia Chini: Institute of Developmental Neurophysiology, Center for Molecular Neurobiology (ZMNH), Hamburg Center of Neuroscience (HCNS), University Medical Center Hamburg-Eppendorf, 20251 Hamburg, Germany
- Ileana L Hanganu-Opatz: Institute of Developmental Neurophysiology, Center for Molecular Neurobiology (ZMNH), Hamburg Center of Neuroscience (HCNS), University Medical Center Hamburg-Eppendorf, 20251 Hamburg, Germany
32
Jürgensen AM, Sakagiannis P, Schleyer M, Gerber B, Nawrot MP. Prediction error drives associative learning and conditioned behavior in a spiking model of Drosophila larva. iScience 2024; 27:108640. PMID: 38292165; PMCID: PMC10824792; DOI: 10.1016/j.isci.2023.108640.
Abstract
Predicting reinforcement from sensory cues is beneficial for goal-directed behavior. In insect brains, underlying associations between cues and reinforcement, encoded by dopaminergic neurons, are formed in the mushroom body. We propose a spiking model of the Drosophila larva mushroom body. It includes a feedback motif conveying learned reinforcement expectation to dopaminergic neurons, which can compute prediction error as the difference between expected and present reinforcement. We demonstrate that this can serve as a driving force in learning. When combined with synaptic homeostasis, our model accounts for theoretically derived features of acquisition and loss of associations that depend on the intensity of the reinforcement and its temporal proximity to the cue. From modeling olfactory learning over the time course of behavioral experiments and simulating the locomotion of individual larvae toward or away from odor sources in a virtual environment, we conclude that learning driven by prediction errors can explain larval behavior.
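The driving force the abstract describes, dopaminergic neurons computing the difference between present and expected reinforcement, reduces in its simplest rate-based form to a delta rule. The sketch below shows that reduced form only, not the spiking mushroom-body model; the learning rate and trial count are illustrative assumptions.

```python
def learn_expectation(reinforcement=1.0, lr=0.2, n_trials=30):
    """Weight w stores the learned reinforcement expectation for a cue.
    Each cue-reinforcement pairing updates w in proportion to the
    prediction error (reinforcement - w), so learning is strong while
    the outcome is surprising and fades as the expectation matches it."""
    w, errors = 0.0, []
    for _ in range(n_trials):
        delta = reinforcement - w      # prediction error (dopaminergic signal)
        w += lr * delta
        errors.append(delta)
    return w, errors

w, errors = learn_expectation()
```

The same negative-feedback structure is what the model's recurrent motif implements with spikes: as the expectation pathway grows, the dopaminergic teaching signal, and with it further plasticity, shrinks.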
Affiliation(s)
- Anna-Maria Jürgensen: Computational Systems Neuroscience, Institute of Zoology, University of Cologne, 50674 Cologne, Germany
- Panagiotis Sakagiannis: Computational Systems Neuroscience, Institute of Zoology, University of Cologne, 50674 Cologne, Germany
- Michael Schleyer: Leibniz Institute for Neurobiology (LIN), Department of Genetics, 39118 Magdeburg, Germany; Institute for the Advancement of Higher Education, Faculty of Science, Hokkaido University, Sapporo 060-08080, Japan
- Bertram Gerber: Leibniz Institute for Neurobiology (LIN), Department of Genetics, 39118 Magdeburg, Germany; Institute for Biology, Otto-von-Guericke University, 39120 Magdeburg, Germany; Center for Brain and Behavioral Sciences (CBBS), Otto-von-Guericke University, 39118 Magdeburg, Germany
- Martin Paul Nawrot: Computational Systems Neuroscience, Institute of Zoology, University of Cologne, 50674 Cologne, Germany
33
Gemo E, Spiga S, Brivio S. SHIP: a computational framework for simulating and validating novel technologies in hardware spiking neural networks. Front Neurosci 2024; 17:1270090. PMID: 38264497; PMCID: PMC10804805; DOI: 10.3389/fnins.2023.1270090.
Abstract
Investigations in the field of spiking neural networks (SNNs) encompass diverse, yet overlapping, scientific disciplines. Examples range from purely neuroscientific investigations, through research on computational aspects of neuroscience, to application-oriented studies aiming to improve SNN performance or to develop artificial hardware counterparts. However, the simulation of SNNs is a complex task that cannot be adequately addressed with a single platform applicable to all scenarios. The optimization of a simulation environment to meet specific metrics often entails compromises in other aspects. This computational challenge has led to an apparent dichotomy of approaches, with model-driven algorithms dedicated to the detailed simulation of biological networks, and data-driven algorithms designed for efficient processing of large input datasets. Nevertheless, material scientists, device physicists, and neuromorphic engineers who develop new technologies for spiking neuromorphic hardware would benefit from a simulation environment that borrows aspects from both approaches, thus facilitating modeling, analysis, and training of prospective SNN systems. This manuscript explores the numerical challenges deriving from the simulation of spiking neural networks, and introduces SHIP, Spiking (neural network) Hardware In PyTorch, a numerical tool that supports the investigation and/or validation of materials, devices, and small circuit blocks within SNN architectures. SHIP facilitates the algorithmic definition of models for the components of a network, the monitoring of states and outputs of the modeled systems, and the training of the network's synaptic weights, by way of user-defined unsupervised learning rules or supervised training techniques derived from conventional machine learning. SHIP thereby offers researchers and developers in the field of hardware-based spiking neural networks an efficient means of simulating and validating novel technologies.
Affiliation(s)
- Emanuele Gemo: CNR–IMM, Unit of Agrate Brianza, Agrate Brianza, Italy
34
Wang C, Zhang T, Chen X, He S, Li S, Wu S. BrainPy, a flexible, integrative, efficient, and extensible framework for general-purpose brain dynamics programming. eLife 2023; 12:e86365. PMID: 38132087; PMCID: PMC10796146; DOI: 10.7554/elife.86365.
Abstract
Elucidating the intricate neural mechanisms underlying brain functions requires integrative brain dynamics modeling. To facilitate this process, it is crucial to develop a general-purpose programming framework that allows users to freely define neural models across multiple scales, efficiently simulate, train, and analyze model dynamics, and conveniently incorporate new modeling approaches. In response to this need, we present BrainPy. BrainPy leverages the advanced just-in-time (JIT) compilation capabilities of JAX and XLA to provide a powerful infrastructure tailored for brain dynamics programming. It offers an integrated platform for building, simulating, training, and analyzing brain dynamics models. Models defined in BrainPy can be JIT compiled into binary instructions for various devices, including central processing units (CPUs), graphics processing units (GPUs), and tensor processing units (TPUs), which ensures running performance comparable to native C or CUDA. Additionally, BrainPy features an extensible architecture that allows for easy expansion of new infrastructure, utilities, and machine-learning approaches. This flexibility enables researchers to incorporate cutting-edge techniques and adapt the framework to their specific needs.
Affiliation(s)
- Chaoming Wang
- School of Psychological and Cognitive Sciences, IDG/McGovern Institute for Brain Research, Peking-Tsinghua Center for Life Sciences, Center of Quantitative Biology, Academy for Advanced Interdisciplinary Studies, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Guangdong Institute of Intelligence Science and Technology, Guangdong, China
| | - Tianqiu Zhang
- School of Psychological and Cognitive Sciences, IDG/McGovern Institute for Brain Research, Peking-Tsinghua Center for Life Sciences, Center of Quantitative Biology, Academy for Advanced Interdisciplinary Studies, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
| | - Xiaoyu Chen
- School of Psychological and Cognitive Sciences, IDG/McGovern Institute for Brain Research, Peking-Tsinghua Center for Life Sciences, Center of Quantitative Biology, Academy for Advanced Interdisciplinary Studies, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
| | - Sichao He
- Beijing Jiaotong University, Beijing, China
| | - Shangyang Li
- School of Psychological and Cognitive Sciences, IDG/McGovern Institute for Brain Research, Peking-Tsinghua Center for Life Sciences, Center of Quantitative Biology, Academy for Advanced Interdisciplinary Studies, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
| | - Si Wu
- School of Psychological and Cognitive Sciences, IDG/McGovern Institute for Brain Research, Peking-Tsinghua Center for Life Sciences, Center of Quantitative Biology, Academy for Advanced Interdisciplinary Studies, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
- Guangdong Institute of Intelligence Science and Technology, Guangdong, China
| |
|
35
|
O'Rawe JF, Zhou Z, Li AJ, LaFosse PK, Goldbach HC, Histed MH. Excitation creates a distributed pattern of cortical suppression due to varied recurrent input. Neuron 2023; 111:4086-4101.e5. [PMID: 37865083 PMCID: PMC10872553 DOI: 10.1016/j.neuron.2023.09.010] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2022] [Revised: 05/14/2023] [Accepted: 09/08/2023] [Indexed: 10/23/2023]
Abstract
Dense local, recurrent connections are a major feature of cortical circuits, yet how they affect neurons' responses has been unclear, with some studies reporting weak recurrent effects, some reporting amplification, and others indicating local suppression. Here, we show that optogenetic input to mouse V1 excitatory neurons generates salt-and-pepper patterns of both excitation and suppression. Responses in individual neurons are not strongly predicted by that neuron's direct input. A balanced-state network model reconciles a set of diverse observations: the observed dynamics, suppressed responses, decoupling of input and output, and long tail of excited responses. The model shows recurrent excitatory-excitatory connections are strong and also variable across neurons. Together, these results demonstrate that excitatory recurrent connections can have major effects on cortical computations by shaping and changing neurons' responses to input.
Affiliation(s)
- Jonathan F O'Rawe
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD, USA
| | - Zhishang Zhou
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD, USA
| | - Anna J Li
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD, USA
| | - Paul K LaFosse
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD, USA; NIH-University of Maryland Graduate Partnerships Program, Bethesda, MD, USA; Neuroscience and Cognitive Science Program, University of Maryland, College Park, MD, USA
| | - Hannah C Goldbach
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD, USA
| | - Mark H Histed
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD, USA.
| |
|
36
|
O'Neill KM, Anderson ED, Mukherjee S, Gandu S, McEwan SA, Omelchenko A, Rodriguez AR, Losert W, Meaney DF, Babadi B, Firestein BL. Time-dependent homeostatic mechanisms underlie brain-derived neurotrophic factor action on neural circuitry. Commun Biol 2023; 6:1278. [PMID: 38110605 PMCID: PMC10728104 DOI: 10.1038/s42003-023-05638-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2023] [Accepted: 11/27/2023] [Indexed: 12/20/2023] Open
Abstract
Plasticity and homeostatic mechanisms allow neural networks to maintain proper function while responding to physiological challenges. Despite previous work investigating morphological and synaptic effects of brain-derived neurotrophic factor (BDNF), the most prevalent growth factor in the central nervous system, how exposure to BDNF manifests at the network level remains unknown. Here we report that BDNF treatment affects rodent hippocampal network dynamics during development and recovery from glutamate-induced excitotoxicity in culture. Importantly, these effects are not obvious when traditional activity metrics are used, so we delve more deeply into network organization, functional analyses, and in silico simulations. We demonstrate that BDNF partially restores homeostasis by promoting recovery of weak and medium connections after injury. Imaging and computational analyses suggest these effects are caused by changes to inhibitory neurons and connections. From our in silico simulations, we find that BDNF remodels the network by indirectly strengthening weak excitatory synapses after injury. Ultimately, our findings may explain the difficulties encountered in preclinical and clinical trials with BDNF and also offer information for future trials to consider.
Affiliation(s)
- Kate M O'Neill
- Department of Cell Biology and Neuroscience, Rutgers University, Piscataway, NJ, USA
- Biomedical Engineering Graduate Program, Rutgers University, Piscataway, NJ, USA
- Institute for Physical Science & Technology, University of Maryland, College Park, MD, USA
| | - Erin D Anderson
- Department of Bioengineering, University of Pennsylvania, Philadelphia, PA, USA
| | - Shoutik Mukherjee
- Department of Electrical and Computer Engineering, University of Maryland, College Park, MD, USA
| | - Srinivasa Gandu
- Department of Cell Biology and Neuroscience, Rutgers University, Piscataway, NJ, USA
- Cell and Developmental Biology Graduate Program, Rutgers University, Piscataway, NJ, USA
| | - Sara A McEwan
- Department of Cell Biology and Neuroscience, Rutgers University, Piscataway, NJ, USA
- Neuroscience Graduate Program, Rutgers University, Piscataway, NJ, USA
| | - Anton Omelchenko
- Department of Cell Biology and Neuroscience, Rutgers University, Piscataway, NJ, USA
- Neuroscience Graduate Program, Rutgers University, Piscataway, NJ, USA
| | - Ana R Rodriguez
- Department of Cell Biology and Neuroscience, Rutgers University, Piscataway, NJ, USA
- Biomedical Engineering Graduate Program, Rutgers University, Piscataway, NJ, USA
| | - Wolfgang Losert
- Department of Physics, University of Maryland, College Park, MD, USA
- Institute for Physical Science & Technology, University of Maryland, College Park, MD, USA
| | - David F Meaney
- Department of Bioengineering, University of Pennsylvania, Philadelphia, PA, USA
- Department of Neurosurgery, University of Pennsylvania, Philadelphia, PA, USA
| | - Behtash Babadi
- Department of Electrical and Computer Engineering, University of Maryland, College Park, MD, USA
| | - Bonnie L Firestein
- Department of Cell Biology and Neuroscience, Rutgers University, Piscataway, NJ, USA.
| |
|
37
|
Burman RJ, Brodersen PJN, Raimondo JV, Sen A, Akerman CJ. Active cortical networks promote shunting fast synaptic inhibition in vivo. Neuron 2023; 111:3531-3540.e6. [PMID: 37659408 DOI: 10.1016/j.neuron.2023.08.005] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2023] [Revised: 07/03/2023] [Accepted: 08/04/2023] [Indexed: 09/04/2023]
Abstract
Fast synaptic inhibition determines neuronal response properties in the mammalian brain and is mediated by chloride-permeable ionotropic GABA-A receptors (GABAARs). Despite their fundamental role, it is still not known how GABAARs signal in the intact brain. Here, we use in vivo gramicidin recordings to investigate synaptic GABAAR signaling in mouse cortical pyramidal neurons under conditions that preserve native transmembrane chloride gradients. In anesthetized cortex, synaptic GABAARs exert classic hyperpolarizing effects. In contrast, GABAAR-mediated synaptic signaling in awake cortex is found to be predominantly shunting. This is due to more depolarized GABAAR equilibrium potentials (EGABAAR), which are shown to result from the high levels of synaptic activity that characterize awake cortical networks. Synaptic EGABAAR observed in awake cortex facilitates the desynchronizing effects of inhibitory inputs upon local networks, which increases the flexibility of spiking responses to external inputs. Our findings therefore suggest that GABAAR signaling adapts to optimize cortical functions.
Affiliation(s)
- Richard J Burman
- Department of Pharmacology, University of Oxford, Oxford, OX1 3QT, UK; Oxford Epilepsy Research Group, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, OX3 9DU, UK
| | | | - Joseph V Raimondo
- Division of Cell Biology, Department of Human Biology, Neuroscience Institute and Institute of Infectious Diseases and Molecular Medicine, University of Cape Town, Cape Town, 7935, South Africa
| | - Arjune Sen
- Oxford Epilepsy Research Group, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, OX3 9DU, UK
| | - Colin J Akerman
- Department of Pharmacology, University of Oxford, Oxford, OX1 3QT, UK.
| |
|
38
|
Susin E, Destexhe A. A Network Model of the Modulation of γ Oscillations by NMDA Receptors in Cerebral Cortex. eNeuro 2023; 10:ENEURO.0157-23.2023. [PMID: 37940562 PMCID: PMC10668239 DOI: 10.1523/eneuro.0157-23.2023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2023] [Revised: 08/31/2023] [Accepted: 09/05/2023] [Indexed: 11/10/2023] Open
Abstract
Psychotic drugs such as ketamine induce symptoms close to schizophrenia and stimulate the production of γ oscillations, as also seen in patients, but the underlying mechanisms are still unclear. Here, we have used computational models of cortical networks generating γ oscillations, and have integrated the action of drugs such as ketamine to partially block NMDA receptors (NMDARs). The model can reproduce the paradoxical increase of γ oscillations by NMDA receptor antagonists, assuming that antagonists affect NMDA receptors with higher affinity on inhibitory interneurons. We next used the model to compare the responsiveness of the network to external stimuli, and found that when NMDA channels are blocked, an increase of γ power is observed together with an increase of network responsiveness. However, this responsiveness increase applies not only to γ states, but also to asynchronous states with no apparent γ. We conclude that NMDA antagonists induce an increased excitability state, which may or may not produce γ oscillations, but the response to external inputs is exacerbated, which may explain phenomena such as altered perception or hallucinations.
Affiliation(s)
- Eduarda Susin
- Institute of Neuroscience (NeuroPSI), Paris-Saclay University, Centre National de la Recherche Scientifique (CNRS), 91400 Saclay, France
| | - Alain Destexhe
- Institute of Neuroscience (NeuroPSI), Paris-Saclay University, Centre National de la Recherche Scientifique (CNRS), 91400 Saclay, France
| |
|
39
|
Liu C, Todorova R, Tang W, Oliva A, Fernandez-Ruiz A. Associative and predictive hippocampal codes support memory-guided behaviors. Science 2023; 382:eadi8237. [PMID: 37856604 PMCID: PMC10894649 DOI: 10.1126/science.adi8237] [Citation(s) in RCA: 6] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2023] [Accepted: 08/21/2023] [Indexed: 10/21/2023]
Abstract
Episodic memory involves learning and recalling associations between items and their spatiotemporal context. Those memories can be further used to generate internal models of the world that enable predictions to be made. The mechanisms that support these associative and predictive aspects of memory are not yet understood. In this study, we used an optogenetic manipulation to perturb the sequential structure, but not global network dynamics, of place cells as rats traversed specific spatial trajectories. This perturbation abolished replay of those trajectories and the development of predictive representations, leading to impaired learning of new optimal trajectories during memory-guided navigation. However, place cell assembly reactivation and reward-context associative learning were unaffected. Our results show a mechanistic dissociation between two complementary hippocampal codes: an associative code (through coactivity) and a predictive code (through sequences).
Affiliation(s)
| | | | - Wenbo Tang
- Department of Neurobiology and Behavior, Cornell University, Ithaca, NY, USA
| | - Azahara Oliva
- Department of Neurobiology and Behavior, Cornell University, Ithaca, NY, USA
| | | |
|
40
|
Fang W, Chen Y, Ding J, Yu Z, Masquelier T, Chen D, Huang L, Zhou H, Li G, Tian Y. SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence. SCIENCE ADVANCES 2023; 9:eadi1480. [PMID: 37801497 PMCID: PMC10558124 DOI: 10.1126/sciadv.adi1480] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/06/2023] [Accepted: 09/05/2023] [Indexed: 10/08/2023]
Abstract
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency by introducing neural dynamics and spike properties. As the emerging spiking deep learning paradigm attracts increasing interest, traditional programming frameworks cannot meet the demands of the automatic differentiation, parallel computation acceleration, and high integration of processing neuromorphic datasets and deployment. In this work, we present the SpikingJelly framework to address the aforementioned dilemma. We contribute a full-stack toolkit for preprocessing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips. Compared to existing methods, the training of deep SNNs can be accelerated 11×, and the superior extensibility and flexibility of SpikingJelly enable users to accelerate custom models at low costs through multilevel inheritance and semiautomatic code generation. SpikingJelly paves the way for synthesizing truly energy-efficient SNN-based machine intelligence systems, which will enrich the ecology of neuromorphic computing.
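The deep-SNN training mentioned above rests on the surrogate-gradient trick. A minimal sketch of the idea follows; it is not SpikingJelly's actual API, and `alpha` and the sigmoid surrogate are illustrative choices:

```python
import math

# Sketch of the surrogate-gradient idea used by spiking deep learning
# frameworks such as SpikingJelly (this is NOT SpikingJelly's API): the
# forward pass keeps the nondifferentiable Heaviside spike, while the
# backward pass substitutes a smooth sigmoid derivative. `alpha` is an
# illustrative sharpness parameter.

def spike_forward(v, v_th=1.0):
    """Heaviside spike: 1 if the membrane potential reaches threshold."""
    return 1.0 if v >= v_th else 0.0

def surrogate_grad(v, v_th=1.0, alpha=4.0):
    """Sigmoid-derivative surrogate used in place of the step's zero/undefined gradient."""
    s = 1.0 / (1.0 + math.exp(-alpha * (v - v_th)))
    return alpha * s * (1.0 - s)

# Near threshold the surrogate passes a usable gradient; far from
# threshold it vanishes, mimicking the true step's flat regions.
near = surrogate_grad(0.95)
far = surrogate_grad(-2.0)
```

This forward/backward mismatch is what lets automatic differentiation propagate errors through spike events, enabling the accelerated deep-SNN training the abstract reports.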
Affiliation(s)
- Wei Fang
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
- School of Electronic and Computer Engineering, Shenzhen Graduate School, Peking University, China
| | - Yanqi Chen
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
| | - Jianhao Ding
- School of Computer Science, Peking University, China
| | - Zhaofei Yu
- Institute for Artificial Intelligence, Peking University, China
| | - Timothée Masquelier
- Centre de Recherche Cerveau et Cognition (CERCO), UMR5549 CNRS–Université Toulouse 3, France
| | - Ding Chen
- Peng Cheng Laboratory, China
- Department of Computer Science and Engineering, Shanghai Jiao Tong University, China
| | - Liwei Huang
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
| | | | - Guoqi Li
- Institute of Automation, Chinese Academy of Sciences, China
- School of Artificial Intelligence, University of Chinese Academy of Sciences, China
| | - Yonghong Tian
- School of Computer Science, Peking University, China
- Peng Cheng Laboratory, China
- School of Electronic and Computer Engineering, Shenzhen Graduate School, Peking University, China
| |
|
41
|
Baravalle R, Canavier CC. Synchrony in Networks of Type 2 Interneurons is More Robust to Noise with Hyperpolarizing Inhibition Compared to Shunting Inhibition in Both the Stochastic Population Oscillator and the Coupled Oscillator Regimes. BIORXIV : THE PREPRINT SERVER FOR BIOLOGY 2023:2023.09.29.560219. [PMID: 37873166 PMCID: PMC10592850 DOI: 10.1101/2023.09.29.560219] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/25/2023]
Abstract
Synchronization in the gamma band (30-80 Hz) is mediated by PV+ inhibitory interneurons, and evidence is accumulating for the essential role of gamma oscillations in cognition. Oscillations can arise in inhibitory networks via synaptic interactions between individual oscillatory neurons (mean-driven) or via strong recurrent inhibition that destabilizes the stationary background firing rate in the fluctuation-driven balanced state, causing an oscillation in the population firing rate. Previous theoretical work focused on model neurons with Hodgkin's type 1 excitability (integrators) connected by current-based synapses. Here we show that networks comprised of simple type 2 oscillators (resonators) exhibit a supercritical Hopf bifurcation between synchrony and asynchrony and a gradual transition via cycle skipping from coupled oscillators to stochastic population oscillator, as previously shown for type 1. We extended our analysis to homogeneous networks with conductance rather than current based synapses and found that networks with hyperpolarizing inhibitory synapses were more robust to noise than those with shunting synapses, both in the coupled oscillator and stochastic population oscillator regime. Assuming that reversal potentials are uniformly distributed between shunting and hyperpolarized values, as observed in one experimental study, converting synapses to purely hyperpolarizing favored synchrony in all cases, whereas conversion to purely shunting synapses made synchrony less robust except at very high conductance strengths. In mature neurons the synaptic reversal potential is controlled by chloride cotransporters that control the intracellular concentrations of chloride and bicarbonate ions, suggesting these transporters as a potential therapeutic target to enhance gamma synchrony and cognition. 
Significance Statement
Brain rhythms in the gamma frequency band (30-80 Hz) depend on the activity of inhibitory interneurons, and evidence for a causal role for gamma oscillations in cognitive functions is accumulating. Here we extend previous studies on synchronization mechanisms to interneurons that have an abrupt threshold frequency below which they cannot sustain firing. In addition to current-based synapses, we examined inhibitory networks with conductance-based synapses. We found that if the reversal potential for inhibition was below the average membrane potential (hyperpolarizing), synchrony was more robust to noise than if the reversal potential was very close to the average potential (shunting). These results have implications for therapies to ameliorate cognitive deficits.
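The hyperpolarizing-versus-shunting distinction can be made concrete with a toy conductance-based membrane; the parameter values below are illustrative and not taken from the study:

```python
# Toy conductance-based membrane contrasting hyperpolarizing vs. shunting
# inhibition: the synaptic current depends on the driving force
# (E_syn - V), so the reversal potential determines its effect.
# Parameter values are illustrative, not the study's.

def peak_deflection(e_syn, g_syn=0.5, g_leak=1.0, v_rest=-65.0, dt=0.01, steps=2000):
    """Peak |V - V_rest| produced by a sustained inhibitory conductance."""
    v, peak = v_rest, 0.0
    for _ in range(steps):
        i_leak = g_leak * (v_rest - v)
        i_syn = g_syn * (e_syn - v)    # conductance-based synaptic current
        v += dt * (i_leak + i_syn)
        peak = max(peak, abs(v - v_rest))
    return peak

hyper = peak_deflection(e_syn=-75.0)   # reversal well below rest: hyperpolarizes
shunt = peak_deflection(e_syn=-65.0)   # reversal at rest: pure shunt
```

With these numbers the hyperpolarizing synapse pulls the membrane several millivolts below rest, while the shunting synapse leaves the voltage unchanged and acts only by adding conductance, i.e., by lowering input resistance.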
|
42
|
Trinh AT, Girardi-Schappo M, Béïque JC, Longtin A, Maler L. Adaptive spike threshold dynamics associated with sparse spiking of hilar mossy cells are captured by a simple model. J Physiol 2023; 601:4397-4422. [PMID: 37676904 DOI: 10.1113/jp283728] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2022] [Accepted: 08/17/2023] [Indexed: 09/09/2023] Open
Abstract
Hilar mossy cells (hMCs) in the dentate gyrus (DG) receive inputs from DG granule cells (GCs), CA3 pyramidal cells and inhibitory interneurons, and provide feedback input to GCs. Behavioural and in vivo recording experiments implicate hMCs in pattern separation, navigation and spatial learning. Our experiments link hMC intrinsic excitability to their synaptically evoked in vivo spiking outputs. We performed electrophysiological recordings from DG neurons and found that hMCs displayed an adaptive spike threshold that increased both in proportion to the intensity of injected currents, and in response to spiking itself, returning to baseline over a long time scale, thereby instantaneously limiting their firing rate responses. The hMC activity is additionally limited by a prominent medium after-hyperpolarizing potential (AHP) generated by small conductance K+ channels. We hypothesize that these intrinsic hMC properties are responsible for their low in vivo firing rates. Our findings extend previous studies that compare hMCs, CA3 pyramidal cells and hilar inhibitory cells and provide novel quantitative data that contrast the intrinsic properties of these cell types. We developed a phenomenological exponential integrate-and-fire model that closely reproduces the hMC adaptive threshold nonlinearities with respect to their threshold dependence on input current intensity, evoked spike latency and long-lasting spike-induced increase in spike threshold. Our robust and computationally efficient model is amenable to incorporation into large network models of the DG that will deepen our understanding of the neural bases of pattern separation, spatial navigation and learning.
KEY POINTS:
- Previous studies have shown that hilar mossy cells (hMCs) are implicated in pattern separation and the formation of spatial memory, but how their intrinsic properties relate to their in vivo spiking patterns is still unknown.
- Here we show that the hMCs display electrophysiological properties that distinguish them from the other hilar cell types, including a highly adaptive spike threshold that decays slowly.
- The spike-dependent increase in threshold, combined with an after-hyperpolarizing potential mediated by a slow K+ conductance, is hypothesized to be responsible for the low firing rate of the hMCs observed in vivo.
- The hMCs' features are well captured by a modified stochastic exponential integrate-and-fire model that has the unique feature of a threshold intrinsically dependent on both the stimulus intensity and the spiking history.
- This computational model will allow future work to study how the hMCs can contribute to spatial memory formation and navigation.
Affiliation(s)
- Anh-Tuan Trinh
- Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology, Trondheim, Trøndelag, Norway
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
| | - Mauricio Girardi-Schappo
- Departamento de Física, Universidade Federal de Santa Catarina, Santa Catarina, Florianópolis, Brazil
- Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
| | - Jean-Claude Béïque
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Brain and Mind Institute, University of Ottawa, Ottawa, Ontario, Canada
- Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
| | - André Longtin
- Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
- Brain and Mind Institute, University of Ottawa, Ottawa, Ontario, Canada
- Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
| | - Leonard Maler
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Brain and Mind Institute, University of Ottawa, Ottawa, Ontario, Canada
- Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
| |
Collapse
|
43
|
Roemschied FA, Pacheco DA, Aragon MJ, Ireland EC, Li X, Thieringer K, Pang R, Murthy M. Flexible circuit mechanisms for context-dependent song sequencing. Nature 2023; 622:794-801. [PMID: 37821705 PMCID: PMC10600009 DOI: 10.1038/s41586-023-06632-1] [Citation(s) in RCA: 5] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/31/2021] [Accepted: 09/11/2023] [Indexed: 10/13/2023]
Abstract
Sequenced behaviours, including locomotion, reaching and vocalization, are patterned differently in different contexts, enabling animals to adjust to their environments. How contextual information shapes neural activity to flexibly alter the patterning of actions is not fully understood. Previous work has indicated that this could be achieved via parallel motor circuits, with differing sensitivities to context1,2. Here we demonstrate that a single pathway operates in two regimes dependent on recent sensory history. We leverage the Drosophila song production system3 to investigate the role of several neuron types4-7 in song patterning near versus far from the female fly. Male flies sing 'simple' trains of only one mode far from the female fly but complex song sequences comprising alternations between modes when near her. We find that ventral nerve cord (VNC) circuits are shaped by mutual inhibition and rebound excitability8 between nodes driving the two song modes. Brief sensory input to a direct brain-to-VNC excitatory pathway drives simple song far from the female, whereas prolonged input enables complex song production via simultaneous recruitment of functional disinhibition of VNC circuitry. Thus, female proximity unlocks motor circuit dynamics in the correct context. We construct a compact circuit model to demonstrate that the identified mechanisms suffice to replicate natural song dynamics. These results highlight how canonical circuit motifs8,9 can be combined to enable circuit flexibility required for dynamic communication.
Affiliation(s)
- Frederic A Roemschied
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- European Neuroscience Institute, Göttingen, Germany
| | - Diego A Pacheco
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Harvard Medical School, Boston, MA, USA
| | - Max J Aragon
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
| | - Elise C Ireland
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
| | - Xinping Li
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
| | - Kyle Thieringer
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
| | - Rich Pang
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
| | - Mala Murthy
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA.
| |
|
44
|
Kern FB, Chao ZC. Short-term neuronal and synaptic plasticity act in synergy for deviance detection in spiking networks. PLoS Comput Biol 2023; 19:e1011554. [PMID: 37831721 PMCID: PMC10599548 DOI: 10.1371/journal.pcbi.1011554] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2023] [Revised: 10/25/2023] [Accepted: 09/29/2023] [Indexed: 10/15/2023] Open
Abstract
Sensory areas of cortex respond more strongly to infrequent stimuli when these violate previously established regularities, a phenomenon known as deviance detection (DD). Previous modeling work has mainly attempted to explain DD on the basis of synaptic plasticity. However, a large fraction of cortical neurons also exhibit firing rate adaptation, an underexplored potential mechanism. Here, we investigate DD in a spiking neuronal network model with two types of short-term plasticity, fast synaptic short-term depression (STD) and slower threshold adaptation (TA). We probe the model with an oddball stimulation paradigm and assess DD by evaluating the network responses. We find that TA is sufficient to elicit DD. It achieves this by habituating neurons near the stimulation site that respond earliest to the frequently presented standard stimulus (local fatigue), which diminishes the response and promotes the recovery (global fatigue) of the wider network. Further, we find a synergy effect between STD and TA, where they interact with each other to achieve greater DD than the sum of their individual effects. We show that this synergy is caused by the local fatigue added by STD, which inhibits the global response to the frequently presented stimulus, allowing greater recovery of TA-mediated global fatigue and making the network more responsive to the deviant stimulus. Finally, we show that the magnitude of DD strongly depends on the timescale of stimulation. We conclude that highly predictable information can be encoded in strong local fatigue, which allows greater global recovery and subsequent heightened sensitivity for DD.
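The local-versus-global fatigue interplay can be sketched with two scalar variables: a fast, stimulus-specific synaptic resource (STD) and a slow, shared threshold (TA). This toy oddball loop is an illustration under assumed constants, not the paper's spiking network model:

```python
import math

# Toy oddball sketch of the two fatigue variables described above: a fast,
# stimulus-specific synaptic resource x (short-term depression, STD) and a
# slow, shared threshold h (threshold adaptation, TA). All constants are
# illustrative; this is NOT the paper's spiking network.

def oddball(n_std=20, u=0.4, tau_x=5.0, tau_h=50.0, dh=0.2, isi=1.0):
    x = {"std": 1.0, "dev": 1.0}   # per-stimulus resource (local fatigue)
    h = 0.0                        # shared adaptation (global fatigue)
    responses = []
    for s in ["std"] * n_std + ["dev"]:
        r = max(0.0, x[s] - h)     # response = depressed drive minus threshold
        responses.append((s, r))
        x[s] *= (1.0 - u)          # STD depletes the presented channel only
        h += dh * r                # TA grows with every response
        for k in x:                # resources recover between stimuli
            x[k] += (1.0 - x[k]) * (1.0 - math.exp(-isi / tau_x))
        h *= math.exp(-isi / tau_h)
    return responses

resp = oddball()
std_late = resp[-2][1]   # habituated response to the last standard
dev = resp[-1][1]        # response to the rare deviant
```

Because the standard channel's resource is depleted while the deviant's is not, and the shared threshold stays moderate, the deviant response exceeds the habituated standard response, which is the signature of deviance detection.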
Affiliation(s)
- Felix Benjamin Kern
- International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan
| | - Zenas C. Chao
- International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan
| |
|
45
|
Müller-Komorowska D, Kuru B, Beck H, Braganza O. Phase information is conserved in sparse, synchronous population-rate-codes via phase-to-rate recoding. Nat Commun 2023; 14:6106. [PMID: 37777512 PMCID: PMC10543394 DOI: 10.1038/s41467-023-41803-8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2022] [Accepted: 09/19/2023] [Indexed: 10/02/2023] Open
Abstract
Neural computation is often traced in terms of either rate- or phase-codes. However, most circuit operations will simultaneously affect information across both coding schemes. It remains unclear how phase- and rate-coded information is transmitted, in the face of continuous modification at consecutive processing stages. Here, we study this question in the entorhinal cortex (EC)-dentate gyrus (DG)-CA3 system using three distinct computational models. We demonstrate that DG feedback inhibition leverages EC phase information to improve rate-coding, a computation we term phase-to-rate recoding. Our results suggest that it (i) supports the conservation of phase information within sparse rate-codes and (ii) enhances the efficiency of plasticity in downstream CA3 via increased synchrony. Given the ubiquity of both phase-coding and feedback circuits, our results raise the question whether phase-to-rate recoding is a recurring computational motif, which supports the generation of sparse, synchronous population-rate-codes in areas beyond the DG.
Affiliation(s)
- Daniel Müller-Komorowska
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, 904-0495, Japan.
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany.
- Baris Kuru
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany
- Heinz Beck
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany
- Deutsches Zentrum für Neurodegenerative Erkrankungen e.V., Bonn, Germany
- Oliver Braganza
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany
- Institute for Socio-Economics, University of Duisburg-Essen, Duisburg, Germany
Collapse
|
46
|
Pham MD, D’Angiulli A, Dehnavi MM, Chhabra R. From Brain Models to Robotic Embodied Cognition: How Does Biological Plausibility Inform Neuromorphic Systems? Brain Sci 2023; 13:1316. [PMID: 37759917 PMCID: PMC10526461 DOI: 10.3390/brainsci13091316] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2023] [Revised: 09/05/2023] [Accepted: 09/07/2023] [Indexed: 09/29/2023] Open
Abstract
We examine the challenging "marriage" between computational efficiency and biological plausibility: a crucial node in the domain of spiking neural networks at the intersection of neuroscience, artificial intelligence, and robotics. Through a transdisciplinary review, we retrace the historical and most recent constraining influences that these parallel fields have exerted on descriptive analysis of the brain, construction of predictive brain models, and ultimately, the embodiment of neural networks in an enacted robotic agent. We study models of spiking neural networks (SNNs) as the central means enabling autonomous and intelligent behaviors in biological systems. We then provide a critical comparison of the available hardware and software for emulating SNNs, both to investigate biological entities and to apply them to artificial systems. Neuromorphics is identified as a promising tool for embodying SNNs in real physical systems, and different neuromorphic chips are compared. The concepts required for describing SNNs are dissected and contextualized in the new no man's land between cognitive neuroscience and artificial intelligence. Although there are recent reviews on the application of neuromorphic computing in various modules of the guidance, navigation, and control of robotic systems, the focus of this paper is on closing the cognition loop in SNN-embodied robotics. We argue that biologically viable spiking neuronal models used for electroencephalogram signals are excellent candidates for furthering our knowledge of the explainability of SNNs. We complete our survey by reviewing different robotic modules that can benefit from neuromorphic hardware, e.g., perception (with a focus on vision), localization, and cognition.
We conclude that the tradeoff between symbolic computational power and biological plausibility of hardware can be best addressed by neuromorphics, whose presence in neurorobotics provides an accountable empirical testbench for investigating synthetic and natural embodied cognition. We argue this is where both theoretical and empirical future work should converge in multidisciplinary efforts involving neuroscience, artificial intelligence, and robotics.
Collapse
Affiliation(s)
- Martin Do Pham
- Department of Computer Science, University of Toronto, Toronto, ON M5S 1A1, Canada
- Amedeo D’Angiulli
- Department of Neuroscience, Carleton University, Ottawa, ON K1S 5B6, Canada
- Maryam Mehri Dehnavi
- Department of Computer Science, University of Toronto, Toronto, ON M5S 1A1, Canada
- Robin Chhabra
- Department of Mechanical and Aerospace Engineering, Carleton University, Ottawa, ON K1S 5B6, Canada
Collapse
|
47
|
Zhu L, Mangan M, Webb B. Neuromorphic sequence learning with an event camera on routes through vegetation. Sci Robot 2023; 8:eadg3679. [PMID: 37756384 DOI: 10.1126/scirobotics.adg3679] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2022] [Accepted: 08/29/2023] [Indexed: 09/29/2023]
Abstract
For many robotics applications, it is desirable to have relatively low-power and efficient onboard solutions. We took inspiration from insects, such as ants, that are capable of learning and following routes in complex natural environments using relatively constrained sensory and neural systems. Such capabilities are particularly relevant to applications such as agricultural robotics, where visual navigation through dense vegetation remains a challenging task. In this scenario, a route is likely to have high self-similarity and be subject to changing lighting conditions and motion over uneven terrain, while the effect of wind on leaves increases the variability of the input. We used a bioinspired event camera on a terrestrial robot to collect visual sequences along routes in natural outdoor environments and applied a neural algorithm for spatiotemporal memory that is closely based on a known neural circuit in the insect brain. We show that this method can plausibly support route recognition for visual navigation and is more robust than SeqSLAM when evaluated on repeated runs on the same route or on routes with small lateral offsets. By encoding memory in a spiking neural network running on a neuromorphic computer, our model can evaluate visual familiarity in real time from event camera footage.
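The familiarity-based route recognition in the abstract can be illustrated with a generic novelty-detector sketch in the spirit of insect mushroom-body models. Everything below is an illustrative assumption rather than the authors' implementation: the sizes, the sparse random projection, the k-winners sparsening, and the anti-Hebbian weight update are textbook ingredients of such circuits, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_px, n_kc, k_active = 256, 2000, 100  # toy view size and expansion-layer size

# Sparse random projection from "pixels" to an expansion layer (KC-like cells).
proj = (rng.random((n_kc, n_px)) < 0.05).astype(float)
w_out = np.ones(n_kc)  # expansion layer -> novelty-output weights

def sparse_code(view):
    """k-winners-take-all sparsening of the projected view."""
    act = proj @ view
    code = np.zeros(n_kc)
    code[np.argsort(act)[-k_active:]] = 1.0
    return code

def train(view):
    """Anti-Hebbian learning: silence output weights of cells active for a
    familiar view, so learned views stop driving the novelty output."""
    global w_out
    w_out = np.where(sparse_code(view) > 0, 0.0, w_out)

def familiarity(view):
    """Novelty response: zero for views seen during training."""
    return w_out @ sparse_code(view)

route = rng.random((20, n_px))  # 20 training views along a toy route
for v in route:
    train(v)

novel = rng.random(n_px)
# Trained views evoke exactly zero novelty output in this toy; unseen views
# generally retain a larger response.
```

The design choice here mirrors the general logic of familiarity memories: learning suppresses, rather than strengthens, the responses to stored patterns, so a low output signals "I have seen this view before."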
Collapse
Affiliation(s)
- Le Zhu
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
- Michael Mangan
- Sheffield Robotics, Department of Computer Science, University of Sheffield, S1 4DP Sheffield, UK
- Barbara Webb
- School of Informatics, University of Edinburgh, EH8 9AB Edinburgh, UK
Collapse
|
48
|
Sanaullah, Koravuna S, Rückert U, Jungeblut T. Evaluation of Spiking Neural Nets-Based Image Classification Using the Runtime Simulator RAVSim. Int J Neural Syst 2023; 33:2350044. [PMID: 37604777 DOI: 10.1142/s0129065723500442] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 08/23/2023]
Abstract
Spiking Neural Networks (SNNs) help achieve brain-like efficiency and functionality by building neurons and synapses that mimic the human brain's transmission of electrical signals. However, optimal SNN implementation requires a precise balance of parametric values. To design such neural networks, a graphical tool for visualizing, analyzing, and explaining their internal spiking behavior is crucial. Although some popular SNN simulators are available, these tools do not allow users to interact with the neural network during simulation. To this end, we have introduced the first runtime interactive simulator, the Runtime Analyzing and Visualization Simulator (RAVSim), developed to analyze and dynamically visualize the behavior of SNNs, allowing end users to interact, observe output concentration reactions, and make changes directly during the simulation. In this paper, we present RAVSim with the current implementation of runtime interaction using the LIF neural model with different connectivity schemes, an image classification model using SNNs, and a dataset creation feature. Our main objective is to investigate binary classification using SNNs with RGB images. We created a feed-forward network using the LIF neural model for an image classification algorithm and evaluated it with RAVSim. The algorithm classifies faces with and without masks, achieving an accuracy of 91.8% using 1000 neurons in a hidden layer, an MSE of 0.0758, and an execution time of ∼10 min on the CPU. The experimental results show that using RAVSim not only increases network design speed but also accelerates user learning.
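The leaky integrate-and-fire (LIF) model named in the abstract is the workhorse neuron of such simulators. A minimal sketch follows; the parameter values are generic textbook choices, not RAVSim defaults.

```python
import numpy as np

def simulate_lif(i_input, dt=1e-3, tau=20e-3, v_rest=-70e-3,
                 v_thresh=-50e-3, v_reset=-70e-3, r_m=1e8):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron.

    i_input: input current (A) per time step; returns spike times (s)."""
    v = v_rest
    spikes = []
    for step, i_t in enumerate(i_input):
        # Membrane leaks toward rest while the input current charges it.
        v += (-(v - v_rest) + r_m * i_t) * dt / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset  # fire and reset
    return spikes

# A constant 0.3 nA input (steady-state depolarization of 30 mV, well above
# the 20 mV threshold distance) drives regular spiking for 1 s.
i_const = np.full(1000, 0.3e-9)
spike_times = simulate_lif(i_const)
```

Tuning `tau`, `r_m`, and the threshold against the input statistics is exactly the kind of "precise balance of parametric values" that an interactive tool lets one explore at runtime.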
Collapse
Affiliation(s)
- Sanaullah
- Department of Engineering and Mathematics, Bielefeld University of Applied Sciences, Bielefeld, Germany
- Shamini Koravuna
- Cognitive Interaction Technology Center, Bielefeld University, Bielefeld, Germany
- Ulrich Rückert
- Cognitive Interaction Technology Center, Bielefeld University, Bielefeld, Germany
- Thorsten Jungeblut
- Department of Engineering and Mathematics, Bielefeld University of Applied Sciences, Bielefeld, Germany
Collapse
|
49
|
Wang Z, Li X, Fan J, Meng J, Lin Z, Pan Y, Wei Y. SWsnn: A Novel Simulator for Spiking Neural Networks. J Comput Biol 2023; 30:951-960. [PMID: 37585615 DOI: 10.1089/cmb.2023.0098] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 08/18/2023] Open
Abstract
Spiking neural network (SNN) simulators play an important role in neural system modeling and brain function research. They help scientists reproduce and explore neuronal activity in brain regions, supporting work in neuroscience, brain-like computing, and related fields, and they can also be applied in artificial intelligence and machine learning. Many simulators based on central processing units (CPUs) or graphics processing units (GPUs) have been developed. However, the randomness of inter-neuron connectivity and of spiking events in SNN simulation incurs substantial memory-access time. To alleviate this problem, we developed SWsnn, an SNN simulator based on the new Sunway SW26010pro processor. The SW26010pro processor consists of six core groups, each with 16 MB of local data memory (LDM). LDM supports high-speed reads and writes, making it well suited to simulation tasks such as SNNs. Experimental results show that SWsnn runs faster than other mainstream GPU-based simulators when simulating neural networks of a given scale, demonstrating a strong performance advantage. To conduct larger-scale simulations, we designed a computation scheme based on the Sunway processor's large shared-memory model and developed a multiprocessor version of SWsnn in this mode, achieving larger-scale SNN simulations.
Collapse
Affiliation(s)
- Zhichao Wang
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Southern University of Science and Technology, Shenzhen, China
- Xuelei Li
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Jianping Fan
- University of Chinese Academy of Sciences, Beijing, China
- Jintao Meng
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Zhenli Lin
- Shenzhen University General Hospital, Shenzhen, China
- Yi Pan
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Yanjie Wei
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
Collapse
|
50
|
Huang CH, Lin CCK. New biophysical rate-based modeling of long-term plasticity in mean-field neuronal population models. Comput Biol Med 2023; 163:107213. [PMID: 37413849 DOI: 10.1016/j.compbiomed.2023.107213] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2022] [Revised: 05/20/2023] [Accepted: 06/25/2023] [Indexed: 07/08/2023]
Abstract
The formation of customized neural networks as the basis of brain functions such as receptive field selectivity, learning or memory depends heavily on the long-term plasticity of synaptic connections. However, the current mean-field population models commonly used to simulate large-scale neural network dynamics lack explicit links to the underlying cellular mechanisms of long-term plasticity. In this study, we developed a new mean-field population model, the plastic density-based neural mass model (pdNMM), by incorporating a newly developed rate-based plasticity model based on the calcium control hypothesis into an existing density-based neural mass model. Derivation of the plasticity model was carried out using population density methods. Our results showed that the synaptic plasticity represented by the resulting rate-based plasticity model exhibited Bienenstock-Cooper-Munro-like learning rules. Furthermore, we demonstrated that the pdNMM accurately reproduced previous experimental observations of long-term plasticity in hippocampal slices, including characteristics of Hebbian plasticity such as longevity, associativity and input specificity, as well as the formation of receptive field selectivity in the visual cortex. In conclusion, the pdNMM is a novel approach that can confer long-term plasticity to conventional mean-field neuronal population models.
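A rate-based Bienenstock-Cooper-Munro (BCM)-like rule of the general kind the abstract refers to can be sketched in a few lines. This is a generic textbook BCM formulation, not the pdNMM equations: the rectified-linear rate, learning rates, weight clipping, and a sliding threshold tracking the running average of y² are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_steps = 10, 5000
w = np.full(n_in, 0.5)       # synaptic weights
theta = 1.0                  # sliding modification threshold
eta, tau_theta = 1e-3, 100.0

# Two alternating input patterns with different overall activity levels.
patterns = np.vstack([rng.uniform(0.5, 1.5, n_in),
                      rng.uniform(0.0, 0.5, n_in)])

for step in range(n_steps):
    x = patterns[step % 2]
    y = max(w @ x, 0.0)                   # rectified-linear output rate
    # BCM rule: depression when y < theta, potentiation when y > theta.
    w += eta * y * (y - theta) * x
    w = np.clip(w, 0.0, 5.0)              # keep weights bounded
    # Sliding threshold tracks a running average of y^2, which stabilizes
    # the rule and yields input selectivity.
    theta += (y**2 - theta) / tau_theta
```

The sign flip of the weight update around the sliding threshold is what gives BCM-like rules their hallmark combination of Hebbian potentiation, homosynaptic depression, and input specificity.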
Collapse
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chou-Ching K Lin
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Innovation Center of Medical Devices and Technology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Medical Device Innovation Center, National Cheng Kung University, Tainan, Taiwan
Collapse
|