1
Chatterjee R, Paluh JL, Chowdhury S, Mondal S, Raha A, Mukherjee A. SyNC, a Computationally Extensive and Realistic Neural Net to Identify Relative Impacts of Synaptopathy Mechanisms on Glutamatergic Neurons and Their Networks in Autism and Complex Neurological Disorders. Front Cell Neurosci 2021; 15:674030. PMID: 34354570; PMCID: PMC8330424; DOI: 10.3389/fncel.2021.674030.
Abstract
Synaptic function and experience-dependent plasticity across multiple synapses depend on the types of neurons interacting as well as the intricate mechanisms that operate at the molecular level of the synapse. Understanding the complexity of information processing in synaptic networks will rely in part on effective computational models. Such models should also evaluate disruptions to synaptic function by multiple mechanisms. By co-developing algorithms alongside hardware, real-time analysis metrics can be prioritized along with biological complexity. The hippocampus is implicated in autism spectrum disorders (ASD), and within this region glutamatergic neurons constitute 90% of the neurons integral to the functioning of neuronal networks. Here we generate a computational model referred to as ASD interrogator (ASDint) and corresponding hardware to enable in silico analysis of multiple ASD mechanisms affecting glutamatergic neuron synapses. The hardware architecture, Synaptic Neuronal Circuit (SyNC), is a novel GPU-accelerated neural net that extends discovery by acting as a biologically realistic neuron synapse in real time. The co-developed ASDint and SyNC expand spiking neural network models of plasticity to comparative analysis of retrograde messengers. The SyNC model is realized in an ASIC architecture, which enables increasingly complex scenarios to be computed without sacrificing the area efficiency of the model. Here we apply the ASDint model to analyse neuronal circuitry dysfunctions associated with ASD synaptopathies and their effects on the synaptic learning parameter, and demonstrate SyNC on an ideal ASDint scenario. Our work highlights the value of secondary pathways in evaluating complex ASD synaptopathy mechanisms. By comparing the degree of variation in the synaptic learning parameter with the response obtained from simulations of the ideal scenario, we determine the potency and onset time of the effect of a particular evaluated mechanism. Simulations of such scenarios in even a small neuronal network thus allow us to identify the relative impacts of changed parameters on synaptic function. On this basis, we can estimate the minimum fraction of neurons exhibiting a particular dysfunction scenario required to cause complete failure of a neural network to coordinate pre-synaptic and post-synaptic outputs.
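The comparison described in the closing sentences can be sketched numerically. Below is a hypothetical illustration only (the traces, function name and deviation metric are assumptions, not the published ASDint implementation): a perturbed synaptic-learning-parameter trace is compared against an ideal baseline to estimate the potency and timing of a mechanism's effect.

```python
import numpy as np

def effect_potency(t, ideal, perturbed):
    """Return (max deviation, time of max deviation) between two traces."""
    deviation = np.abs(np.asarray(perturbed) - np.asarray(ideal))
    i = int(np.argmax(deviation))
    return float(deviation[i]), float(t[i])

t = np.linspace(0.0, 1.0, 101)            # simulated time (s)
ideal = np.exp(-t)                        # ideal learning-parameter trace
perturbed = ideal * (1.0 - 0.3 * t)       # trace under a synaptopathy mechanism

potency, t_peak = effect_potency(t, ideal, perturbed)
```

Here the deviation grows over the simulated window, so the effect is strongest at its end; a real analysis would substitute traces produced by the simulated dysfunction scenarios.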
Affiliation(s)
- Rounak Chatterjee
- Department of Electronics and Telecommunication Engineering, Jadavpur University, Kolkata, India
- Janet L Paluh
- SUNY Polytechnic Institute, College of Nanoscale Science and Engineering, Nanobioscience, Albany, NY, United States
- Souradeep Chowdhury
- Department of Electronics and Telecommunication Engineering, Jadavpur University, Kolkata, India
- Soham Mondal
- Flash Controller Team, Memory Solutions, Samsung Semiconductor India Research, Samsung Electronics Co., Ltd., Bangalore, India
- Arnab Raha
- Advanced Architecture Research, Intel Intelligent Systems Group, Intel Edge AI, Intel Corporation, Santa Clara, CA, United States
2
Susi G, Antón-Toro LF, Maestú F, Pereda E, Mirasso C. nMNSD-A Spiking Neuron-Based Classifier That Combines Weight-Adjustment and Delay-Shift. Front Neurosci 2021; 15:582608. PMID: 33679293; PMCID: PMC7933525; DOI: 10.3389/fnins.2021.582608.
Abstract
The recent “multi-neuronal spike sequence detector” (MNSD) architecture integrates the weight- and delay-adjustment methods by combining heterosynaptic plasticity with the neurocomputational feature of spike latency, representing a new opportunity to understand the mechanisms underlying biological learning. Unfortunately, the range of problems to which this topology can be applied is limited by the low cardinality of the parallel spike trains it can process and by the lack of a visualization mechanism for understanding its internal operation. We present here the nMNSD structure, a generalization of the MNSD to any number of inputs. The mathematical framework of the structure is introduced, together with the “trapezoid method,” a reduced method for analyzing the recognition mechanism operated by the nMNSD in response to a specific parallel input spike train. We apply the nMNSD to a classification problem previously faced with the classical MNSD by the same authors, showing the new possibilities the nMNSD opens and the associated improvement in classification performance. Finally, we benchmark the nMNSD on the classification of static inputs (the MNIST database), obtaining state-of-the-art accuracy together with advantageous time- and energy-efficiency compared to similar classification methods.
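The delay-shift principle behind the MNSD family can be sketched minimally as follows; the delays, tolerance and spike times are illustrative assumptions, not the published parameters. A unit recognizes a parallel spike pattern when each input spike, shifted by its learned delay, arrives in near-coincidence with the others.

```python
import numpy as np

def detects(spike_times, delays, tol=1.0):
    """True if the delayed input spikes coincide within `tol` (same units)."""
    arrival = np.asarray(spike_times) + np.asarray(delays)
    return float(arrival.max() - arrival.min()) <= tol

delays = np.array([5.0, 3.0, 1.0])      # learned per-input delays (ms)
target = np.array([0.0, 2.0, 4.0])      # the sequence the delays encode
other = np.array([4.0, 2.0, 0.0])       # reversed sequence

detects(target, delays)   # coincident arrivals at t = 5 ms: recognized
detects(other, delays)    # arrivals spread over 8 ms: rejected
```

The trapezoid method of the paper analyzes exactly this coincidence window; the sketch reduces it to a max-minus-min test.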
Affiliation(s)
- Gianluca Susi
- UPM-UCM Laboratory of Cognitive and Computational Neuroscience, Centro de Tecnologia Biomedica, Madrid, Spain; Departamento de Psicología Experimental, Facultad de Psicología, Universidad Complutense de Madrid, Madrid, Spain; Department of Civil Engineering and Computer Science, University of Rome "Tor Vergata", Rome, Italy
- Luis F Antón-Toro
- UPM-UCM Laboratory of Cognitive and Computational Neuroscience, Centro de Tecnologia Biomedica, Madrid, Spain; Departamento de Psicología Experimental, Facultad de Psicología, Universidad Complutense de Madrid, Madrid, Spain
- Fernando Maestú
- UPM-UCM Laboratory of Cognitive and Computational Neuroscience, Centro de Tecnologia Biomedica, Madrid, Spain; Departamento de Psicología Experimental, Facultad de Psicología, Universidad Complutense de Madrid, Madrid, Spain; CIBER-BBN: Networking Research Center on Bioengineering, Biomaterials and Nanomedicine, Madrid, Spain
- Ernesto Pereda
- UPM-UCM Laboratory of Cognitive and Computational Neuroscience, Centro de Tecnologia Biomedica, Madrid, Spain; Departamento de Ingeniería Industrial & IUNE & ITB, Universidad de La Laguna, Tenerife, Spain
- Claudio Mirasso
- Instituto de Física Interdisciplinar y Sistemas Complejos (IFISC, UIB-CSIC), Palma de Mallorca, Spain
3
Zhang Y, Qu H, Luo X, Chen Y, Wang Y, Zhang M, Li Z. A new recursive least squares-based learning algorithm for spiking neurons. Neural Netw 2021; 138:110-125. PMID: 33636484; DOI: 10.1016/j.neunet.2021.01.016.
Abstract
Spiking neural networks (SNNs) are regarded as effective models for processing spatio-temporal information. However, the inherent complexity of their temporal coding makes it an arduous task to put forward an effective supervised learning algorithm, which still puzzles researchers in this area. In this paper, we propose a Recursive Least Squares-Based Learning Rule (RLSBLR) for SNNs to generate a desired spatio-temporal spike train. During the learning process, the weight update is driven by a cost function defined by the difference between the membrane potential and the firing threshold. The amount of weight modification depends not only on the current error function but also on previous error functions evaluated with the current weights. To improve learning performance, we integrate a modified synaptic delay learning scheme into the proposed RLSBLR. We conduct experiments in different settings, such as spike train lengths, numbers of inputs, firing rates, noise levels and learning parameters, to thoroughly investigate the performance of this learning algorithm. The proposed RLSBLR is compared with the competitive algorithms Perceptron-Based Spiking Neuron Learning Rule (PBSNLR) and Remote Supervised Method (ReSuMe). Experimental results demonstrate that the proposed RLSBLR achieves higher learning accuracy, higher efficiency and better robustness against different types of noise. In addition, we apply the proposed RLSBLR to the open-source TIDIGITS database, and the results show that our algorithm performs well in practical applications.
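The recursive-least-squares core of such a rule can be sketched as follows. This is a simplified illustration, not the published RLSBLR: the feature vectors and the teacher potential are assumptions. The error is the gap between the neuron's membrane potential and a target potential, and each update also reflects past errors through the inverse-correlation matrix P.

```python
import numpy as np

def rls_step(w, P, x, error, lam=1.0):
    """One RLS update; returns (new weights, new inverse-correlation matrix)."""
    Px = P @ x
    k = Px / (lam + x @ Px)                 # RLS gain vector
    return w + k * error, (P - np.outer(k, Px)) / lam

rng = np.random.default_rng(0)
n = 4
w_true = np.array([0.5, -0.2, 0.8, 0.1])    # weights realizing the target
w, P = np.zeros(n), np.eye(n) * 10.0

for _ in range(50):
    x = rng.random(n)                       # per-synapse PSP contributions
    v_target = w_true @ x                   # teacher membrane potential
    w, P = rls_step(w, P, x, v_target - w @ x)

# After the loop, w closely matches w_true.
```

Because P accumulates the correlation of past inputs, the step size adapts per direction, which is what distinguishes RLS from a plain gradient rule.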
Affiliation(s)
- Yun Zhang
- Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, PR China
- Hong Qu
- Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, PR China
- Xiaoling Luo
- Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, PR China
- Yi Chen
- Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, PR China
- Yuchen Wang
- Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, PR China
- Malu Zhang
- Department of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, PR China
- Zefang Li
- China Coal Research Institute, Beijing 100013, PR China
4
Hussain I, Thounaojam DM. SpiFoG: an efficient supervised learning algorithm for the network of spiking neurons. Sci Rep 2020; 10:13122. PMID: 32753645; PMCID: PMC7403331; DOI: 10.1038/s41598-020-70136-5.
Abstract
There has been substantial research on supervised learning in spiking neural networks (SNNs) over the past couple of decades to improve computational efficiency. However, evolutionary-algorithm-based supervised learning for SNNs has not been investigated thoroughly and is still in its infancy. This paper introduces an efficient algorithm (SpiFoG) to train multilayer feedforward SNNs in a supervised manner, using an elitist floating-point genetic algorithm with hybrid crossover. Evidence from neuroscience suggests that the brain uses spike times with random synaptic delays for information processing. Therefore, a leaky integrate-and-fire spiking neuron with random synaptic delays is used in this research. SpiFoG allows both excitatory and inhibitory neurons by permitting a mixture of positive and negative synaptic weights, and random synaptic delays are trained alongside synaptic weights in an efficient manner. Moreover, the computational efficiency of SpiFoG was increased by reducing the total simulation time and increasing the time step, since a larger time step within the total simulation time requires fewer iterations. SpiFoG is benchmarked on the Iris and WBC datasets drawn from the UCI machine learning repository, where it shows better performance than state-of-the-art techniques.
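The elitist floating-point genetic algorithm idea can be sketched with a toy fitness function. All rates, bounds and the fitness itself are illustrative assumptions, not the SpiFoG implementation; in SpiFoG the real-valued genes would encode synaptic weights and delays, and fitness would come from running the SNN.

```python
import random

random.seed(1)

def fitness(genes):                       # stand-in for SNN training error
    return -sum(g * g for g in genes)     # maximal when all genes are zero

def crossover(a, b):
    """Blend crossover: each child gene is a random mix of the parents'."""
    return [g1 + random.random() * (g2 - g1) for g1, g2 in zip(a, b)]

def evolve(pop, elite=2, mut=0.1, generations=100):
    for _ in range(generations):
        pop = sorted(pop, key=fitness, reverse=True)
        nxt = [list(p) for p in pop[:elite]]              # elitism: keep the best
        while len(nxt) < len(pop):
            a, b = random.sample(pop[:len(pop) // 2], 2)  # select from top half
            child = crossover(a, b)
            if random.random() < mut:                     # occasional Gaussian mutation
                i = random.randrange(len(child))
                child[i] += random.gauss(0.0, 0.1)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(20)]
init_best = max(pop, key=fitness)
best = evolve(pop)      # elitism guarantees fitness never regresses
```

Because the elite individuals are copied unchanged each generation, the best fitness is monotonically non-decreasing, which is the property the "elitist" qualifier refers to.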
Affiliation(s)
- Irshed Hussain
- Computer Vision Laboratory, Department of Computer Science and Engineering, National Institute of Technology Silchar, Silchar, Assam, 788010, India
- Dalton Meitei Thounaojam
- Computer Vision Laboratory, Department of Computer Science and Engineering, National Institute of Technology Silchar, Silchar, Assam, 788010, India
5
Müller NIC, Sonntag M, Maraslioglu A, Hirtz JJ, Friauf E. Topographic map refinement and synaptic strengthening of a sound localization circuit require spontaneous peripheral activity. J Physiol 2019; 597:5469-5493. PMID: 31529505; DOI: 10.1113/jp277757.
Abstract
KEY POINTS: Loss of the calcium sensor otoferlin disrupts neurotransmission from inner hair cells. Central auditory nuclei are functionally denervated in otoferlin knockout (Otof KO) mice via gene ablation confined to the periphery. We employed juvenile and young adult Otof KO mice (postnatal days (P)10-12 and P27-49) as models for absent spontaneous activity and deafness, respectively. We studied the impact of peripheral activity on synaptic refinement in the sound localization circuit from the medial nucleus of the trapezoid body (MNTB) to the lateral superior olive (LSO). MNTB in vivo recordings demonstrated drastically reduced spontaneous spiking and deafness in Otof KOs. Juvenile KOs showed impaired synapse elimination and strengthening, manifested by broader MNTB-LSO inputs, imprecise MNTB-LSO topography and weaker MNTB-LSO fibres. The impairments persisted into young adulthood. Further functional refinement after hearing onset was undetected in young adult wild-types. Collectively, activity deprivation resulting from a peripherally confined protein loss impairs functional MNTB-LSO refinement during a critical prehearing period.
ABSTRACT: Circuit refinement is critical for the developing sound localization pathways in the auditory brainstem. In prehearing mice (hearing onset around postnatal day (P)12), spontaneous activity propagates from the periphery to central auditory nuclei. At the glycinergic projection from the medial nucleus of the trapezoid body (MNTB) to the lateral superior olive (LSO) of neonatal mice, supernumerary MNTB fibres innervate a given LSO neuron. Between P4 and P9, MNTB fibres are functionally eliminated, whereas the remaining fibres are strengthened. Little is known about MNTB-LSO circuit refinement after P20. Moreover, MNTB-LSO refinement upon activity deprivation confined to the periphery is largely unexplored. This leaves a considerable knowledge gap, as deprivation often occurs in patients with congenital deafness, e.g. upon mutations in the otoferlin gene (OTOF). Here, we analysed juvenile (P10-12) and young adult (P27-49) otoferlin knockout (Otof KO) mice with respect to MNTB-LSO refinement. MNTB in vivo recordings revealed drastically reduced spontaneous activity and deafness in knockouts (KOs), confirming deprivation. As RNA sequencing revealed the absence of Otof in the MNTB and LSO of wild-types, Otof loss in KOs is specific to the periphery. Functional denervation impaired MNTB-LSO synapse elimination and strengthening, as assessed by glutamate uncaging and electrical stimulation. Impaired elimination led to imprecise MNTB-LSO topography; impaired strengthening was associated with lower quantal content per MNTB fibre. In young adult KOs, the MNTB-LSO circuit remained unrefined. Further functional refinement after P12 appeared absent in wild-types. Collectively, we provide novel insights into functional MNTB-LSO circuit maturation governed by a cochlea-specific protein. The central malfunctions in Otof KOs may have implications for patients with sensorineural hearing loss.
Affiliation(s)
- Nicolas I C Müller
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
- Mandy Sonntag
- Paul Flechsig Institute of Brain Research, Faculty of Medicine, University of Leipzig, D-04103 Leipzig, Germany
- Ayse Maraslioglu
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
- Jan J Hirtz
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany; Physiology of Neuronal Networks, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
- Eckhard Friauf
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
6
Taherkhani A, Belatreche A, Li Y, Cosma G, Maguire LP, McGinnity TM. A review of learning in biologically plausible spiking neural networks. Neural Netw 2019; 122:253-272. PMID: 31726331; DOI: 10.1016/j.neunet.2019.09.036.
Abstract
Artificial neural networks have been used as powerful processing tools in areas such as pattern recognition, control, robotics and bioinformatics. Their wide applicability has encouraged researchers to improve artificial neural networks by investigating the biological brain. Neurological research has progressed significantly in recent years and continues to reveal new characteristics of biological neurons. New technologies can now capture temporal changes in the internal activity of the brain in more detail and help clarify the relationship between brain activity and the perception of a given stimulus. This new knowledge has led to a new type of artificial neural network, the Spiking Neural Network (SNN), which draws more faithfully on biological properties to provide higher processing abilities. This paper presents a review of recent developments in the learning of spiking neurons. First, the biological background of SNN learning algorithms is reviewed. The important elements of a learning algorithm, such as the neuron model, synaptic plasticity, information encoding and SNN topologies, are then presented, followed by a critical review of the state-of-the-art learning algorithms for SNNs using single and multiple spikes. Additionally, deep spiking neural networks are reviewed, and challenges and opportunities in the SNN field are discussed.
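One of the biologically inspired plasticity rules such reviews cover is pair-based spike-timing-dependent plasticity (STDP); the time constants and amplitudes below are illustrative, not taken from any specific model in the review. A synapse potentiates when the presynaptic spike precedes the postsynaptic one, and depresses otherwise, with exponentially decaying magnitude.

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for a spike-pair interval dt = t_post - t_pre (ms)."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)    # pre before post: potentiation
    return -a_minus * math.exp(dt / tau)       # post before pre: depression

stdp_dw(5.0)     # causal pairing: positive weight change
stdp_dw(-5.0)    # acausal pairing: negative weight change
```

The asymmetric exponential window is the canonical form; many of the reviewed algorithms modify or supervise this basic kernel.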
Affiliation(s)
- Aboozar Taherkhani
- School of Computer Science and Informatics, Faculty of Computing, Engineering and Media, De Montfort University, Leicester, UK
- Ammar Belatreche
- Department of Computer and Information Sciences, Northumbria University, Newcastle upon Tyne, UK
- Yuhua Li
- School of Computer Science and Informatics, Cardiff University, Cardiff, UK
- Georgina Cosma
- Department of Computer Science, Loughborough University, Loughborough, UK
- Liam P Maguire
- Intelligent Systems Research Centre, Ulster University, Derry, Northern Ireland, UK
- T M McGinnity
- Intelligent Systems Research Centre, Ulster University, Derry, Northern Ireland, UK; School of Science and Technology, Nottingham Trent University, Nottingham, UK
7
Wang X, Lin X, Dang X. A Delay Learning Algorithm Based on Spike Train Kernels for Spiking Neurons. Front Neurosci 2019; 13:252. PMID: 30971877; PMCID: PMC6445871; DOI: 10.3389/fnins.2019.00252.
Abstract
Neuroscience research confirms that synaptic delays are not constant but can be modulated. This paper proposes a supervised delay learning algorithm for spiking neurons with temporal encoding, in which both the weight and the delay of a synaptic connection can be adjusted to enhance learning performance. The proposed algorithm first defines spike train kernels that transform discrete spike trains during the learning phase into continuous analog signals so that common mathematical operations can be performed on them, and then derives supervised learning rules for synaptic weights and delays by the gradient descent method. The algorithm is successfully applied to various spike train learning tasks, and the effects of the synaptic delay parameters are analyzed in detail. Experimental results show that the network with dynamic delays achieves higher learning accuracy and fewer learning epochs than the network with static delays. The delay learning algorithm is further validated on a practical image classification problem, where it again achieves good classification performance with a proper receptive field. Synaptic delay learning is therefore significant for both practical applications and theoretical research on spiking neural networks.
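The kernel transformation the rule relies on can be sketched as follows; the kernel shape, width and spike times are assumptions for illustration, not the paper's exact kernels. Convolving each discrete spike train with a smooth kernel yields a continuous signal, so the mismatch between an actual and a desired train becomes an ordinary integral that gradient-based rules can differentiate.

```python
import numpy as np

def kernelize(spikes, t, tau=5.0):
    """Sum of causal exponential kernels, evaluated on the time grid t (ms)."""
    s = np.zeros_like(t)
    for ts in spikes:
        s += np.where(t >= ts, np.exp(-(t - ts) / tau), 0.0)
    return s

t = np.linspace(0.0, 100.0, 1001)
actual = kernelize([10.0, 40.0, 75.0], t)
desired = kernelize([12.0, 40.0, 70.0], t)

# Squared distance between the continuous signals (simple Riemann sum):
# identical trains give zero, and the value grows as spikes drift apart.
dist = float(np.sum((actual - desired) ** 2) * (t[1] - t[0]))
```

Both weight and delay updates in such a scheme follow from differentiating this distance with respect to the respective parameter.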
Affiliation(s)
- Xiangwen Wang
- College of Computer Science and Engineering, Northwest Normal University, Lanzhou, China
- Xianghong Lin
- College of Computer Science and Engineering, Northwest Normal University, Lanzhou, China
- Xiaochao Dang
- College of Computer Science and Engineering, Northwest Normal University, Lanzhou, China
8
Synchronization of Chemical Synaptic Coupling of the Chay Neuron System under Time Delay. Appl Sci (Basel) 2018. DOI: 10.3390/app8060927.
9
Rothman JS, Silver RA. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data. Front Neuroinform 2018; 12:14. PMID: 29670519; PMCID: PMC5893720; DOI: 10.3389/fninf.2018.00014.
Abstract
Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic.
Affiliation(s)
- Jason S Rothman
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- R Angus Silver
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
10
Krächan EG, Fischer AU, Franke J, Friauf E. Synaptic reliability and temporal precision are achieved via high quantal content and effective replenishment: auditory brainstem versus hippocampus. J Physiol 2017; 595:839-864. PMID: 27673320; PMCID: PMC5285727; DOI: 10.1113/jp272799.
Abstract
KEY POINTS: Auditory brainstem neurons involved in sound source localization are equipped with several morphological and molecular features that enable them to compute interaural level and time differences. As sound source localization works continually, synaptic transmission between these neurons should be reliable and temporally precise, even during sustained periods of high-frequency activity. Using patch-clamp recordings in acute brain slices, we compared synaptic reliability and temporal precision in the seconds-to-minutes range between auditory synapses and two types of hippocampal synapses; the latter are less confronted with temporally precise high-frequency transmission than the auditory ones. We found striking differences in synaptic properties (e.g. continually high quantal content) that allow auditory synapses to reliably release vesicles at much higher rates than their hippocampal counterparts. They are thus indefatigable and in a position to transfer information with exquisite temporal precision, and their performance appears to be supported by very efficient replenishment mechanisms.
ABSTRACT: At early stations of the auditory pathway, information is encoded by precise signal timing and rate. Auditory synapses must maintain the relative timing of events with submillisecond precision even during sustained and high-frequency stimulation. In non-auditory brain regions, e.g. telencephalic ones, synapses are activated at considerably lower frequencies. Central to understanding the heterogeneity of synaptic systems is the elucidation of the physical, chemical and biological factors that determine synapse performance. In this study, we used slice recordings from three synapse types in the mouse auditory brainstem and hippocampus. Whereas the auditory brainstem nuclei experience high-frequency activity in vivo, the hippocampal circuits are activated at much lower frequencies. We challenged the synapses with sustained high-frequency stimulation (up to 200 Hz for 60 s) and found significant performance differences. Our results show that auditory brainstem synapses differ considerably from their hippocampal counterparts in several aspects, namely resistance to synaptic fatigue, low failure rate and exquisite temporal precision. Their high-fidelity performance supports the functional demands and appears to be due to the large size of the readily releasable pool and a high release probability, which together result in a high quantal content. In conjunction with very efficient vesicle replenishment mechanisms, these properties provide the extremely rapid and temporally precise signalling required for neuronal communication at early stations of the auditory system, even during sustained activation in the minute range.
Affiliation(s)
- Elisa G Krächan
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
- Alexander U Fischer
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
- Jürgen Franke
- Chair for Applied Mathematical Statistics, Department of Mathematics, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
- Eckhard Friauf
- Animal Physiology Group, Department of Biology, University of Kaiserslautern, D-67663 Kaiserslautern, Germany
11
Harada K, Matsuoka H, Fujihara H, Ueta Y, Yanagawa Y, Inoue M. GABA Signaling and Neuroactive Steroids in Adrenal Medullary Chromaffin Cells. Front Cell Neurosci 2016; 10:100. PMID: 27147972; PMCID: PMC4834308; DOI: 10.3389/fncel.2016.00100.
Abstract
Gamma-aminobutyric acid (GABA) is produced not only in the brain, but also in endocrine cells by the two isoforms of glutamic acid decarboxylase (GAD), GAD65 and GAD67. In rat adrenal medullary chromaffin cells only GAD67 is expressed, and GABA is stored in large dense core vesicles (LDCVs), but not synaptic-like microvesicles (SLMVs). The α3β2/3γ2 complex represents the majority of GABAA receptors expressed in rat and guinea pig chromaffin cells, whereas PC12 cells, an immortalized rat chromaffin cell line, express the α1 subunit as well as the α3. The expression of α3, but not α1, in PC12 cells is enhanced by glucocorticoid activity, which may be mediated by both the mineralocorticoid receptor (MR) and the glucocorticoid receptor (GR). GABA has two actions mediated by GABAA receptors in chromaffin cells: it induces catecholamine secretion by itself and produces an inhibition of synaptically evoked secretion by a shunt effect. Allopregnanolone, a neuroactive steroid which is secreted from the adrenal cortex, produces a marked facilitation of GABAA receptor channel activity. Since there are no GABAergic nerve fibers in the adrenal medulla, GABA may function as a para/autocrine factor in the chromaffin cells. This function of GABA may be facilitated by expression of the immature isoforms of GAD and GABAA receptors and the lack of expression of plasma membrane GABA transporters (GATs). In this review, we will consider how the para/autocrine function of GABA is achieved, focusing on the structural and molecular mechanisms for GABA signaling.
Affiliation(s)
- Keita Harada
- Department of Cell and Systems Physiology, University of Occupational and Environmental Health School of Medicine, Kitakyushu, Japan
- Hidetada Matsuoka
- Department of Cell and Systems Physiology, University of Occupational and Environmental Health School of Medicine, Kitakyushu, Japan
- Hiroaki Fujihara
- Department of Physiology, University of Occupational and Environmental Health School of Medicine, Kitakyushu, Japan
- Yoichi Ueta
- Department of Physiology, University of Occupational and Environmental Health School of Medicine, Kitakyushu, Japan
- Yuchio Yanagawa
- Department of Genetic and Behavioral Neuroscience, Gunma University Graduate School of Medicine, Maebashi, Japan
- Masumi Inoue
- Department of Cell and Systems Physiology, University of Occupational and Environmental Health School of Medicine, Kitakyushu, Japan
12
Lanore F, Silver RA. Extracting quantal properties of transmission at central synapses. Neuromethods 2016; 113:193-211. PMID: 30245548; DOI: 10.1007/978-1-4939-3411-9_10.
Abstract
Chemical synapses enable neurons to communicate rapidly, process and filter signals and to store information. However, studying their functional properties is difficult because synaptic connections typically consist of multiple synaptic contacts that release vesicles stochastically and exhibit time-dependent behavior. Moreover, most central synapses are small and inaccessible to direct measurements. Estimation of synaptic properties from responses recorded at the soma is complicated by the presence of nonuniform release probability and nonuniform quantal properties. The presence of multivesicular release and postsynaptic receptor saturation at some synapses can also complicate the interpretation of quantal parameters. Multiple-probability fluctuation analysis (MPFA; also known as variance-mean analysis) is a method that has been developed for estimating synaptic parameters from the variance and mean amplitude of synaptic responses recorded at different release probabilities. This statistical approach, which incorporates nonuniform synaptic properties, has become widely used for studying synaptic transmission. In this chapter, we describe the statistical models used to extract quantal parameters and discuss their interpretation when applying MPFA.
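The variance-mean relation at the heart of MPFA can be illustrated under the simplest assumptions (uniform release probability across sites, no quantal variance; all numbers below are synthetic). For N independent release sites with quantal size q, the variance of the response is a parabolic function of its mean, var = q·mean − mean²/N, so fitting the parabola to variance-mean points collected at several release probabilities recovers estimates of q and N.

```python
import numpy as np

q_true, n_sites = 10.0, 5                    # quantal size (pA), release sites
p = np.array([0.1, 0.3, 0.5, 0.7, 0.9])     # varied release probabilities
mean = n_sites * p * q_true                  # mean response amplitude
var = n_sites * p * (1 - p) * q_true ** 2    # binomial variance

# Least-squares fit of var = a*mean + b*mean**2, so q = a and N = -1/b
A = np.stack([mean, mean ** 2], axis=1)
a, b = np.linalg.lstsq(A, var, rcond=None)[0]
q_est, n_est = a, -1.0 / b
```

Real data add quantal variability and nonuniform release probability, which is exactly what the full MPFA statistical models described in the chapter account for.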
Affiliation(s)
- Frederic Lanore
- Department of Neuroscience, Physiology and Pharmacology, University College London, London WC1E 6BT, UK
- R Angus Silver
- Department of Neuroscience, Physiology and Pharmacology, University College London, London WC1E 6BT, UK
13
Taherkhani A, Belatreche A, Li Y, Maguire LP. DL-ReSuMe: A Delay Learning-Based Remote Supervised Method for Spiking Neurons. IEEE Trans Neural Netw Learn Syst 2015; 26:3137-3149. [PMID: 25794401 DOI: 10.1109/tnnls.2015.2404938] [Citation(s) in RCA: 31] [Impact Index Per Article: 3.4] [Indexed: 06/04/2023]
Abstract
Recent research has shown the potential of spiking neural networks (SNNs) to model complex information processing in the brain, and there is biological evidence that the precise timing of spikes is used for information coding. However, the exact mechanism by which a neuron is trained to fire at precise times remains an open problem. The majority of existing learning methods for SNNs are based on weight adjustment alone, although there is also biological evidence that synaptic delays are not constant. In this paper, a learning method for spiking neurons, called the delay learning remote supervised method (DL-ReSuMe), is proposed; it merges a delay-shift approach with ReSuMe-based weight adjustment to enhance learning performance. DL-ReSuMe uses more biologically plausible properties, such as delay learning, and needs less weight adjustment than ReSuMe. Simulation results show that DL-ReSuMe improves both learning accuracy and learning speed compared with ReSuMe.
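The two ingredients the abstract names, a ReSuMe-style weight update plus a delay adjustment, can be sketched minimally as below. The window shape, constants, and the nearest-arrival delay rule are illustrative simplifications of the general idea, not the paper's exact formulation.

```python
import numpy as np

def resume_weight_update(pre_spikes, desired_spikes, actual_spikes,
                         lr=0.05, a=0.01, tau=5.0):
    """ReSuMe-style weight change for a single input (times in ms):
    desired output spikes potentiate and actual output spikes depress,
    each weighted by an exponential window over earlier presynaptic spikes."""
    def drive(out_spikes):
        total = 0.0
        for t_out in out_spikes:
            s = a  # non-Hebbian bias term
            for t_pre in pre_spikes:
                if t_pre <= t_out:
                    s += np.exp(-(t_out - t_pre) / tau)
            total += s
        return total
    return lr * (drive(desired_spikes) - drive(actual_spikes))

def delay_shift(delay, pre_spikes, desired_spikes, step=0.1):
    """Crude stand-in for the delay-learning component: nudge the synaptic
    delay so the delayed arrival nearest each desired output spike moves
    toward it."""
    arrivals = [t + delay for t in pre_spikes]
    shift = 0.0
    for t_d in desired_spikes:
        nearest = min(arrivals, key=lambda t: abs(t - t_d))
        shift += step * (t_d - nearest)
    return delay + shift

# Toy episode: the neuron fired at 4 ms but should fire at 10 ms
dw = resume_weight_update([2.0, 8.0], desired_spikes=[10.0], actual_spikes=[4.0])
new_delay = delay_shift(1.0, [8.0], [10.0])
```

In this toy episode the weight change is positive (the desired spike drives net potentiation) and the delay moves toward aligning the input with the desired firing time, mirroring the combined weight-plus-delay learning the paper advocates.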
14
Saveliev A, Khuzakhmetova V, Samigullin D, Skorinkin A, Kovyazina I, Nikolsky E, Bukharaeva E. Bayesian analysis of the kinetics of quantal transmitter secretion at the neuromuscular junction. J Comput Neurosci 2015; 39:119-29. [PMID: 26129670 DOI: 10.1007/s10827-015-0567-3] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Received: 02/20/2015] [Revised: 06/15/2015] [Accepted: 06/19/2015] [Indexed: 11/29/2022]
Abstract
The timing of transmitter release from nerve endings is now considered one of the factors determining the plasticity and efficacy of synaptic transmission. In the neuromuscular junction, the moments of release of individual acetylcholine quanta correspond to the synaptic delays of uniquantal endplate currents recorded under conditions of lowered extracellular calcium. Using Bayesian modelling, we performed a statistical analysis of synaptic delays in the mouse neuromuscular junction under different patterns of rhythmic nerve stimulation and with modified calcium entry into the nerve terminal. We obtained a statistical model of release timing represented as the summation of two independent statistical distributions. The first is the exponentially modified Gaussian distribution; the mixture of normal and exponential components in this distribution can be interpreted as a two-stage mechanism comprising early and late periods of phasic synchronous secretion. The parameters of this distribution depend on both the stimulation frequency of the motor nerve and the conditions of calcium entry. The second distribution was modelled as quasi-uniform, with parameters independent of stimulation frequency and calcium entry. The presence of two distinct probability density functions for synaptic delays suggests at least two independent processes controlling the time course of secretion, one of them potentially involving two stages. The relative contributions of these processes to the total number of transmitter quanta released depend differently on the motor nerve stimulation pattern and on calcium entry into the nerve endings.
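The exponentially modified Gaussian component of this delay model can be illustrated on synthetic data; the sketch below recovers the Gaussian and exponential parameters by the method of moments. All parameter values are illustrative rather than the paper's fitted values, and the paper itself uses Bayesian estimation, not moments.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic synaptic delays (ms): a Gaussian "early" stage plus an
# exponential tail, i.e. an exponentially modified Gaussian.
mu, sigma, tau = 0.5, 0.2, 0.4          # illustrative ground truth
delays = rng.normal(mu, sigma, 20000) + rng.exponential(tau, 20000)

# Method-of-moments recovery, using the ex-Gaussian identities
#   mean = mu + tau,  variance = sigma^2 + tau^2,
#   third central moment = 2 * tau^3
m1 = delays.mean()
var = delays.var()
m3 = np.mean((delays - m1) ** 3)
tau_est = (m3 / 2.0) ** (1.0 / 3.0)
sigma_est = np.sqrt(var - tau_est ** 2)
mu_est = m1 - tau_est
```

The skew of the delay histogram is carried entirely by the exponential component, which is why the third moment isolates `tau`; a Bayesian fit as in the paper additionally yields credible intervals and handles the quasi-uniform second component.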
Affiliation(s)
- Anatoly Saveliev
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia
- Venera Khuzakhmetova
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia; Laboratory of the Biophysics of Synaptic Processes, Kazan Institute of Biochemistry and Biophysics, Russian Academy of Sciences, P.O. Box 30, Kazan, 420111, Russia
- Dmitry Samigullin
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia; Laboratory of the Biophysics of Synaptic Processes, Kazan Institute of Biochemistry and Biophysics, Russian Academy of Sciences, P.O. Box 30, Kazan, 420111, Russia; Kazan National Research Technical University named after A. N. Tupolev, K. Marx St. 10, Kazan, 420111, Russia
- Andrey Skorinkin
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia; Laboratory of the Biophysics of Synaptic Processes, Kazan Institute of Biochemistry and Biophysics, Russian Academy of Sciences, P.O. Box 30, Kazan, 420111, Russia
- Irina Kovyazina
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia; Laboratory of the Biophysics of Synaptic Processes, Kazan Institute of Biochemistry and Biophysics, Russian Academy of Sciences, P.O. Box 30, Kazan, 420111, Russia
- Eugeny Nikolsky
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia; Laboratory of the Biophysics of Synaptic Processes, Kazan Institute of Biochemistry and Biophysics, Russian Academy of Sciences, P.O. Box 30, Kazan, 420111, Russia; Kazan State Medical University, Butlerov St. 49, Kazan, 420012, Russia
- Ellya Bukharaeva
- Kazan Federal University, Kremlevskaya St. 18, Kazan, 420008, Russia; Laboratory of the Biophysics of Synaptic Processes, Kazan Institute of Biochemistry and Biophysics, Russian Academy of Sciences, P.O. Box 30, Kazan, 420111, Russia
15
Strength and duration of perisomatic GABAergic inhibition depend on distance between synaptically connected cells. Proc Natl Acad Sci U S A 2015; 112:1220-5. [PMID: 25583495 DOI: 10.1073/pnas.1412996112] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.0] [Indexed: 01/21/2023] Open
Abstract
GABAergic perisoma-inhibiting fast-spiking interneurons (PIIs) effectively control the activity of large neuron populations through their wide axonal arborizations. It is generally assumed that the output of a PII onto its target cells is strong and rapid. Here, we show that, unexpectedly, both the strength and the time course of PII-mediated perisomatic inhibition change with the distance between synaptically connected partners in the rodent hippocampus. Synaptic signals become weaker with distance, owing to lower contact numbers, and decay more slowly, very likely reflecting changes in GABAA receptor subunit composition. When distance-dependent synaptic inhibition is introduced into a rhythmically active neuronal network model, randomly driven principal cell assemblies are strongly synchronized by the PIIs, leading to higher precision in principal cell spike times than in a network with uniform synaptic inhibition.
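The distance dependence reported here can be injected into a network model through a simple parameterization: amplitude falls and the decay time constant grows with intersomatic distance. The functional forms and constants below are illustrative stand-ins, not the paper's fitted values.

```python
import math

def distance_dependent_ipsc(distance_um,
                            a0=1.0, lam=200.0,
                            tau0=4.0, tau_slope=0.01):
    """Illustrative distance-dependent IPSC parameters: amplitude decays
    exponentially with intersomatic distance (length constant lam, in um)
    while the decay time constant (ms) grows linearly."""
    amplitude = a0 * math.exp(-distance_um / lam)
    tau_decay = tau0 + tau_slope * distance_um
    return amplitude, tau_decay

near = distance_dependent_ipsc(50.0)    # proximal target: strong, fast
far = distance_dependent_ipsc(300.0)    # distal target: weak, slow
```

Swapping such a rule in for uniform synaptic parameters is all that is needed to reproduce the qualitative comparison the abstract describes between distance-dependent and uniform inhibition.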
16
Rothman JS, Silver RA. Data-driven modeling of synaptic transmission and integration. Prog Mol Biol Transl Sci 2014; 123:305-50. [PMID: 24560150 PMCID: PMC4748401 DOI: 10.1016/b978-0-12-397897-4.00004-8] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.6] [Indexed: 02/07/2023]
Abstract
In this chapter, we describe how to create mathematical models of synaptic transmission and integration. We start with a brief synopsis of the experimental evidence underlying our current understanding of synaptic transmission. We then describe synaptic transmission at a particular glutamatergic synapse in the mammalian cerebellum, the mossy fiber to granule cell synapse, since data from this well-characterized synapse can provide a benchmark comparison for how well synaptic properties are captured by different mathematical models. This chapter is structured by first presenting the simplest mathematical description of an average synaptic conductance waveform and then introducing methods for incorporating more complex synaptic properties such as nonlinear voltage dependence of ionotropic receptors, short-term plasticity, and stochastic fluctuations. We restrict our focus to excitatory synaptic transmission, but most of the modeling approaches discussed here can be equally applied to inhibitory synapses. Our data-driven approach will be of interest to those wishing to model synaptic transmission and network behavior in health and disease.
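The simplest waveform description the chapter starts from, an average synaptic conductance, is commonly written as a difference of exponentials; the sketch below uses illustrative time constants and normalizes the waveform so its peak equals `g_max`.

```python
import numpy as np

def syn_conductance(t, g_max=1.0, tau_rise=0.2, tau_decay=2.0):
    """Average synaptic conductance waveform as a difference of
    exponentials, normalized so the peak equals g_max. Times in ms;
    the time constants here are illustrative, not fitted values."""
    tr, td = tau_rise, tau_decay
    t = np.asarray(t, dtype=float)
    g = np.where(t >= 0, np.exp(-t / td) - np.exp(-t / tr), 0.0)
    # Analytic peak time of the difference of exponentials,
    # used to normalize the amplitude to g_max.
    t_peak = (tr * td / (td - tr)) * np.log(td / tr)
    norm = np.exp(-t_peak / td) - np.exp(-t_peak / tr)
    return g_max * g / norm

t = np.linspace(0.0, 10.0, 1001)
g = syn_conductance(t)
```

From this starting point, the chapter's more complex ingredients (voltage-dependent receptor nonlinearity, short-term plasticity, stochastic fluctuations) can be layered on by modulating `g_max` and the time constants per event.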
Affiliation(s)
- Jason S Rothman
- Department of Neuroscience, Physiology & Pharmacology, University College London, London, UK
- R Angus Silver
- Department of Neuroscience, Physiology & Pharmacology, University College London, London, UK