1
Guo L, Zhang S, Wu Y, Xu G. Complex spiking neural networks with synaptic time-delay based on anti-interference function. Cogn Neurodyn 2022; 16:1485-1503. PMID: 36408076; PMCID: PMC9666611; DOI: 10.1007/s11571-022-09803-4.
Abstract
Research on brain-like models with bio-interpretability is conducive to promoting their information processing ability in the field of artificial intelligence. Biological results show that synaptic time-delay can improve the information processing abilities of the nervous system and is an important factor in the formation of brain cognitive functions. However, the synaptic plasticity with time-delay of brain-like models still lacks bio-interpretability. In this study, combining excitatory and inhibitory synapses, we construct complex spiking neural networks (CSNNs) with synaptic time-delay that better conform to biological characteristics: the topology has scale-free and small-world properties, and the nodes are represented by the Izhikevich neuron model. The information processing abilities of CSNNs with different types of synaptic time-delay are then comparatively evaluated based on the anti-interference function, and the mechanism of this function is discussed. Using two indicators of the anti-interference function and three kinds of noise, our simulation results consistently verify the following: (i) In terms of anti-interference function, a CSNN with random synaptic time-delay outperforms a CSNN with fixed synaptic time-delay, which in turn outperforms a CSNN with no synaptic time-delay. This implies that brain-like networks with more bio-interpretable synaptic time-delay have stronger information processing abilities. (ii) Synaptic plasticity is the intrinsic factor behind the anti-interference function of CSNNs with different types of synaptic time-delay. (iii) Random synaptic time-delay gives a CSNN better topological characteristics, which can improve the information processing ability of a brain-like network; this implies that synaptic time-delay affects the anti-interference function at the level of performance.
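The Izhikevich model used as the network node above is compact enough to sketch. A minimal single-neuron simulation, assuming the standard regular-spiking parameters (a=0.02, b=0.2, c=-65, d=8) from Izhikevich's 2003 formulation rather than the paper's exact settings:

```python
def izhikevich_spikes(current, t_total=200.0, dt=0.5,
                      a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one Izhikevich neuron under a constant input current and
    return its spike times (ms). Defaults give the regular-spiking regime."""
    v = -65.0            # membrane potential (mV)
    u = b * v            # recovery variable
    spikes, t = [], 0.0
    while t < t_total:
        # two half-steps of the voltage equation improve stability at dt = 0.5 ms
        v += 0.5 * dt * (0.04 * v * v + 5.0 * v + 140.0 - u + current)
        v += 0.5 * dt * (0.04 * v * v + 5.0 * v + 140.0 - u + current)
        u += dt * a * (b * v - u)
        if v >= 30.0:    # spike cutoff: record, then reset v and bump u
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes
```

With a suprathreshold drive (e.g. `izhikevich_spikes(10.0)`) the neuron fires tonically; with zero input it relaxes to its resting fixed point and stays silent.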
Affiliation(s)
- Lei Guo
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, Hebei University of Technology, Tianjin, 300130 China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, 300130 China
- Sijia Zhang
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, Hebei University of Technology, Tianjin, 300130 China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, 300130 China
- Youxi Wu
- School of Artificial Intelligence, Hebei University of Technology, Tianjin, 300130 China
- Guizhi Xu
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, Hebei University of Technology, Tianjin, 300130 China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, 300130 China
2
Peng X, Lin W. Complex Dynamics of Noise-Perturbed Excitatory-Inhibitory Neural Networks With Intra-Correlative and Inter-Independent Connections. Front Physiol 2022; 13:915511. PMID: 35812336; PMCID: PMC9263264; DOI: 10.3389/fphys.2022.915511.
Abstract
A real neural system usually contains two types of neurons: excitatory and inhibitory. Analytical and numerical interpretation of the dynamics induced by different types of interactions among neurons of the two types is beneficial to understanding the physiological functions of the brain. Here, we articulate a model of noise-perturbed random neural networks containing both excitatory and inhibitory (E&I) populations. In particular, the neurons in the two populations are intra-correlatively and inter-independently connected, which differs from most existing E&I models, which consider only independently connected neurons. Employing the typical mean-field theory, we obtain an equivalent two-dimensional system with a stationary Gaussian process as input. Investigating the stationary autocorrelation functions of the obtained system, we analytically find the parameter conditions under which synchronized behaviors between the two populations emerge. Taking the maximal Lyapunov exponent as an index, we also find different critical values of the coupling strength coefficients for the chaotic excitatory neurons and for the chaotic inhibitory ones. Interestingly, we reveal that noise is able to suppress chaotic dynamics of random neural networks with neurons in two populations, while an appropriate correlation coefficient in the intra-coupling strengths can enhance the occurrence of chaos. Finally, we also detect a previously reported phenomenon in which a parameter region corresponds to neither linearly stable nor chaotic dynamics; however, the size of this region depends crucially on the populations' parameters.
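The maximal-Lyapunov-exponent index used above can be illustrated numerically. The sketch below is a deliberate simplification, not the paper's E&I model: it estimates the exponent for a classical random rate network dx/dt = -x + J tanh(x) via the two-trajectory (Benettin-style) method, recovering the well-known transition to chaos as the coupling gain g crosses 1:

```python
import numpy as np

def max_lyapunov(g, n=100, t_sim=200.0, dt=0.1, d0=1e-8, seed=0):
    """Benettin-style estimate of the maximal Lyapunov exponent of the
    random rate network  dx/dt = -x + J tanh(x),  with J_ij ~ N(0, g^2/n)."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(n), size=(n, n))
    x = rng.normal(0.0, 1.0, size=n)
    y = x + d0 * rng.normal(0.0, 1.0, size=n) / np.sqrt(n)

    def step(z):
        return z + dt * (-z + J @ np.tanh(z))   # forward-Euler step

    # discard a transient, renormalising the separation as we go
    for _ in range(int(50.0 / dt)):
        x, y = step(x), step(y)
        y = x + (y - x) * (d0 / np.linalg.norm(y - x))
    # accumulate the average exponential growth rate of the separation
    log_growth, steps = 0.0, int(t_sim / dt)
    for _ in range(steps):
        x, y = step(x), step(y)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (y - x) * (d0 / d)
    return log_growth / (steps * dt)
```

For g well below 1 the estimate is negative (trajectories contract to the fixed point); for g well above 1 it is positive, the signature of chaos that the noise in the paper's model is shown to suppress.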
Affiliation(s)
- Xiaoxiao Peng
- Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
- Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- Wei Lin
- Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
- Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- State Key Laboratory of Medical Neurobiology, MOE Frontiers Center for Brain Science, and Institutes of Brain Science, Fudan University, Shanghai, China
3
Introducing principles of synaptic integration in the optimization of deep neural networks. Nat Commun 2022; 13:1885. PMID: 35393422; PMCID: PMC8989917; DOI: 10.1038/s41467-022-29491-2.
Abstract
Plasticity circuits in the brain are known to be influenced by the distribution of the synaptic weights through the mechanisms of synaptic integration and local regulation of synaptic strength. However, the complex interplay of stimulation-dependent plasticity with local learning signals is disregarded by most of the artificial neural network training algorithms devised so far. Here, we propose a novel biologically inspired optimizer for artificial and spiking neural networks that incorporates key principles of synaptic plasticity observed in cortical dendrites: GRAPES (Group Responsibility for Adjusting the Propagation of Error Signals). GRAPES implements a weight-distribution-dependent modulation of the error signal at each node of the network. We show that this biologically inspired mechanism leads to a substantial improvement of the performance of artificial and spiking networks with feedforward, convolutional, and recurrent architectures; it mitigates catastrophic forgetting; and it is optimally suited for dedicated hardware implementations. Overall, our work indicates that reconciling neurophysiology insights with machine intelligence is key to boosting the performance of neural networks. Tasks involving continual learning and adaptation to real-time scenarios remain challenging for artificial neural networks, in contrast to the real brain. The authors propose here a brain-inspired optimizer based on mechanisms of synaptic integration and strength regulation for improved performance of both artificial and spiking neural networks.
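The published GRAPES rule is not reproduced here, but the general idea the abstract describes, modulating each node's error signal by a factor derived from the weight distribution, can be sketched schematically. The "responsibility" measure below (row-wise incoming weight magnitude relative to the layer mean) is an illustrative assumption, not the paper's definition:

```python
def responsibility_factors(weight_rows):
    """Schematic per-node 'responsibility': total incoming weight magnitude
    of each node, normalised by the layer mean (illustrative definition)."""
    strengths = [sum(abs(w) for w in row) for row in weight_rows]
    mean_strength = sum(strengths) / len(strengths)
    return [s / mean_strength for s in strengths]

def modulate_error(delta, weight_rows):
    """Scale each node's backpropagated error signal by its factor."""
    return [d * f for d, f in zip(delta, responsibility_factors(weight_rows))]

# Two output nodes: the second carries 3x the weight magnitude of the first,
# so its error signal is amplified while the first's is attenuated.
example = modulate_error([1.0, 1.0], [[1.0, 1.0], [3.0, 3.0]])
```

The point of the sketch is only the shape of the mechanism: the error vector is rescaled node-by-node as a function of the weight distribution before being propagated further.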
4
Roy A, Narayanan R. Spatial information transfer in hippocampal place cells depends on trial-to-trial variability, symmetry of place-field firing, and biophysical heterogeneities. Neural Netw 2021; 142:636-660. PMID: 34399375; PMCID: PMC7611579; DOI: 10.1016/j.neunet.2021.07.026.
Abstract
The relationship between the feature-tuning curve and information transfer profile of individual neurons provides vital insights about neural encoding. However, the relationship between the spatial tuning curve and spatial information transfer of hippocampal place cells remains unexplored. Here, employing a stochastic search procedure spanning thousands of models, we arrived at 127 conductance-based place-cell models that exhibited signature electrophysiological characteristics and sharp spatial tuning, with parametric values that exhibited neither clustering nor strong pairwise correlations. We introduced trial-to-trial variability in responses and computed model tuning curves and information transfer profiles, using stimulus-specific (SSI) and mutual (MI) information metrics, across locations within the place field. We found spatial information transfer to be heterogeneous across models, but to reduce consistently with increasing levels of variability. Importantly, whereas reliable low-variability responses implied that maximal information transfer occurred at high-slope regions of the tuning curve, increase in variability resulted in maximal transfer occurring at the peak-firing location in a subset of models. Moreover, experience-dependent asymmetry in place-field firing introduced asymmetries in the information transfer computed through MI, but not SSI, and the impact of activity-dependent variability on information transfer was minimal compared to activity-independent variability. We unveiled ion-channel degeneracy in the regulation of spatial information transfer, and demonstrated critical roles for N-methyl-d-aspartate receptors, transient potassium and dendritic sodium channels in regulating information transfer. Our results demonstrate that trial-to-trial variability, tuning-curve shape and biological heterogeneities critically regulate the relationship between the spatial tuning curve and spatial information transfer in hippocampal place cells.
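The mutual-information side of such an analysis can be illustrated with a toy calculation, deliberately far simpler than the paper's conductance-based models: the MI between a uniformly distributed location and a Poisson spike count, for a given tuning curve of mean rates. The 11-location Gaussian tuning curve below is an illustrative assumption:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def mutual_information(rates, k_max=50):
    """Mutual information (bits) between a uniformly distributed stimulus
    (one entry of `rates` per location) and a Poisson spike count whose
    mean is the tuning-curve value at that location."""
    n = len(rates)
    # marginal response distribution p(r)
    p_r = [sum(poisson_pmf(k, lam) for lam in rates) / n for k in range(k_max)]
    mi = 0.0
    for lam in rates:                       # p(s) = 1/n for every location
        for k in range(k_max):
            p_rs = poisson_pmf(k, lam)      # p(r | s)
            if p_rs > 0.0 and p_r[k] > 0.0:
                mi += (p_rs / n) * math.log2(p_rs / p_r[k])
    return mi

# sharp Gaussian tuning curve vs. a flat one over 11 locations
tuned_rates = [0.5 + 9.5 * math.exp(-0.5 * (s - 5.0) ** 2) for s in range(11)]
flat_rates = [5.0] * 11
```

A sharply tuned curve transmits substantial information about location, while a flat curve transmits none: the count distribution is then identical at every location.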
Affiliation(s)
- Ankit Roy
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India; Undergraduate program, Indian Institute of Science, Bangalore, India
- Rishikesh Narayanan
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India.
5
Ma H, Jia B, Li Y, Gu H. Excitability and Threshold Mechanism for Enhanced Neuronal Response Induced by Inhibition Preceding Excitation. Neural Plast 2021; 2021:6692411. PMID: 33531892; PMCID: PMC7837794; DOI: 10.1155/2021/6692411.
Abstract
Postinhibitory facilitation (PIF) of neural firing is a paradoxical phenomenon in which an inhibitory effect enhances rather than reduces firing activity; it plays important roles in sound localization in the auditory nervous system and has long awaited theoretical explanation. In the present paper, an excitability and threshold mechanism for the PIF phenomenon is presented in the Morris-Lecar model with type I, II, and III excitabilities. First, compared with purely excitatory stimulation applied at the steady state, pairing each excitatory stimulus with a preceding inhibitory stimulus increases the firing rate for type II and III excitabilities, but not for type I, when the interval between the inhibitory and excitatory stimulation within each pair is suitable. Second, the threshold mechanism for the PIF phenomenon is identified. For type II and III excitabilities, the inhibitory stimulation induces subthreshold oscillations around the steady state. During the middle and ending phases of the ascending part and the beginning phase of the descending part of each period of the subthreshold oscillations, the threshold for an excitatory stimulus to evoke an action potential becomes lower, which is the cause of the PIF phenomenon. Finally, a theoretical estimate of the range of intervals between the inhibitory and excitatory stimulation that produces the PIF phenomenon is obtained; it approximates half the intrinsic period of the subthreshold oscillations for relatively strong stimulation and becomes narrower for relatively weak stimulation. The interval for the PIF phenomenon is much shorter for type III excitability, due to the shorter period of its subthreshold oscillations, which is closer to experimental observations. These results present an excitability and threshold mechanism that provides a comprehensive and deep explanation of the PIF phenomenon.
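The Morris-Lecar model underlying this analysis is compact enough to sketch. The code below uses the standard Rinzel-Ermentrout type II parameter set (an assumption; the paper's exact values may differ) and shows the type II signature of sustained oscillations under suprathreshold current versus rest at zero current; it does not reproduce the full paired-pulse PIF protocol:

```python
import math

# Type II Morris-Lecar parameters (standard Rinzel-Ermentrout set);
# units: mV, ms, uF/cm^2, mS/cm^2, uA/cm^2
C, G_CA, G_K, G_L = 20.0, 4.4, 8.0, 2.0
V_CA, V_K, V_L = 120.0, -84.0, -60.0
V1, V2, V3, V4, PHI = -1.2, 18.0, 2.0, 30.0, 0.04

def morris_lecar(i_ext, t_total=500.0, dt=0.05):
    """Forward-Euler integration under constant current; returns the voltage trace."""
    v = -60.0
    w = 0.5 * (1.0 + math.tanh((v - V3) / V4))     # start w at w_inf(v)
    trace = []
    for _ in range(int(t_total / dt)):
        m_inf = 0.5 * (1.0 + math.tanh((v - V1) / V2))
        w_inf = 0.5 * (1.0 + math.tanh((v - V3) / V4))
        tau_w = 1.0 / math.cosh((v - V3) / (2.0 * V4))
        dv = (i_ext - G_L * (v - V_L) - G_CA * m_inf * (v - V_CA)
              - G_K * w * (v - V_K)) / C
        dw = PHI * (w_inf - w) / tau_w
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace

def steady_swing(trace):
    """Peak-to-peak voltage excursion over the second half of the trace."""
    tail = trace[len(trace) // 2:]
    return max(tail) - min(tail)
```

With this parameter set, a constant current inside the oscillatory range (e.g. 120 uA/cm^2) yields a large sustained voltage excursion, while zero current leaves the neuron near its resting potential.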
Affiliation(s)
- Hanqing Ma
- School of Aerospace Engineering and Applied Mechanics, Tongji University, Shanghai 200092, China
- Bing Jia
- School of Aerospace Engineering and Applied Mechanics, Tongji University, Shanghai 200092, China
- Yuye Li
- College of Mathematics and Computer Science, Chifeng University, Chifeng 024000, China
- Huaguang Gu
- School of Aerospace Engineering and Applied Mechanics, Tongji University, Shanghai 200092, China
6
Guo L, Kan E, Wu Y, Lv H, Xu G. Noise suppression ability and its mechanism analysis of scale-free spiking neural network under white Gaussian noise. PLoS One 2021; 15:e0244683. PMID: 33382788; PMCID: PMC7774963; DOI: 10.1371/journal.pone.0244683.
Abstract
With the continuous improvement of automation and informatization, the electromagnetic environment has become increasingly complex, and traditional protection methods for electronic systems face serious challenges. The biological nervous system has self-adaptive advantages under neural regulation, and it is worth exploring a new approach to electromagnetic protection that draws on this self-adaptive advantage. In this study, a scale-free spiking neural network (SFSNN) is constructed, in which the Izhikevich neuron model serves as the nodes and a synaptic plasticity model including excitatory and inhibitory synapses serves as the edges. Under white Gaussian noise, the noise suppression abilities of SFSNNs with a high average clustering coefficient (ACC) and SFSNNs with a low ACC are studied comparatively, and the noise suppression mechanism of the SFSNN is explored. The experimental results demonstrate the following: (1) The SFSNN has a certain degree of noise suppression ability, and SFSNNs with a high ACC have better noise suppression performance than those with a low ACC. (2) The neural information processing of the SFSNN is the linkage effect of dynamic changes in neuron firing, synaptic weights, and topological characteristics. (3) Synaptic plasticity is the intrinsic factor of the noise suppression ability of the SFSNN.
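The average clustering coefficient (ACC) that separates the two SFSNN groups can be computed directly from an adjacency structure; a minimal sketch for undirected graphs:

```python
def average_clustering(adj):
    """Average clustering coefficient (ACC) of an undirected graph given as
    a dict mapping each node to the set of its neighbours."""
    total = 0.0
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue                      # degree-0/1 nodes contribute 0
        # count edges among the node's neighbours (each pair once)
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
with_pendant = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
```

A triangle has ACC = 1; attaching a pendant node to it lowers the ACC to 7/12, since the hub's neighbourhood is no longer fully connected and the pendant contributes zero.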
Affiliation(s)
- Lei Guo
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China
- Enyu Kan
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China
- Youxi Wu
- School of Artificial Intelligence, Hebei University of Technology, Tianjin, China
- Huan Lv
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China
- Guizhi Xu
- State Key Laboratory of Reliability and Intelligence of Electrical Equipment, School of Electrical Engineering, Hebei University of Technology, Tianjin, China
- Hebei Key Laboratory of Bioelectromagnetics and Neuroengineering, Hebei University of Technology, Tianjin, China
7
Pattern Recognition of Spiking Neural Networks Based on Visual Mechanism and Supervised Synaptic Learning. Neural Plast 2020; 2020:8851351. PMID: 33193755; PMCID: PMC7641668; DOI: 10.1155/2020/8851351.
Abstract
Electrophysiological studies have shown that neurons in the mammalian primary visual cortex are selective for the orientations of visual stimuli. Inspired by this mechanism, we propose a hierarchical spiking neural network (SNN) for image classification. Grayscale input images are fed through a feed-forward network of orientation-selective neurons, which then projects to a layer of downstream classifier neurons through the spike-based supervised tempotron learning rule. Based on the orientation-selective mechanism of the visual cortex and the tempotron learning rule, the network can effectively classify images from the extensively studied MNIST database of handwritten digits, achieving 96% classification accuracy from only 2000 training samples (the standard training set contains 60,000). Compared with other classification methods, our model not only guarantees biological plausibility and accurate image classification but also significantly reduces the number of training samples needed. Considering that the most commonly used deep learning networks need large training sets and high power consumption for image recognition, this brain-inspired computational model, based on the layer-by-layer hierarchical image processing mechanism of the visual cortex, may provide a basis for the wide application of spiking neural networks in intelligent computing.
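The tempotron rule updates weights only when the maximal membrane potential falls on the wrong side of the firing threshold, adjusting each weight in proportion to the PSP its spikes contribute at that time. A minimal two-input sketch (the kernel time constants, learning rate, and toy patterns are illustrative assumptions, not the paper's settings):

```python
import math

TAU, TAU_S = 15.0, 3.75   # membrane and synaptic time constants (ms)

def _kernel(t):
    return math.exp(-t / TAU) - math.exp(-t / TAU_S) if t > 0 else 0.0

# normalise the postsynaptic-potential kernel so it peaks at 1
_T_PEAK = TAU * TAU_S / (TAU - TAU_S) * math.log(TAU / TAU_S)
_K0 = _kernel(_T_PEAK)

def psp(t):
    return _kernel(t) / _K0

def voltage(w, pattern, t):
    """Membrane potential: weighted sum of PSP kernels over all input spikes."""
    return sum(w[i] * psp(t - s) for i, spikes in enumerate(pattern) for s in spikes)

def classify(w, pattern, theta=1.0, t_total=100.0, dt=0.5):
    grid = [dt * k for k in range(int(t_total / dt))]
    return max(voltage(w, pattern, t) for t in grid) >= theta

def train(patterns, targets, w, lam=0.2, theta=1.0, epochs=50, t_total=100.0, dt=0.5):
    grid = [dt * k for k in range(int(t_total / dt))]
    for _ in range(epochs):
        for pattern, target in zip(patterns, targets):
            t_max = max(grid, key=lambda t: voltage(w, pattern, t))
            if (voltage(w, pattern, t_max) >= theta) == target:
                continue  # correct on this pattern, no update
            sign = 1.0 if target else -1.0
            for i, spikes in enumerate(pattern):
                w[i] += sign * lam * sum(psp(t_max - s) for s in spikes)
    return w

# Two inputs, two patterns: A (input 0 spikes at 10 ms) should fire,
# B (input 1 spikes at 10 ms) should stay silent.
patterns = [[[10.0], []], [[], [10.0]]]
targets = [True, False]
weights = train(patterns, targets, [0.1, 0.1])
```

After training, the weight feeding pattern A has grown past threshold while the weight feeding pattern B is untouched, so the classifier fires on A and stays silent on B.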
8
Di Maio V, Santillo S, Ventriglia F. Synaptic dendritic activity modulates the single synaptic event. Cogn Neurodyn 2020; 15:279-297. PMID: 33854645; DOI: 10.1007/s11571-020-09607-4.
Abstract
Synaptic transmission is the key system for information transfer and elaboration among neurons. Nevertheless, a synapse is not a stand-alone structure but part of a population of synapses delivering inputs from several neurons to a specific area of the dendritic tree of a single neuron. This population consists of excitatory and inhibitory synapses whose inputs drive the postsynaptic membrane potential in the depolarizing (excitatory synapses) or hyperpolarizing (inhibitory synapses) direction, modulating the postsynaptic membrane potential accordingly. The postsynaptic response of a single synapse depends on several biophysical factors, the most important of which is the value of the membrane potential at which the response occurs. The concurrence, within a specific time window, of inputs from several synapses located in a specific area of the dendritic tree can consequently modulate the membrane potential so as to strongly influence the single postsynaptic response. The degree of modulation exerted by the synaptic population depends on the number of active synapses, the relative proportion of excitatory and inhibitory synapses in the population, and their mean firing frequencies. In the present paper we show results obtained by simulating the activity of a single glutamatergic excitatory synapse under the influence of two populations composed of the same proportion of excitatory and inhibitory synapses but of two different sizes (total numbers of synapses). The most relevant conclusion of the present simulations is that the information transferred by the single synapse is not an independent, simple transition between a pre- and a postsynaptic neuron but the result of the cooperation of all the synapses that concurrently try to transfer information to the postsynaptic neuron in a given time window. This cooperativeness operates mainly through a simple mechanism of modulation of the postsynaptic membrane potential, which influences the amplitudes of the different components forming the postsynaptic excitatory response.
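The dependence of the single-synapse response on the membrane potential comes down to the driving force (E_rev - V_m); a one-line sketch with illustrative numbers:

```python
def synaptic_current(g_syn, v_m, e_rev):
    """Instantaneous synaptic current: conductance times driving force (e_rev - v_m)."""
    return g_syn * (e_rev - v_m)

# An AMPA-like excitatory synapse (reversal potential ~0 mV) with unit
# conductance: concurrent dendritic activity that depolarises the membrane
# from -70 mV to -50 mV shrinks the single-event driving force by ~29%.
i_at_rest = synaptic_current(1.0, -70.0, 0.0)        # driving force 70 mV
i_depolarised = synaptic_current(1.0, -50.0, 0.0)    # driving force 50 mV
```

This is the elementary mechanism by which the background synaptic population modulates the amplitude of each single excitatory event in the simulations above.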
Affiliation(s)
- Vito Di Maio
- Institute of Applied Science and Intelligent Systems (ISASI) of CNR, Pozzuoli, Italy
- Silvia Santillo
- Institute of Applied Science and Intelligent Systems (ISASI) of CNR, Pozzuoli, Italy
- Francesco Ventriglia
- Institute of Applied Science and Intelligent Systems (ISASI) of CNR, Pozzuoli, Italy