1
Prathapan V, Eipert P, Wigger N, Kipp M, Appali R, Schmitt O. Modeling and simulation for prediction of multiple sclerosis progression. Comput Biol Med 2024; 175:108416. [PMID: 38657465] [DOI: 10.1016/j.compbiomed.2024.108416]
Abstract
In light of the extensive work that has produced a wide range of techniques for predicting the course of multiple sclerosis (MS), this paper provides an overview of these approaches and puts forth an alternative way to predict disease progression. To this end, existing methods for estimating and predicting the course of the disease are categorized into clinical, radiological, biological, and computational or artificial-intelligence-based markers. Weighing the strengths and weaknesses of these prognostic groups exposes a gap: a sound method that works directly at the level of diseased connectivity is still lacking. We therefore propose combining computational models with established connectomes as a predictive tool for MS disease trajectories. From an examination of these studies, the fundamental conduction-based Hodgkin-Huxley (HH) model emerged as promising. Its advantage is that certain properties of connectomes, such as neuronal connection weights, spatial distances, and adjustments of signal transmission rates, can be taken into account. It is precisely these properties that are particularly altered in MS and that have strong implications for the processing, transmission, and interaction of neuronal signaling patterns. The HH equations are used as a point-neuron model for signal propagation inside a small network. The objective is to change the conduction parameter of the neuron model, replicate the changes in myelin properties seen in MS, and observe the dynamics of signal propagation across the network. The model is first validated for different lengths, conduction values, and connection weights across three nodal connections. These individual factors are then incorporated into a small network and simulated to mimic the condition of MS.
The signal propagation pattern is observed after inducing changes in conduction parameters at certain nodes in the network and is compared against a control pattern obtained before the changes are applied. The propagation pattern varies as expected, adapting to the input conditions. Similarly, when the model is applied to a connectome, changes in the pattern could give insight into disease progression. This approach opens a new path to exploring disease progression in MS. The work is at a preliminary stage, with the future vision of applying the method to a connectome and providing a better clinical tool.
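The single-neuron building block described above can be sketched in a few lines. The following is a minimal single-compartment Hodgkin-Huxley simulation with the classic squid-axon parameters, in which weakening the sodium conductance serves as a crude stand-in for impaired conduction; the parameter values and the `g_na` reduction are illustrative assumptions, not values from the paper.

```python
import numpy as np

def hh_simulate(I_amp=10.0, g_na=120.0, g_k=36.0, g_l=0.3, t_max=50.0, dt=0.01):
    """Single-compartment Hodgkin-Huxley neuron (classic squid-axon parameters;
    units: mV, ms, mS/cm^2, uA/cm^2, C_m = 1 uF/cm^2). Returns the voltage trace."""
    e_na, e_k, e_l = 50.0, -77.0, -54.4
    def a_m(v): return 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    def b_m(v): return 4.0 * np.exp(-(v + 65.0) / 18.0)
    def a_h(v): return 0.07 * np.exp(-(v + 65.0) / 20.0)
    def b_h(v): return 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    def a_n(v): return 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    def b_n(v): return 0.125 * np.exp(-(v + 65.0) / 80.0)

    v = -65.0                                   # start at rest
    m = a_m(v) / (a_m(v) + b_m(v))              # gating variables at steady state
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))
    trace = []
    for _ in range(int(t_max / dt)):
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (I_amp - i_ion)               # forward-Euler membrane update
        m += dt * (a_m(v) * (1 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1 - n) - b_n(v) * n)
        trace.append(v)
    return np.array(trace)

def count_spikes(trace, thresh=0.0):
    """Count upward crossings of the threshold."""
    above = trace > thresh
    return int(np.sum(above[1:] & ~above[:-1]))

healthy = count_spikes(hh_simulate())
impaired = count_spikes(hh_simulate(g_na=30.0))  # crude conduction-impairment proxy
print("spikes healthy:", healthy, "impaired:", impaired)
```

Extending such point neurons into a network, as the paper proposes, amounts to feeding each neuron's voltage-dependent output as weighted, delayed input current to its targets.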
Affiliation(s)
- Vishnu Prathapan
- Medical School Hamburg University of Applied Sciences and Medical University, Am Kaiserkai 1, 20457 Hamburg, Germany
- Peter Eipert
- Medical School Hamburg University of Applied Sciences and Medical University, Am Kaiserkai 1, 20457 Hamburg, Germany
- Nicole Wigger
- Department of Anatomy, University of Rostock, Gertrudenstr. 9, 18057 Rostock, Germany
- Markus Kipp
- Department of Anatomy, University of Rostock, Gertrudenstr. 9, 18057 Rostock, Germany
- Revathi Appali
- Institute of General Electrical Engineering, University of Rostock, Albert-Einstein-Straße 2, 18059 Rostock, Germany; Department of Aging of Individuals and Society, Interdisciplinary Faculty, University of Rostock, Universitätsplatz 1, 18055 Rostock, Germany
- Oliver Schmitt
- Medical School Hamburg University of Applied Sciences and Medical University, Am Kaiserkai 1, 20457 Hamburg, Germany; Department of Anatomy, University of Rostock, Gertrudenstr. 9, 18057 Rostock, Germany
2
Lin A, Akafia C, Dal Monte O, Fan S, Fagan N, Putnam P, Tye KM, Chang S, Ba D, Allsop AZAS. An unbiased method to partition diverse neuronal responses into functional ensembles reveals interpretable population dynamics during innate social behavior. bioRxiv 2024:2024.05.08.593229. [PMID: 38766234] [PMCID: PMC11100741] [DOI: 10.1101/2024.05.08.593229]
Abstract
In neuroscience, understanding how single-neuron firing contributes to distributed neural ensembles is crucial. Traditional analyses have been limited to descriptions of whole-population activity or, when analyzing individual neurons, have used response-categorization criteria that varied significantly across experiments. Current methods lack scalability for large datasets, fail to capture temporal changes, and rely on parametric assumptions. There is a need for a robust, scalable, and non-parametric functional clustering approach that captures interpretable dynamics. To address this challenge, we developed a model-based statistical framework for unsupervised clustering of multiple time-series datasets that exhibit nonlinear dynamics into an a-priori-unknown number of parameterized ensembles called Functional Encoding Units (FEUs). The FEU approach outperforms existing techniques in accuracy and benchmark scores. Here, we apply this FEU formalism to single-unit recordings collected during social behaviors in rodents and primates and demonstrate its hypothesis-generating and -testing capacities. This novel pipeline serves as an analytic bridge, translating neural ensemble codes across model systems.
Affiliation(s)
- Alexander Lin
- School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts, USA
- Cyril Akafia
- Department of Psychiatry, Yale University, New Haven, Connecticut, USA
- Olga Dal Monte
- Department of Psychology, Yale University, New Haven, Connecticut, USA
- Siqi Fan
- Department of Psychology, Yale University, New Haven, Connecticut, USA
- Nicholas Fagan
- Department of Psychology, Yale University, New Haven, Connecticut, USA
- Philip Putnam
- Department of Psychology, Yale University, New Haven, Connecticut, USA
- Kay M. Tye
- Salk Institute for Biological Studies, La Jolla, California, USA
- Howard Hughes Medical Institute, La Jolla, California, USA
- Kavli Institute for the Brain and Mind, La Jolla, California, USA
- Steve Chang
- Department of Psychology, Yale University, New Haven, Connecticut, USA
- Demba Ba
- School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts, USA
- Center for Brain Sciences, Harvard University, Cambridge, Massachusetts, USA
- Kempner Institute for the Study of Artificial and Natural Intelligence, Harvard University, Cambridge, Massachusetts, USA
- AZA Stephen Allsop
- Center for Collective Healing, Department of Psychiatry and Behavioral Sciences, Howard University, Washington DC, USA
- Department of Psychiatry, Yale University, New Haven, Connecticut, USA
3
Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biol Cybern 2024; 118:7-19. [PMID: 38261004] [PMCID: PMC11068698] [DOI: 10.1007/s00422-023-00982-9]
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation when the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner (Phys. Rev. Lett., 2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive a still exact (although more complicated) fluctuation-response relation (FRR) for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and an outlook on open problems.
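The model class studied here can be illustrated with a minimal simulation: a leaky IF neuron with an absolute refractory period driven by white Gaussian noise (the finite spike shape is omitted). All dimensionless parameter values below are illustrative assumptions.

```python
import numpy as np

def lif_refractory_rate(mu=1.2, D=0.05, tau_ref=0.5, dt=1e-3, t_max=500.0, seed=0):
    """Leaky integrate-and-fire neuron with an absolute refractory period,
    integrated by Euler-Maruyama: dv = (mu - v) dt + sqrt(2 D) dW.
    On reaching v = 1 the unit spikes, resets to v = 0, and stays clamped
    for tau_ref time units. Returns the empirical firing rate."""
    rng = np.random.default_rng(seed)
    v, refr, spikes = 0.0, 0.0, 0
    noise_amp = np.sqrt(2.0 * D * dt)
    for _ in range(int(t_max / dt)):
        if refr > 0.0:
            refr -= dt                      # clamped during the refractory period
            continue
        v += (mu - v) * dt + noise_amp * rng.standard_normal()
        if v >= 1.0:
            spikes += 1
            v = 0.0
            refr = tau_ref
    return spikes / t_max

r_with = lif_refractory_rate(tau_ref=0.5)
r_without = lif_refractory_rate(tau_ref=0.0)
print(f"rate with refractory period: {r_with:.3f}, without: {r_without:.3f}")
```

Measuring how this rate responds to a weak periodic perturbation of `mu`, and comparing it with the spontaneous spike-train statistics, is the kind of relation the FRR makes exact.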
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
4
Richardson MJE. Linear and nonlinear integrate-and-fire neurons driven by synaptic shot noise with reversal potentials. Phys Rev E 2024; 109:024407. [PMID: 38491664] [DOI: 10.1103/physreve.109.024407]
Abstract
The steady-state firing rate and firing-rate response of leaky and exponential integrate-and-fire models receiving synaptic shot noise with excitatory and inhibitory reversal potentials are examined. For the particular case where the underlying synaptic conductances are exponentially distributed, it is shown that the master equation for a population of such model neurons can be reduced from an integrodifferential form to a more tractable set of three differential equations. The system is nevertheless more challenging analytically than for current-based synapses: analytical results are provided where possible, together with an efficient numerical scheme and code for the remaining quantities. The increased tractability of the framework supports an ongoing critical comparison between models in which synapses are treated with and without reversal potentials, as recently in the context of networks with balanced excitatory and inhibitory conductances.
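The shot-noise drive with reversal potentials can be sketched as follows: each presynaptic spike carries an exponentially distributed conductance jump g, and the membrane moves a fraction (1 - e^(-g)) toward the corresponding reversal potential, so jumps shrink as the voltage approaches E_e or E_i. The dimensionless parameter values and the per-step Bernoulli approximation of Poisson arrivals are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def shot_noise_lif_rate(rate_e=8.0, rate_i=2.0, c_e=0.25, c_i=0.25,
                        E_e=1.5, E_i=-0.5, v_th=1.0, v_re=0.0,
                        dt=1e-3, t_max=1000.0, seed=1):
    """Leaky IF neuron driven by excitatory and inhibitory Poisson shot noise
    with reversal potentials. Conductance jumps are exponentially distributed
    with means c_e, c_i; the voltage stays bounded between E_i and E_e.
    Returns the steady-state firing rate."""
    rng = np.random.default_rng(seed)
    v, spikes = v_re, 0
    for _ in range(int(t_max / dt)):
        v -= v * dt                                        # leak toward 0
        if rng.random() < rate_e * dt:                     # excitatory spike
            v += (1.0 - np.exp(-rng.exponential(c_e))) * (E_e - v)
        if rng.random() < rate_i * dt:                     # inhibitory spike
            v += (1.0 - np.exp(-rng.exponential(c_i))) * (E_i - v)
        if v >= v_th:                                      # fire and reset
            spikes += 1
            v = v_re
    return spikes / t_max

r_base = shot_noise_lif_rate()
r_noinh = shot_noise_lif_rate(rate_i=0.0)
print(f"rate with inhibition: {r_base:.3f}, without: {r_noinh:.3f}")
```

The state-dependent jump size is exactly what makes this system harder than current-based shot noise, where jumps would be voltage-independent.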
Affiliation(s)
- Magnus J E Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry CV4 7AL, United Kingdom
5
Munn BR, Müller EJ, Aru J, Whyte CJ, Gidon A, Larkum ME, Shine JM. A thalamocortical substrate for integrated information via critical synchronous bursting. Proc Natl Acad Sci U S A 2023; 120:e2308670120. [PMID: 37939085] [PMCID: PMC10655573] [DOI: 10.1073/pnas.2308670120]
Abstract
Understanding the neurobiological mechanisms underlying consciousness remains a significant challenge. Recent evidence suggests that the coupling between distal-apical and basal-somatic dendrites in thick-tufted layer 5 pyramidal neurons (L5PN), regulated by the nonspecific-projecting thalamus, is crucial for consciousness. Yet, it is uncertain whether this thalamocortical mechanism can support emergent signatures of consciousness, such as integrated information. To address this question, we constructed a biophysical network of dual-compartment thick-tufted L5PN, with dendrosomatic coupling controlled by thalamic inputs. Our findings demonstrate that integrated information is maximized when nonspecific thalamic inputs drive the system into a regime of time-varying synchronous bursting. Here, the system exhibits variable spiking dynamics with broad pairwise correlations, supporting the enhanced integrated information. Further, the observed peak in integrated information aligns with criticality signatures and empirically observed layer 5 pyramidal bursting rates. These results suggest that the thalamocortical core of the mammalian brain may be evolutionarily configured to optimize effective information processing, providing a potential neuronal mechanism that integrates microscale theories with macroscale signatures of consciousness.
Affiliation(s)
- Brandon R. Munn
- Brain and Mind Centre, School of Medical Sciences, Faculty of Medicine and Health, University of Sydney, Sydney 2050, Australia
- Complex Systems, School of Physics, Faculty of Science, University of Sydney, Sydney 2050, Australia
- Eli J. Müller
- Brain and Mind Centre, School of Medical Sciences, Faculty of Medicine and Health, University of Sydney, Sydney 2050, Australia
- Complex Systems, School of Physics, Faculty of Science, University of Sydney, Sydney 2050, Australia
- Jaan Aru
- Institute of Computer Science, University of Tartu, Tartu 51009, Estonia
- Christopher J. Whyte
- Brain and Mind Centre, School of Medical Sciences, Faculty of Medicine and Health, University of Sydney, Sydney 2050, Australia
- Complex Systems, School of Physics, Faculty of Science, University of Sydney, Sydney 2050, Australia
- Albert Gidon
- Institute of Biology, Humboldt University of Berlin, Berlin 10099, Germany
- NeuroCure Center of Excellence, Charité Universitätsmedizin Berlin, Berlin 10099, Germany
- Matthew E. Larkum
- Institute of Biology, Humboldt University of Berlin, Berlin 10099, Germany
- NeuroCure Center of Excellence, Charité Universitätsmedizin Berlin, Berlin 10099, Germany
- James M. Shine
- Brain and Mind Centre, School of Medical Sciences, Faculty of Medicine and Health, University of Sydney, Sydney 2050, Australia
- Complex Systems, School of Physics, Faculty of Science, University of Sydney, Sydney 2050, Australia
6
Park TJ, Deng S, Manna S, Islam ANMN, Yu H, Yuan Y, Fong DD, Chubykin AA, Sengupta A, Sankaranarayanan SKRS, Ramanathan S. Complex Oxides for Brain-Inspired Computing: A Review. Adv Mater 2023; 35:e2203352. [PMID: 35723973] [DOI: 10.1002/adma.202203352]
Abstract
The fields of brain-inspired computing, robotics, and, more broadly, artificial intelligence (AI) seek to implement knowledge gleaned from the natural world into human-designed electronics and machines. In this review, the opportunities presented by complex oxides, a class of electronic ceramic materials whose properties can be elegantly tuned by doping, electron interactions, and a variety of external stimuli near room temperature, are discussed. The review begins with a discussion of natural intelligence at the elementary level in the nervous system, followed by collective intelligence and learning at the animal-colony level mediated by social interactions. An important aspect highlighted is the vast spatial and temporal scales involved in learning and memory. The focus then turns to collective phenomena, such as metal-to-insulator transitions (MITs), ferroelectricity, and related examples, to highlight recent demonstrations of artificial neurons, synapses, and circuits and their learning. First-principles theoretical treatments of the electronic structure and in situ synchrotron spectroscopy of operating devices are then discussed. The implementation of the experimental characteristics into neural networks and algorithm design is then reviewed. Finally, outstanding materials challenges that require a microscopic understanding of the physical mechanisms, which will be essential for advancing the frontiers of neuromorphic computing, are highlighted.
Affiliation(s)
- Tae Joon Park
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Sunbin Deng
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Sukriti Manna
- Center for Nanoscale Materials, Argonne National Laboratory, Argonne, IL, 60439, USA
- A N M Nafiul Islam
- Department of Electrical Engineering, The Pennsylvania State University, University Park, PA, 16802, USA
- Haoming Yu
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Yifan Yuan
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
- Dillon D Fong
- Materials Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Alexander A Chubykin
- Department of Biological Sciences, Purdue Institute for Integrative Neuroscience, Purdue University, West Lafayette, IN, 47907, USA
- Abhronil Sengupta
- Department of Electrical Engineering, The Pennsylvania State University, University Park, PA, 16802, USA
- Subramanian K R S Sankaranarayanan
- Center for Nanoscale Materials, Argonne National Laboratory, Argonne, IL, 60439, USA
- Department of Mechanical and Industrial Engineering, University of Illinois Chicago, Chicago, IL, 60607, USA
- Shriram Ramanathan
- School of Materials Engineering, Purdue University, West Lafayette, IN, 47907, USA
7
Liu H, Qin Y, Chen HY, Wu J, Ma J, Du Z, Wang N, Zou J, Lin S, Zhang X, Zhang Y, Wang H. Artificial Neuronal Devices Based on Emerging Materials: Neuronal Dynamics and Applications. Adv Mater 2023; 35:e2205047. [PMID: 36609920] [DOI: 10.1002/adma.202205047]
Abstract
Artificial neuronal devices are critical building blocks of neuromorphic computing systems and currently the subject of intense research motivated by application needs from new computing technology and more realistic brain emulation. Researchers have proposed a range of device concepts that can mimic neuronal dynamics and functions. Although the switching physics and device structures of these artificial neurons are largely different, their behaviors can be described by several neuron models in a more unified manner. In this paper, the reports of artificial neuronal devices based on emerging volatile switching materials are reviewed from the perspective of the demonstrated neuron models, with a focus on the neuronal functions implemented in these devices and the exploitation of these functions for computational and sensing applications. Furthermore, the neuroscience inspirations and engineering methods to enrich the neuronal dynamics that remain to be implemented in artificial neuronal devices and networks toward realizing the full functionalities of biological neurons are discussed.
Affiliation(s)
- Hefei Liu
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, 90089, USA
- Yuan Qin
- Center for Power Electronics Systems, Bradley Department of Electrical and Computer Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA, 24060, USA
- Hung-Yu Chen
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, 90089, USA
- Jiangbin Wu
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, 90089, USA
- Jiahui Ma
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, 90089, USA
- Zhonghao Du
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, 90089, USA
- Nan Wang
- Mork Family Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, CA, 90089, USA
- Jingyi Zou
- Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, 15213, USA
- Sen Lin
- Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, 15213, USA
- Xu Zhang
- Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, 15213, USA
- Yuhao Zhang
- Center for Power Electronics Systems, Bradley Department of Electrical and Computer Engineering, Virginia Polytechnic Institute and State University, Blacksburg, VA, 24060, USA
- Han Wang
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, 90089, USA
- Mork Family Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, CA, 90089, USA
8
Ortone A, Vergani AA, Ahmadipour M, Mannella R, Mazzoni A. Dopamine depletion leads to pathological synchronization of distinct basal ganglia loops in the beta band. PLoS Comput Biol 2023; 19:e1010645. [PMID: 37104542] [PMCID: PMC10168586] [DOI: 10.1371/journal.pcbi.1010645]
Abstract
Motor symptoms of Parkinson's disease (PD) are associated with dopamine deficits and pathological oscillations of basal ganglia (BG) neurons in the β range (12-30 Hz). However, how dopamine depletion affects the oscillation dynamics of BG nuclei is still unclear. With a spiking neuron model, we here capture the features of BG nuclei interactions that lead to oscillations in the dopamine-depleted condition. We highlight that both the loop between the subthalamic nucleus (STN) and the globus pallidus pars externa (GPe) and the loop between striatal fast-spiking and medium spiny neurons and the GPe display resonances in the β range, and that they synchronize to a common β frequency through interaction. Crucially, the synchronization depends on dopamine depletion: the two loops are largely independent at high levels of dopamine but progressively synchronize as dopamine is depleted, owing to the increased strength of the striatal loop. The model is validated against recent experimental reports on the role of cortical inputs and of STN and GPe activity in the generation of β oscillations. Our results highlight the role of the interplay between the GPe-STN and GPe-striatum loops in generating sustained β oscillations in PD subjects and explain how this interplay depends on the level of dopamine. This paves the way for the design of therapies specifically addressing the onset of pathological β oscillations.
Affiliation(s)
- Andrea Ortone
- Dipartimento di Fisica, Università di Pisa, Pisa, Italy
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics and AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Alberto Arturo Vergani
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics and AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Mahboubeh Ahmadipour
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics and AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Alberto Mazzoni
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics and AI, Scuola Superiore Sant’Anna, Pisa, Italy
9
Ramlow L, Falcke M, Lindner B. An integrate-and-fire approach to Ca2+ signaling. Part I: Renewal model. Biophys J 2023; 122:713-736. [PMID: 36635961] [PMCID: PMC9989887] [DOI: 10.1016/j.bpj.2023.01.007]
Abstract
In computational neuroscience, integrate-and-fire models capture spike generation by a subthreshold dynamics supplemented with a simple fire-and-reset rule; they allow for a numerically efficient and analytically tractable description of stochastic single-cell as well as network dynamics. Stochastic spiking is also a prominent feature of Ca2+ signaling, which suggests adopting the integrate-and-fire approach for this fundamental biophysical process. The model introduced here consists of two components describing 1) the activity of clusters of inositol trisphosphate receptor channels and 2) the dynamics of the global Ca2+ concentration in the cytosol. The cluster dynamics is given in terms of a cyclic Markov chain, capturing the puff, i.e., the punctuated release of Ca2+ from intracellular stores. The cytosolic Ca2+ concentration is described by an integrate-and-fire dynamics driven by the puff current. For the cyclic Markov chain we derive expressions for the statistics of the interpuff interval, the single-puff strength, and the puff current under the assumption of constant cytosolic Ca2+. This condition is often well approximated because cytosolic Ca2+ varies much more slowly than the cluster activity does. Furthermore, because the detailed two-component model is numerically expensive to simulate and difficult to treat analytically, we develop an analytical framework that approximates the driving puff current of the stochastic cytosolic Ca2+ dynamics by a temporally uncorrelated Gaussian noise. This approximation reduces our two-component system to an integrate-and-fire model with a nonlinear drift function and multiplicative Gaussian white noise, a model that is known to generate a renewal spike train, i.e., a point process with statistically independent interspike intervals.
The model allows for fast numerical simulations and permits the derivation of analytical expressions for the rate of Ca2+ spiking and the coefficient of variation of the interspike interval, as well as approximations for the interspike-interval density and the spike-train power spectrum. A comparison of these statistics to experimental data is discussed.
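The reduced model of the final step, an integrate-and-fire process with nonlinear drift and multiplicative white noise that produces a renewal spike train, can be sketched as follows. The drift and noise functions below are illustrative stand-ins, not the paper's derived puff-current statistics.

```python
import numpy as np

def ca_if_isis(c_th=1.0, c_reset=0.2, sigma=0.3, dt=1e-3, n_spikes=500, seed=2):
    """Integrate-and-fire caricature of cytosolic Ca2+ spiking: a nonlinear
    (here, saturating) drift f(c) plus multiplicative Gaussian white noise
    g(c)*xi, integrated by Euler-Maruyama with fire-and-reset at c_th.
    Returns the sequence of interspike intervals."""
    rng = np.random.default_rng(seed)
    c, t, t_last = c_reset, 0.0, 0.0
    isis = []
    sqrt_dt = np.sqrt(dt)
    while len(isis) < n_spikes:
        f = 0.5 * (1.0 - 0.5 * c)              # drift slows at high concentration
        g = sigma * np.sqrt(max(c, 0.0))       # noise grows with concentration
        c += f * dt + g * sqrt_dt * rng.standard_normal()
        t += dt
        if c >= c_th:
            isis.append(t - t_last)            # reset erases memory: renewal process
            t_last, c = t, c_reset
    return np.array(isis)

isis = ca_if_isis()
rate = 1.0 / isis.mean()
cv = isis.std() / isis.mean()
print(f"spike rate: {rate:.3f}, ISI coefficient of variation: {cv:.3f}")
```

The rate and the ISI coefficient of variation computed here are exactly the two statistics for which the paper derives analytical expressions.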
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department of Humboldt University Berlin, Berlin, Germany
- Martin Falcke
- Physics Department of Humboldt University Berlin, Berlin, Germany; Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department of Humboldt University Berlin, Berlin, Germany
10
Dehghani-Habibabadi M, Pawelzik K. Synaptic self-organization of spatio-temporal pattern selectivity. PLoS Comput Biol 2023; 19:e1010876. [PMID: 36780564] [PMCID: PMC9977062] [DOI: 10.1371/journal.pcbi.1010876]
Abstract
Spiking model neurons can be set up to respond selectively to specific spatio-temporal spike patterns by optimization of their input weights. It is unknown, however, whether existing synaptic plasticity mechanisms can achieve this temporal mode of neuronal coding and computation. Here it is shown that changes of synaptic efficacies which tend to balance excitatory and inhibitory synaptic inputs can make neurons sensitive to particular input spike patterns. Simulations demonstrate that a combination of Hebbian mechanisms, hetero-synaptic plasticity, and synaptic scaling is sufficient for self-organizing sensitivity to spatio-temporal spike patterns that repeat in the input. In networks, inclusion of hetero-synaptic plasticity that depends on the presynaptic neurons leads to specialization and faithful representation of pattern sequences by a group of target neurons. Pattern detection is robust against a range of distortions and noise. The proposed combination of Hebbian mechanisms, hetero-synaptic plasticity, and synaptic scaling is found to protect the memories for specific patterns from being overwritten by ongoing learning during extended periods when the patterns are not present. This suggests a novel explanation for the long-term robustness of memory traces despite ongoing activity with substantial synaptic plasticity. Taken together, our results promote the plausibility of precise temporal coding in the brain.
Affiliation(s)
- Klaus Pawelzik
- Institute for Theoretical Physics, University of Bremen, Bremen, Germany
11
Persistence in a large network of sparsely interacting neurons. J Math Biol 2022; 86:16. [PMID: 36534174] [DOI: 10.1007/s00285-022-01844-x]
Abstract
This article presents a biological neural network model driven by inhomogeneous Poisson processes accounting for the intrinsic randomness of synapses. The main novelty is the introduction of sparse interactions: each firing neuron triggers an instantaneous increase in electric potential to a fixed number of randomly chosen neurons. We prove that, as the number of neurons approaches infinity, the finite network converges to a nonlinear mean-field process characterised by a jump-type stochastic differential equation. We show that this process displays a phase transition: the activity of a typical neuron in the infinite network either rapidly dies out, or persists forever, depending on the global parameters describing the intensity of interconnection. This provides a way to understand the emergence of persistent activity triggered by weak input signals in large neural networks.
12
Privault N, Thieullen M. Closed-form modeling of neuronal spike train statistics using multivariate Hawkes cumulants. Phys Rev E 2022; 106:054410. [PMID: 36559454] [DOI: 10.1103/physreve.106.054410]
Abstract
We derive exact analytical expressions for the cumulants, to any order, of neuronal membrane potentials driven by spike trains in a multivariate Hawkes process model with excitation and inhibition. Such expressions can be used for prediction and sensitivity analysis of the statistical behavior of the model over time, and for estimating the probability densities of neuronal membrane potentials using Gram-Charlier expansions. Our results are shown to provide a better alternative to Monte Carlo estimates obtained via stochastic simulations, and computer codes based on combinatorial recursions are included.
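The first cumulant of such a model, the stationary rate Λ = (I − G)⁻¹μ with branching matrix G_ij = α_ij/β for exponential kernels, can be checked against a direct simulation of a bivariate Hawkes process using Ogata's thinning algorithm; the kernel shape and all parameter values below are illustrative assumptions, not the paper's general setting.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, seed=3):
    """Multivariate Hawkes process with exponential kernels,
    lambda_i(t) = mu_i + sum over past events of alpha[i, j] * exp(-beta (t - s)),
    simulated by Ogata's thinning. Returns per-neuron event counts."""
    rng = np.random.default_rng(seed)
    mu, alpha = np.asarray(mu, float), np.asarray(alpha, float)
    excite = np.zeros(len(mu))        # current excitation of each intensity
    counts = np.zeros(len(mu), int)
    t = 0.0
    while True:
        lam_bar = float((mu + excite).sum())   # valid bound: intensity only decays
        w = rng.exponential(1.0 / lam_bar)     # candidate waiting time
        t += w
        if t > t_max:
            break
        excite *= np.exp(-beta * w)            # decay excitation to candidate time
        lam = mu + excite
        total = float(lam.sum())
        if rng.random() * lam_bar < total:     # accept candidate point
            i = rng.choice(len(mu), p=lam / total)
            counts[i] += 1
            excite += alpha[:, i]              # event in i excites via column i
    return counts

mu = np.array([0.5, 0.5])
alpha = np.array([[0.2, 0.3], [0.3, 0.2]])     # subcritical: spectral radius 0.5
beta, t_max = 1.0, 2000.0
counts = simulate_hawkes(mu, alpha, beta, t_max)
empirical = counts / t_max
analytic = np.linalg.solve(np.eye(2) - alpha / beta, mu)   # stationary rates
print("empirical rates:", empirical, "analytic rates:", analytic)
```

Higher-order cumulants require far longer simulations to estimate reliably, which is the practical argument for the closed-form expressions derived in the paper.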
Collapse
Affiliation(s)
- Nicolas Privault
- Division of Mathematical Sciences, School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, Singapore 637371, Singapore and LPSM - UMR 8001, Sorbonne Université, 4 Place Jussieu, 75252 Paris, France
| | - Michèle Thieullen
- Division of Mathematical Sciences, School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, Singapore 637371, Singapore and LPSM - UMR 8001, Sorbonne Université, 4 Place Jussieu, 75252 Paris, France
13
Wu Z, Zhang H, Lin Y, Li G, Wang M, Tang Y. LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and Efficient Spatiotemporal Information Processing. IEEE Trans Neural Netw Learn Syst 2022; 33:6249-6262. [PMID: 33979292] [DOI: 10.1109/tnnls.2021.3073016]
Abstract
Spiking neural networks (SNNs) based on the leaky integrate-and-fire (LIF) model have been applied to energy-efficient temporal and spatiotemporal processing tasks. Owing to its bioplausible neuronal dynamics and simplicity, the LIF-SNN benefits from event-driven processing but usually suffers from reduced performance, likely because LIF-SNN neurons transmit information only via binary spikes. To address this issue, we propose a leaky integrate and analog fire (LIAF) neuron model, so that analog values can be transmitted among neurons, and build a deep network termed LIAF-Net on it for efficient spatiotemporal processing. In the temporal domain, LIAF follows the traditional LIF dynamics to maintain its temporal processing capability. In the spatial domain, LIAF is able to integrate spatial information through convolutional or fully connected integration. As a spatiotemporal layer, LIAF can also be used jointly with traditional artificial neural network (ANN) layers. In addition, the built network can be trained directly with backpropagation through time (BPTT), which avoids the performance loss caused by ANN-to-SNN conversion. Experimental results indicate that LIAF-Net achieves performance comparable to the gated recurrent unit (GRU) and long short-term memory (LSTM) on bAbI question answering (QA) tasks, and achieves state-of-the-art performance on spatiotemporal dynamic vision sensor (DVS) data sets, including MNIST-DVS, CIFAR10-DVS, and DVS128 Gesture, with far fewer synaptic weights and much lower computational overhead than traditional networks built with LSTM, GRU, convolutional LSTM (ConvLSTM), or 3-D convolution (Conv3D). Compared with the traditional LIF-SNN, LIAF-Net also shows a dramatic accuracy gain on all these experiments. In conclusion, LIAF-Net provides a framework combining the advantages of both ANNs and SNNs for lightweight and efficient spatiotemporal information processing.
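The core architectural change — keeping the leaky dynamics but replacing the binary spike output with an analog activation — can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: the threshold, reset rule, and ReLU-style activation are assumed conventions, and LIAF-Net's convolutional layers and BPTT training are omitted.

```python
import numpy as np

def lif_step(v, x, tau=10.0, v_th=1.0, dt=1.0):
    """One LIF step: leaky integration, binary spike output, hard reset."""
    v = v + dt * (-v / tau + x)
    spike = (v >= v_th).astype(float)     # binary {0, 1} output
    v = np.where(spike > 0, 0.0, v)       # reset on spike
    return v, spike

def liaf_step(v, x, tau=10.0, v_th=1.0, dt=1.0):
    """One LIAF step: identical leaky dynamics, but the output is an
    analog activation of the membrane potential (ReLU assumed here)."""
    v = v + dt * (-v / tau + x)
    out = np.maximum(v - v_th, 0.0)       # graded (analog) output
    v = np.where(v >= v_th, 0.0, v)       # same reset rule as LIF
    return v, out

v_lif, v_liaf = np.zeros(4), np.zeros(4)
x = np.array([0.05, 0.1, 0.2, 0.4])       # constant input currents
for _ in range(50):
    v_lif, s = lif_step(v_lif, x)
    v_liaf, a = liaf_step(v_liaf, x)
# s is binary, while a carries graded magnitude information downstream
```

The graded output `a` is what lets LIAF layers interoperate with ordinary ANN layers, at the cost of giving up purely event-driven computation.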
14
Tomar R, Smith CE, Lansky P. A simple neuronal model with intrinsic saturation of the firing frequency. Biosystems 2022; 222:104780. [PMID: 36179938] [DOI: 10.1016/j.biosystems.2022.104780]
Abstract
We present a comparison of the intrinsic saturation of firing frequency in four simple neural models: leaky integrate-and-fire model, leaky integrate-and-fire model with reversal potentials, two-point leaky integrate-and-fire model, and a two-point leaky integrate-and-fire model with reversal potentials. "Two-point" means that the equivalent circuit has two nodes (dendritic and somatic) instead of one (somatic only). The results suggest that the reversal potential increases the slope of the "firing rate vs input" curve due to a smaller effective membrane time constant, but does not necessarily induce saturation of the firing rate. The two-point model without the reversal potential does not limit the voltage or the firing rate. In contrast to the previous models, the two-point model with the reversal potential limits the asymptotic voltage and the firing rate, which is the main result of this paper. The case of excitatory inputs is considered first and followed by the case of both excitatory and inhibitory inputs.
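The bounding role of the reversal potential can be seen already in a single-compartment steady-state calculation: conductance input shrinks the effective time constant and caps the asymptotic voltage at the reversal potential. This sketch covers only that mechanism, not the paper's two-point (dendrite plus soma) model that produces firing-rate saturation; the parameter values are illustrative.

```python
import numpy as np

def asymptotic_voltage(g_e, tau=10.0, E_e=5.0):
    """Steady state of dv/dt = -v/tau + g_e*(E_e - v): because the drive
    is conductance-based, v_inf is bounded by the reversal potential E_e."""
    return g_e * E_e / (1.0 / tau + g_e)

def effective_tau(g_e, tau=10.0):
    """Effective membrane time constant, which shrinks as the input
    conductance grows (hence the steeper firing rate vs input slope)."""
    return 1.0 / (1.0 / tau + g_e)

g = np.array([0.01, 0.1, 1.0, 10.0])     # increasing excitatory conductance
v_inf = asymptotic_voltage(g)
# v_inf increases with g but never exceeds E_e = 5.0
```

In a current-based model, by contrast, the asymptotic voltage grows without bound, which is why the reversal potential alone changes the slope but a voltage limit requires the conductance form.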
Affiliation(s)
- Rimjhim Tomar
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic; Second Medical Faculty, Charles University, V Uvalu 84, 15006 Prague 6, Czech Republic.
- Charles E Smith
- Department of Statistics, North Carolina State University, 2311 Stinson Drive, Raleigh, NC 27695-8203, United States of America
- Petr Lansky
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
15
Lu S, Xu F. Linear leaky-integrate-and-fire neuron model based spiking neural networks and its mapping relationship to deep neural networks. Front Neurosci 2022; 16:857513. [PMID: 36090262] [PMCID: PMC9448910] [DOI: 10.3389/fnins.2022.857513]
Abstract
Spiking neural networks (SNNs) are brain-inspired machine learning algorithms with merits such as biological plausibility and unsupervised learning capability. Previous works have shown that converting artificial neural networks (ANNs) into SNNs is a practical and efficient approach for implementing an SNN. However, the basic principles and theoretical groundwork for training an SNN without accuracy loss are lacking. This paper establishes a precise mathematical mapping between the biological parameters of the linear leaky-integrate-and-fire (LIF) model/SNNs and the parameters of ReLU-AN/deep neural networks (DNNs). This mapping relationship is analytically proven under certain conditions and demonstrated by simulation and real-data experiments. It can serve as the theoretical basis for the potential combination of the respective merits of the two categories of neural networks.
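The flavor of the LIF-to-ReLU correspondence can be seen with the simplest case: a non-leaky (linear) integrate-and-fire neuron under constant input fires at a rate proportional to the rectified input. This is a minimal sketch under assumed conventions (unit threshold, soft reset), not the paper's formal mapping for the leaky model.

```python
import numpy as np

def spike_count(i_const, v_th=1.0, n_steps=100, dt=1.0):
    """Spike count of a non-leaky IF neuron under constant input, with a
    soft reset that keeps residual charge above threshold."""
    v, n = 0.0, 0
    for _ in range(n_steps):
        v += dt * i_const
        while v >= v_th:
            v -= v_th
            n += 1
    return n

def relu(x):
    return max(x, 0.0)

# firing rate over the window tracks ReLU(input) / v_th
rates = {i: spike_count(i) / 100.0 for i in (-0.5, 0.0, 0.1, 0.3, 0.7)}
```

Negative inputs never reach threshold (rate 0), and positive inputs produce a rate linear in the input, reproducing the ReLU shape that the paper's analytical mapping formalizes.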
16
Lu HY, Lorenc ES, Zhu H, Kilmarx J, Sulzer J, Xie C, Tobler PN, Watrous AJ, Orsborn AL, Lewis-Peacock J, Santacruz SR. Multi-scale neural decoding and analysis. J Neural Eng 2021; 18. [PMID: 34284369] [PMCID: PMC8840800] [DOI: 10.1088/1741-2552/ac160f]
Abstract
Objective. Complex spatiotemporal neural activity encodes rich information related to behavior and cognition. Conventional research has focused on neural activity acquired using one of many different measurement modalities, each of which provides a useful but incomplete assessment of the neural code. Multi-modal techniques can overcome tradeoffs in the spatial and temporal resolution of a single modality to reveal a deeper and more comprehensive understanding of system-level neural mechanisms. Uncovering multi-scale dynamics is essential for a mechanistic understanding of brain function and for harnessing neuroscientific insights to develop more effective clinical treatment. Approach. We discuss conventional methodologies used for characterizing neural activity at different scales and review contemporary examples of how these approaches have been combined. Then we present our case for integrating activity across multiple scales to benefit from the combined strengths of each approach and elucidate a more holistic understanding of neural processes. Main results. We examine various combinations of neural activity at different scales and analytical techniques that can be used to integrate or illuminate information across scales, as well as the technologies that enable such exciting studies. We conclude with challenges facing future multi-scale studies, and a discussion of the power and potential of these approaches. Significance. This roadmap will lead the readers toward a broad range of multi-scale neural decoding techniques and their benefits over single-modality analyses. This Review article highlights the importance of multi-scale analyses for systematically interrogating complex spatiotemporal mechanisms underlying cognition and behavior.
Affiliation(s)
- Hung-Yun Lu
- The University of Texas at Austin, Biomedical Engineering, Austin, TX, United States of America
- Elizabeth S Lorenc
- The University of Texas at Austin, Psychology, Austin, TX, United States of America; The University of Texas at Austin, Institute for Neuroscience, Austin, TX, United States of America
- Hanlin Zhu
- Rice University, Electrical and Computer Engineering, Houston, TX, United States of America
- Justin Kilmarx
- The University of Texas at Austin, Mechanical Engineering, Austin, TX, United States of America
- James Sulzer
- The University of Texas at Austin, Mechanical Engineering, Austin, TX, United States of America; The University of Texas at Austin, Institute for Neuroscience, Austin, TX, United States of America
- Chong Xie
- Rice University, Electrical and Computer Engineering, Houston, TX, United States of America
- Philippe N Tobler
- University of Zurich, Neuroeconomics and Social Neuroscience, Zurich, Switzerland
- Andrew J Watrous
- The University of Texas at Austin, Neurology, Austin, TX, United States of America
- Amy L Orsborn
- University of Washington, Electrical and Computer Engineering, Seattle, WA, United States of America; University of Washington, Bioengineering, Seattle, WA, United States of America; Washington National Primate Research Center, Seattle, WA, United States of America
- Jarrod Lewis-Peacock
- The University of Texas at Austin, Psychology, Austin, TX, United States of America; The University of Texas at Austin, Institute for Neuroscience, Austin, TX, United States of America
- Samantha R Santacruz
- The University of Texas at Austin, Biomedical Engineering, Austin, TX, United States of America; The University of Texas at Austin, Institute for Neuroscience, Austin, TX, United States of America
17
Nobukawa S, Nishimura H, Wagatsuma N, Ando S, Yamanishi T. Long-Tailed Characteristic of Spiking Pattern Alternation Induced by Log-Normal Excitatory Synaptic Distribution. IEEE Trans Neural Netw Learn Syst 2021; 32:3525-3537. [PMID: 32822305] [DOI: 10.1109/tnnls.2020.3015208]
Abstract
Studies of structural connectivity at the synaptic level show that in synaptic connections of the cerebral cortex, the excitatory postsynaptic potential (EPSP) of most synapses exhibits sub-mV values, while a small number of synapses exhibit large EPSPs (>~1.0 mV). This means that the distribution of EPSPs fits a log-normal distribution. Beyond structural connectivity, skewed and long-tailed distributions have been widely observed in neural activities, such as the occurrence of spiking rates and the size of synchronously spiking populations. Many studies have modeled this long-tailed EPSP distribution; however, its causal factors remain controversial. This study focused on the long-tailed EPSP distributions and the interlateral synaptic connections primarily observed in cortical network structures, and constructed a spiking neural network consistent with these features. Specifically, we constructed two coupled modules of spiking neural networks, each with excitatory and inhibitory neural populations and a log-normal EPSP distribution. We evaluated the spiking activities for different input frequencies and with/without strong synaptic connections. These coupled modules exhibited intermittent intermodule-alternative behavior, given a moderate input frequency and the existence of strong synaptic and intermodule connections. Moreover, power analysis, multiscale entropy analysis, and surrogate data analysis revealed that the long-tailed EPSP distribution and intermodule connections enhanced the complexity of spiking activity at large temporal scales and induced nonlinear dynamics and neural activity following the long-tailed distribution.
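The log-normal EPSP statistics that motivate this network construction are easy to reproduce numerically. The parameters below (median 0.2 mV, log-scale sigma of 1) are illustrative stand-ins, not the paper's fitted values; the point is the characteristic split between many weak synapses and a rare heavy tail of strong ones.

```python
import numpy as np

rng = np.random.default_rng(5)
# log-normal EPSP amplitudes in mV: median = exp(mu) = 0.2 (assumed)
epsp = rng.lognormal(mean=np.log(0.2), sigma=1.0, size=100_000)
frac_large = (epsp > 1.0).mean()      # rare strong synapses above ~1 mV
median, mean = np.median(epsp), epsp.mean()
# mean > median: the long tail of strong synapses dominates the average
```

With these numbers, only a few percent of synapses exceed 1 mV, yet they pull the mean well above the median — the asymmetry that drives the spiking-pattern alternation studied in the paper.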
18
Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. [PMID: 34449771] [PMCID: PMC8428727] [DOI: 10.1371/journal.pcbi.1009261]
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel's time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). 
We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model which demonstrates its broad applicability.
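The adaptation mechanism behind negative interval correlations can be checked by direct simulation: a leaky integrate-and-fire neuron with a spike-triggered adaptation current and white noise. This is a numerical illustration with assumed parameter values, not the paper's weak-noise formula, and colored noise is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, tau_m, tau_a = 0.01, 1.0, 2.0
mu, delta_a = 2.0, 0.3                  # mean drive, adaptation kick per spike
sig = np.sqrt(2 * 0.5 * dt)             # white-noise increment (D = 0.5)
v, a, t = 0.0, 0.0, 0.0
spike_times = []
while len(spike_times) < 2000:
    v += dt * (-v / tau_m + mu - a) + sig * rng.normal()
    a += dt * (-a / tau_a)              # adaptation decays between spikes
    t += dt
    if v >= 1.0:
        v = 0.0                         # reset
        a += delta_a                    # spike-triggered adaptation
        spike_times.append(t)

isi = np.diff(spike_times)[20:]         # drop the initial transient
rho1 = np.corrcoef(isi[:-1], isi[1:])[0, 1]
# a long interval lets the adaptation variable decay, shortening the
# next interval, so the lag-1 serial correlation comes out negative
```

Swapping the white noise for a slow colored noise source would flip the sign toward positive correlations, which is the competition the paper's two-geometric-sequence formula captures.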
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
19
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. [PMID: 32942450] [DOI: 10.1103/physreve.102.022407]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
20
Mankin R, Rekker A, Paekivi S. Statistical moments of the interspike intervals for a neuron model driven by trichotomous noise. Phys Rev E 2021; 103:062201. [PMID: 34271748] [DOI: 10.1103/physreve.103.062201]
Abstract
The influence of a colored three-level input noise (trichotomous noise) on the spike generation of a perfect integrate-and-fire (PIF) model of neurons is studied. Using a first-passage-time formulation, exact expressions for the Laplace transform of the output interspike interval (ISI) density and for the statistical moments of the ISIs (such as the coefficient of variation, the skewness, the serial correlation coefficient, and the Fano factor) are derived. To model the anomalous subdiffusion that can arise from, e.g., the trapping properties of dendritic spines, the model is extended by including a random operational time in the form of an inverse strictly increasing Lévy-type subordinator, and exact formulas for ISI statistics are given for this case as well. Particularly, it is shown that at some parameter regimes, the ISI density exhibits a three-modal structure. The results for the extended model show that the ISI serial correlation coefficient and the Fano factor are nonmonotonic with respect to the input current, which indicates that at an intermediate value of the input current the variability of the output spike trains is minimal. Similarities and differences between the behavior of the presented models and the previously investigated PIF models driven by dichotomous noise are also discussed.
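The perfect integrate-and-fire setup with a three-level input is straightforward to simulate for comparison with such exact results. The switching scheme below (resampling the level uniformly at a fixed rate) is a simple kangaroo-type stand-in for the paper's trichotomous noise, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, mu, amp, nu = 0.01, 1.0, 0.8, 0.5   # drift, noise amplitude, switch rate
levels = np.array([-amp, 0.0, amp])      # the three noise levels
state = 1                                # start at the zero level
v, t, spikes = 0.0, 0.0, []
while len(spikes) < 1500:
    if rng.random() < nu * dt:           # resample the level at rate nu
        state = rng.integers(0, 3)       # (assumed symmetric kinetics)
    v += dt * (mu + levels[state])       # perfect integrator: no leak term
    t += dt
    if v >= 1.0:
        v -= 1.0                         # soft reset
        spikes.append(t)

isi = np.diff(spikes)
cv = isi.std() / isi.mean()              # coefficient of variation of ISIs
```

Because the noise is slow compared with a typical interval, whole ISIs are generated at different effective drifts, producing the mixture structure (and, in the paper's exact analysis, even multimodal ISI densities) that white noise cannot.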
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Astrid Rekker
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Sander Paekivi
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
21
Network mechanism for insect olfaction. Cogn Neurodyn 2021; 15:103-129. [PMID: 33786083] [DOI: 10.1007/s11571-020-09640-3]
Abstract
Early olfactory pathway responses to the presentation of an odor exhibit remarkably similar dynamical behavior across phyla from insects to mammals, and frequently involve transitions among quiescence, collective network oscillations, and asynchronous firing. We hypothesize that the time scales of fast excitation and fast and slow inhibition present in these networks may be the essential element underlying this similar behavior, and design an idealized, conductance-based integrate-and-fire model to verify this hypothesis via numerical simulations. To better understand the mathematical structure underlying the common dynamical behavior across species, we derive a firing-rate model and use it to extract a slow passage through a saddle-node-on-an-invariant-circle bifurcation structure. We expect this bifurcation structure to provide new insights into the understanding of the dynamical behavior of neuronal assemblies and that a similar structure can be found in other sensory systems.
22
Tian Y, Sun P. Characteristics of the neural coding of causality. Phys Rev E 2021; 103:012406. [PMID: 33601638] [DOI: 10.1103/physreve.103.012406]
Abstract
While causality processing is an essential cognitive capacity of the neural system, a systematic understanding of the neural coding of causality is still elusive. We propose a physically fundamental analysis of this issue and demonstrate that the neural dynamics encodes the original causality between external events near-homomorphically. The causality coding is robust to the amount of historical information held in memory and features high precision but low recall. This coding process creates a sparser representation of the external causality. Finally, we propose a statistical characterization of the neural coding mapping from the original causality to the coded causality in neural dynamics.
Affiliation(s)
- Yang Tian
- Department of Psychology, Tsinghua University, Beijing 100084, China and Tsinghua Brain and Intelligence Lab, Beijing 100084, China
- Pei Sun
- Department of Psychology, Tsinghua University, Beijing 100084, China and Tsinghua Brain and Intelligence Lab, Beijing 100084, China
23
A Focal Inactivation and Computational Study of Ventrolateral Periaqueductal Gray and Deep Mesencephalic Reticular Nucleus Involvement in Sleep State Switching and Bistability. eNeuro 2020; 7:ENEURO.0451-19.2020. [PMID: 33055199] [PMCID: PMC7768273] [DOI: 10.1523/eneuro.0451-19.2020]
Abstract
Neurons of the ventrolateral periaqueductal gray (vlPAG) and adjacent deep mesencephalic reticular nucleus (DpMe) are implicated in the control of sleep-wake state and are hypothesized components of a flip-flop circuit that maintains sleep bistability by preventing the overexpression of non-rapid eye movement (NREM)/REM sleep intermediary states (NRt). To determine the contribution of vlPAG/DpMe neurons to maintaining sleep bistability, we combined computer simulations of flip-flop circuitry with focal inactivation of vlPAG/DpMe neurons by microdialysis delivery of the GABAA receptor agonist muscimol in freely behaving male rats (n = 25) instrumented for electroencephalographic and electromyographic recording. REM sleep was enhanced by muscimol at the vlPAG/DpMe, consistent with previous studies; however, our analyses of NRt dynamics in vivo and those produced by flip-flop circuit simulations show that current thinking is too narrowly focused on the contribution of REM sleep-inactive populations toward vlPAG/DpMe involvement in REM sleep control. We found that much of the muscimol-mediated increase in REM sleep was more appropriately classified as NRt. This loss of sleep bistability was accompanied by fragmentation of REM sleep, as evidenced by an increased number of short REM sleep bouts. REM sleep fragmentation stemmed from an increased number and duration of NRt bouts originating in REM sleep. By contrast, NREM sleep bouts were not likewise fragmented by vlPAG/DpMe inactivation. In flip-flop circuit simulations, these changes could not be replicated through inhibition of the REM sleep-inactive population alone. Instead, combined suppression of the REM sleep-active and -inactive vlPAG/DpMe subpopulations was required to replicate the changes in NRt dynamics.
24
Privault N. Nonstationary shot noise modeling of neuron membrane potentials by closed-form moments and Gram-Charlier expansions. Biol Cybern 2020; 114:499-518. [PMID: 32955621] [DOI: 10.1007/s00422-020-00844-8]
Abstract
We present exact analytical expressions of moments of all orders for neuronal membrane potentials in the multiplicative nonstationary Poisson shot noise model. As an application, we derive closed-form Gram-Charlier density expansions that show how the probability density functions of potentials in such models differ from their Gaussian diffusion approximations. This approach extends the results of Brigham and Destexhe (Preprint, 2015a; Phys Rev E 91:062102, 2015b) by the use of exact combinatorial expressions for the moments of multiplicative nonstationary filtered shot noise processes. Our results are confirmed by stochastic simulations and apply to single- and multiple-noise-source models.
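The Gram-Charlier machinery itself is compact: given the first cumulants, the density is the matching Gaussian times Hermite-polynomial correction terms. The sketch below estimates the cumulants empirically from an exponential sample standing in for a shot-noise-driven potential; the paper instead supplies them in closed form, which is its contribution.

```python
import numpy as np

def gram_charlier_pdf(x, mean, var, k3, k4):
    """Gram-Charlier A series: the Gaussian with matching mean/variance,
    corrected by standardized skewness (k3) and excess kurtosis (k4)."""
    s = np.sqrt(var)
    z = (x - mean) / s
    phi = np.exp(-z**2 / 2) / (s * np.sqrt(2 * np.pi))
    he3 = z**3 - 3 * z                   # probabilists' Hermite polynomials
    he4 = z**4 - 6 * z**2 + 3
    return phi * (1 + k3 / 6 * he3 + k4 / 24 * he4)

# illustrative non-Gaussian sample (exponential shot amplitudes assumed)
rng = np.random.default_rng(2)
sample = rng.exponential(1.0, 100_000)
m, var = sample.mean(), sample.var()
z = (sample - m) / np.sqrt(var)
k3, k4 = (z**3).mean(), (z**4).mean() - 3.0
xs = np.linspace(0.0, 4.0, 9)
pdf = gram_charlier_pdf(xs, m, var, k3, k4)
# the correction terms shift probability mass toward the heavy right tail
```

The skewness and kurtosis corrections are exactly where such densities depart from their Gaussian diffusion approximations, which is the comparison the abstract describes.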
Affiliation(s)
- Nicolas Privault
- Division of Mathematical Sciences, School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, Singapore, 637371, Singapore.
25
Cheng H, Cai D, Zhou D. The extended Granger causality analysis for Hodgkin-Huxley neuronal models. Chaos 2020; 30:103102. [PMID: 33138445] [DOI: 10.1063/5.0006349]
Abstract
How to extract directions of information flow in dynamical systems based on empirical data remains a key challenge. The Granger causality (GC) analysis has been identified as a powerful method to achieve this capability. However, the framework of the GC theory requires that the dynamics of the investigated system can be statistically linearized, i.e., that the dynamics can be effectively modeled by linear regressive processes. Under such conditions, the causal connectivity can be directly mapped to the structural connectivity that mediates physical interactions within the system. However, for nonlinear dynamical systems such as the Hodgkin-Huxley (HH) neuronal circuit, the validity of the GC analysis has not yet been addressed; namely, whether the constructed causal connectivity is still identical to the synaptic connectivity between neurons remains unknown. In this work, we apply the nonlinear extension of the GC analysis, i.e., the extended GC analysis, to the voltage time series obtained by evolving the HH neuronal network. In addition, we add a certain amount of measurement or observational noise to the time series to account for realistic conditions of data acquisition in experiments. Our numerical results indicate that the causal connectivity obtained through the extended GC analysis is consistent with the underlying synaptic connectivity of the system. This consistency is also insensitive to dynamical regimes, e.g., a chaotic or non-chaotic regime. Since the extended GC analysis could in principle be applied to any nonlinear dynamical system as long as its attractor is low dimensional, our results may potentially be extended to the GC analysis in other settings.
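The baseline that the extended method builds on is linear GC: compare the residual variance of a "restricted" autoregression (target's own past only) against a "full" one that also includes the source's past. A numpy-only sketch on a toy unidirectionally coupled AR(1) pair (illustrative coefficients, one lag, not the paper's HH setting):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):                    # x drives y, not vice versa
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def gc(source, target, lag=1):
    """Linear Granger causality source -> target with one lag: the log
    ratio of restricted vs full regression residual variances."""
    T, own, src = target[lag:], target[:-lag], source[:-lag]
    A = np.column_stack([own, np.ones_like(own)])          # restricted
    r_res = T - A @ np.linalg.lstsq(A, T, rcond=None)[0]
    B = np.column_stack([own, src, np.ones_like(own)])     # full
    r_full = T - B @ np.linalg.lstsq(B, T, rcond=None)[0]
    return np.log(r_res.var() / r_full.var())

gc_xy = gc(x, y)   # clearly positive: x's past helps predict y
gc_yx = gc(y, x)   # near zero: y's past adds nothing for x
```

The extended GC analysis replaces these global linear regressions with locally linear fits on the reconstructed attractor, which is what lets the causal estimate survive the nonlinearity of HH dynamics.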
Affiliation(s)
- Hong Cheng
- School of Statistics and Mathematics, Shanghai Lixin University of Accounting and Finance, Shanghai 201209, China
- David Cai
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
- Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
26
Mankin R, Rekker A. Effects of transient subordinators on the firing statistics of a neuron model driven by dichotomous noise. Phys Rev E 2020; 102:012103. [PMID: 32794976] [DOI: 10.1103/physreve.102.012103]
Abstract
The behavior of a stochastic perfect integrate-and-fire (PIF) model of neurons is considered. The effect of temporally correlated random activity of synaptic inputs is modeled as a combination of an asymmetric dichotomous noise and a random operation time in the form of an inverse strictly increasing Lévy-type subordinator. Using a first-passage-time formulation, we find exact expressions for the output interspike interval (ISI) statistics. Particularly, it is shown that at some parameter regimes the ISI density exhibits a multimodal structure. Moreover, it is demonstrated that the coefficient of variation, the serial correlation coefficient, and the Fano factor display a nonmonotonic dependence on the mean input current μ, i.e., the ISI's regularity is maximized at an intermediate value of μ. The features of spike statistics, analytically revealed in our study, are compared with previously obtained results for a perfect integrate-and-fire neuron model driven by dichotomous noise (without subordination).
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Astrid Rekker
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
27
Kheradpisheh SR, Masquelier T. Temporal Backpropagation for Spiking Neural Networks with One Spike per Neuron. Int J Neural Syst 2020; 30:2050027. [DOI: 10.1142/s0129065720500276]
Abstract
We propose a new supervised learning rule for multilayer spiking neural networks (SNNs) that use a form of temporal coding known as rank-order-coding. With this coding scheme, all neurons fire exactly one spike per stimulus, but the firing order carries information. In particular, in the readout layer, the first neuron to fire determines the class of the stimulus. We derive a new learning rule for this sort of network, named S4NN, akin to traditional error backpropagation, yet based on latencies. We show how approximated error gradients can be computed backward in a feedforward network with any number of layers. This approach reaches state-of-the-art performance with supervised multi-fully connected layer SNNs: test accuracy of 97.4% for the MNIST dataset, and 99.2% for the Caltech Face/Motorbike dataset. Yet, the neuron model that we use, nonleaky integrate-and-fire, is much simpler than the one used in all previous works. The source codes of the proposed S4NN are publicly available at https://github.com/SRKH/S4NN .
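The coding scheme itself — latency, not rate, carries the information — can be illustrated without any of the S4NN learning machinery. This is a minimal sketch of time-to-first-spike readout with non-leaky integrate-and-fire neurons under assumed conventions (unit threshold, constant input per step); the weights and sizes are hypothetical, and the backpropagated latency gradients of the paper are omitted.

```python
import numpy as np

def first_spike_times(weights, x, t_max=100):
    """Non-leaky IF output neurons: each accumulates its weighted input
    every step and fires at most once; the firing latency is the code."""
    n_out = weights.shape[0]
    v = np.zeros(n_out)
    t_fire = np.full(n_out, t_max)       # t_max means "never fired"
    for t in range(t_max):
        v += weights @ x                 # constant drive each step
        newly = (v >= 1.0) & (t_fire == t_max)
        t_fire[newly] = t                # record first threshold crossing
    return t_fire

rng = np.random.default_rng(4)
W = rng.uniform(0, 0.05, size=(3, 10))   # hypothetical readout weights
x = rng.uniform(0, 1, size=10)           # hypothetical input intensities
t = first_spike_times(W, x)
predicted = int(np.argmin(t))            # first neuron to fire wins
```

Training then amounts to nudging weights so the correct class neuron fires earlier than the others, which is what the latency-based error backpropagation in S4NN does across layers.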
Affiliation(s)
- Saeed Reza Kheradpisheh
- Department of Computer and Data Sciences, Faculty of Mathematical Sciences, Shahid Beheshti University, Tehran, Iran
28
Gowers RP, Timofeeva Y, Richardson MJE. Low-rate firing limit for neurons with axon, soma and dendrites driven by spatially distributed stochastic synapses. PLoS Comput Biol 2020; 16:e1007175. [PMID: 32310936 PMCID: PMC7217482 DOI: 10.1371/journal.pcbi.1007175] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2019] [Revised: 05/12/2020] [Accepted: 01/27/2020] [Indexed: 11/18/2022] Open
Abstract
Analytical forms for neuronal firing rates are important theoretical tools for the analysis of network states. Since the 1960s, the majority of approaches have treated neurons as being electrically compact and therefore isopotential. These approaches have yielded considerable insight into how single-cell properties affect network activity; however, many neuronal classes, such as cortical pyramidal cells, are electrically extended objects. Calculation of the complex flow of electrical activity driven by stochastic spatio-temporal synaptic input streams in these structures has presented a significant analytical challenge. Here we demonstrate that an extension of the level-crossing method of Rice, previously used for compact cells, provides a general framework for approximating the firing rate of neurons with spatial structure. Even for simple models, the analytical approximations derived demonstrate a surprising richness, including: independence of the firing rate from the electrotonic length for certain models, but with a form distinct from that of the point-like leaky integrate-and-fire model; a non-monotonic dependence of the firing rate on the number of dendrites receiving synaptic drive; a significant effect of the axonal and somatic load on the firing rate; and the role that the trigger position on the axon for spike initiation has on firing properties. The approach requires only the means and variances of the non-thresholded voltage and its rate of change in neuronal structures subject to spatio-temporal synaptic fluctuations. The combination of simplicity and generality promises a framework that can be built upon to incorporate increasing levels of biophysical detail and extend beyond the low-rate firing limit treated in this paper.
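The core ingredient of the level-crossing approach, Rice's formula for the mean rate of upward threshold crossings of a stationary Gaussian voltage, is compact enough to state as code. A small Python sketch with hypothetical membrane statistics (the paper's contribution is deriving these statistics for spatially extended structures, which is not reproduced here):

```python
import math

def rice_rate(mu_v, sigma_v, sigma_dv, theta):
    """Rice's formula: mean rate of upward crossings of level `theta`
    by a stationary Gaussian process with mean mu_v, standard deviation
    sigma_v, and derivative standard deviation sigma_dv."""
    return (sigma_dv / (2.0 * math.pi * sigma_v)
            * math.exp(-(theta - mu_v) ** 2 / (2.0 * sigma_v ** 2)))

# hypothetical membrane statistics: mean -60 mV, s.d. 4 mV,
# derivative s.d. 80 mV/ms, threshold -50 mV -> crossing rate per ms
r = rice_rate(mu_v=-60.0, sigma_v=4.0, sigma_dv=80.0, theta=-50.0)
```

In the low-rate limit, each upcrossing is identified with a spike, so this crossing rate is taken as the firing-rate approximation.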
Affiliation(s)
- Robert P. Gowers
- Mathematics for Real-World Systems Centre for Doctoral Training, University of Warwick, Coventry, United Kingdom
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Philippstrasse 13, Haus 4, Berlin, Germany
- Yulia Timofeeva
- Department of Computer Science, University of Warwick, Coventry, United Kingdom
- Department of Clinical and Experimental Epilepsy, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
29
Abundo M. Randomization of a linear boundary in the first-passage problem of Brownian motion. Stoch Anal Appl 2020; 38:343-351. [DOI: 10.1080/07362994.2019.1695629] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/14/2019] [Revised: 11/14/2019] [Accepted: 11/18/2019] [Indexed: 09/02/2023]
Affiliation(s)
- Mario Abundo
- Dipartimento di Matematica, Università “Tor Vergata”, Rome, Italy
30
Wang X, Lin X, Dang X. Supervised learning in spiking neural networks: A review of algorithms and evaluations. Neural Netw 2020; 125:258-280. [PMID: 32146356 DOI: 10.1016/j.neunet.2020.02.011] [Citation(s) in RCA: 49] [Impact Index Per Article: 12.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2019] [Revised: 12/15/2019] [Accepted: 02/20/2020] [Indexed: 01/08/2023]
Abstract
As a new brain-inspired class of artificial neural network models, spiking neural networks encode and process neural information through precisely timed spike trains. Spiking neural networks are composed of biologically plausible spiking neurons, which have become suitable tools for processing complex temporal or spatiotemporal information. However, because of their intricately discontinuous and implicit nonlinear mechanisms, the formulation of efficient supervised learning algorithms for spiking neural networks is difficult, and has become an important problem in this research field. This article presents a comprehensive review of supervised learning algorithms for spiking neural networks and evaluates them qualitatively and quantitatively. First, a comparison between spiking neural networks and traditional artificial neural networks is provided. The general framework and some related theories of supervised learning for spiking neural networks are then introduced. Furthermore, the state-of-the-art supervised learning algorithms of recent years are reviewed from the perspectives of applicability to spiking neural network architectures and the inherent mechanisms of the learning algorithms. A performance comparison of spike-train learning for some representative algorithms is also made. In addition, we provide five qualitative performance evaluation criteria for supervised learning algorithms for spiking neural networks and further present a new taxonomy of supervised learning algorithms based on these five criteria. Finally, some future research directions in this field are outlined.
Affiliation(s)
- Xiangwen Wang, Xianghong Lin, Xiaochao Dang
- College of Computer Science and Engineering, Northwest Normal University, Lanzhou, 730070, People's Republic of China
31
Faqeeh A, Osat S, Radicchi F, Gleeson JP. Emergence of power laws in noncritical neuronal systems. Phys Rev E 2020; 100:010401. [PMID: 31499795 PMCID: PMC7217540 DOI: 10.1103/physreve.100.010401] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2019] [Indexed: 11/07/2022]
Abstract
Experimental and computational studies provide compelling evidence that neuronal systems are characterized by power-law distributions of neuronal avalanche sizes. This fact is interpreted as an indication that these systems are operating near criticality, and, in turn, typical properties of critical dynamical processes, such as optimal information transmission and stability, are attributed to neuronal systems. The purpose of this Rapid Communication is to show that the presence of power-law distributions for the size of neuronal avalanches is not a sufficient condition for the system to operate near criticality. Specifically, we consider a simplistic model of neuronal dynamics on networks and show that the degree distribution of the underlying neuronal network may trigger power-law distributions for neuronal avalanches even when the system is not in its critical regime. To certify and explain our findings we develop an analytical approach based on percolation theory and branching processes techniques.
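The branching-process picture used in this analysis can be illustrated numerically: if each active node draws its degree from a heavy-tailed (Zipf) distribution and excites each neighbour independently with probability p, avalanche sizes can be broadly distributed even when the mean branching ratio is below one. A hedged Python sketch; the parameter values are placeholders, and the branching approximation ignores network loops, unlike the percolation treatment in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def avalanche_size(p=0.35, alpha=2.5, k_max=1000, size_cap=10**6):
    """Size of one avalanche in a branching approximation of network
    dynamics: each active node has degree k drawn from a (truncated)
    Zipf distribution with exponent alpha and excites each of its k
    neighbours independently with probability p."""
    size, active = 1, 1
    while active and size < size_cap:
        children = 0
        for _ in range(active):
            k = min(int(rng.zipf(alpha)), k_max)
            children += int(rng.binomial(k, p))
        size += children
        active = children
    return size

sizes = np.array([avalanche_size() for _ in range(2000)])
```

With these placeholder values the mean branching ratio is below one, so the process is subcritical, yet the heterogeneous degrees still broaden the avalanche-size distribution, which is the qualitative point of the paper.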
Affiliation(s)
- Ali Faqeeh
- Mathematics Consortium for Science and Industry, Department of Mathematics and Statistics, University of Limerick, Limerick V94 T9PX, Ireland
- Center for Complex Networks and Systems Research, School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana 47408, USA
- Saeed Osat
- Deep Quantum Labs, Skolkovo Institute of Science and Technology, Moscow 143026, Russia
- Filippo Radicchi
- Center for Complex Networks and Systems Research, School of Informatics, Computing, and Engineering, Indiana University, Bloomington, Indiana 47408, USA
- James P Gleeson
- Mathematics Consortium for Science and Industry, Department of Mathematics and Statistics, University of Limerick, Limerick V94 T9PX, Ireland
32
Jordan J, Petrovici MA, Breitwieser O, Schemmel J, Meier K, Diesmann M, Tetzlaff T. Deterministic networks for probabilistic computing. Sci Rep 2019; 9:18303. [PMID: 31797943 PMCID: PMC6893033 DOI: 10.1038/s41598-019-54137-7] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2018] [Accepted: 11/06/2019] [Indexed: 01/13/2023] Open
Abstract
Neuronal network models of high-level brain functions such as memory recall and reasoning often rely on the presence of some form of noise. The majority of these models assumes that each neuron in the functional network is equipped with its own private source of randomness, often in the form of uncorrelated external noise. In vivo, synaptic background input has been suggested to serve as the main source of noise in biological neuronal networks. However, the finiteness of the number of such noise sources constitutes a challenge to this idea. Here, we show that shared-noise correlations resulting from a finite number of independent noise sources can substantially impair the performance of stochastic network models. We demonstrate that this problem is naturally overcome by replacing the ensemble of independent noise sources by a deterministic recurrent neuronal network. By virtue of inhibitory feedback, such networks can generate small residual spatial correlations in their activity which, counter to intuition, suppress the detrimental effect of shared input. We exploit this mechanism to show that a single recurrent network of a few hundred neurons can serve as a natural noise source for a large ensemble of functional networks performing probabilistic computations, each comprising thousands of units.
Affiliation(s)
- Jakob Jordan
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain-Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physiology, University of Bern, Bern, Switzerland
- Mihai A Petrovici
- Department of Physiology, University of Bern, Bern, Switzerland
- Kirchhoff Institute for Physics, Ruprecht-Karls-University Heidelberg, Heidelberg, Germany
- Oliver Breitwieser
- Kirchhoff Institute for Physics, Ruprecht-Karls-University Heidelberg, Heidelberg, Germany
- Johannes Schemmel
- Kirchhoff Institute for Physics, Ruprecht-Karls-University Heidelberg, Heidelberg, Germany
- Karlheinz Meier
- Kirchhoff Institute for Physics, Ruprecht-Karls-University Heidelberg, Heidelberg, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain-Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain-Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
33
Rotermund D, Pawelzik KR. Back-Propagation Learning in Deep Spike-By-Spike Networks. Front Comput Neurosci 2019; 13:55. [PMID: 31456677 PMCID: PMC6700320 DOI: 10.3389/fncom.2019.00055] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2019] [Accepted: 07/24/2019] [Indexed: 01/26/2023] Open
Abstract
Artificial neural networks (ANNs) are important building blocks in technical applications. They rely on noiseless continuous signals, in stark contrast to the discrete action potentials stochastically exchanged among the neurons in real brains. We propose to bridge this gap with Spike-by-Spike (SbS) networks, which represent a compromise between non-spiking and spiking versions of generative models. What is missing, however, are algorithms for finding weight sets that would optimize the output performance of deep SbS networks with many layers. Here, a learning rule for feed-forward SbS networks is derived. The properties of this approach are investigated and its functionality is demonstrated by simulations. In particular, a deep convolutional SbS network for classifying handwritten digits achieves a classification performance of roughly 99.3% on the MNIST test data when the learning rule is applied together with an optimizer. Thereby it approaches the benchmark results of ANNs without extensive parameter optimization. We envision this learning rule for SbS networks to provide a new basis for research in neuroscience and for technical applications, especially when implemented on specialized computational hardware.
Affiliation(s)
- David Rotermund
- Institute for Theoretical Physics, University of Bremen, Bremen, Germany
34
Budak M, Zochowski M. Synaptic Failure Differentially Affects Pattern Formation in Heterogenous Networks. Front Neural Circuits 2019; 13:31. [PMID: 31139055 PMCID: PMC6519395 DOI: 10.3389/fncir.2019.00031] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2018] [Accepted: 04/11/2019] [Indexed: 02/05/2023] Open
Abstract
The communication of neurons is primarily maintained by synapses, which play a crucial role in the functioning of the nervous system. Therefore, synaptic failure may critically impair information processing in the brain and may underlie many neurodegenerative diseases. A number of studies have suggested that synaptic failure may preferentially target neurons with high connectivity (i.e., network hubs). As a result, the activity of these highly connected neurons can be significantly affected. It has been speculated that anesthetics regulate conscious state by affecting synaptic transmission at these network hubs and subsequently reducing overall coherence in the network activity. In addition, hubs in cortical networks are shown to be more vulnerable to amyloid deposition because of their higher activity within the network, causing a decrease in coherence patterns and eventually Alzheimer's disease (AD). Here, we investigate how synaptic failure can affect the spatio-temporal dynamics of scale-free networks, which have a power-law scaling of the number of connections per neuron: relatively few neurons (hubs) with many outgoing or incoming connections, and many cells with low connectivity. We studied two types of synaptic failure: activity-independent, and targeted, activity-dependent synaptic failure. We defined scale-free network structures based on the dominating direction of the connections at the hub neurons: incoming and outgoing. We found that the two structures have significantly different dynamical properties. We show that synaptic failure may not only lead to the loss of coherence but, counterintuitively, can also facilitate its emergence. We show that this is because activity-dependent synaptic failure homogenizes the activity levels in the network, creating a dynamical substrate for the observed coherence increase. The obtained results may lead to a better understanding of changes in large-scale pattern formation during the progression of neurodegenerative diseases targeting synaptic transmission.
Affiliation(s)
- Maral Budak
- Biophysics Program, University of Michigan, Ann Arbor, MI, United States
- Michal Zochowski
- Biophysics Program, University of Michigan, Ann Arbor, MI, United States
- Department of Physics, University of Michigan, Ann Arbor, MI, United States
35
An L, Tang Y, Wang Q, Pei Q, Wei R, Duan H, Liu JK. Coding Capacity of Purkinje Cells With Different Schemes of Morphological Reduction. Front Comput Neurosci 2019; 13:29. [PMID: 31156415 PMCID: PMC6530636 DOI: 10.3389/fncom.2019.00029] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/09/2019] [Accepted: 04/24/2019] [Indexed: 12/15/2022] Open
Abstract
The brain as a neuronal system has very complex structures with a large diversity of neuronal types. The most basic complexity is seen in the structure of neuronal morphology, which usually has a complex tree-like structure with dendritic spines distributed across branches. To simulate a large-scale network with spiking neurons, a simple point neuron, such as the integrate-and-fire neuron, is often used. However, recent experimental evidence suggests that the computational ability of a single neuron is largely enhanced by its morphological structure, in particular by various types of dendritic dynamics. As the morphology reduction of detailed biophysical models is a classic question in systems neuroscience, much effort has been taken to simulate a neuron with a few compartments that include the interaction between the soma and dendritic spines. Yet novel reduction methods are still needed to deal with the complex dendritic tree. Here, using 10 individual Purkinje cells of the cerebellum from three species (guinea pig, mouse, and rat), we consider four types of reduction methods and study their effects on the coding capacity of Purkinje cells in terms of firing rate, timing coding, spiking pattern, and modulated firing under different stimulation protocols. We found that reduction performance varies across individual cells and species; however, all reduction methods can preserve, to some degree, the firing activity of the full Purkinje cell model. Therefore, when simulating large-scale networks of neurons, one has to choose a proper type of reduced neuronal model depending on the questions addressed. Among these reduction schemes, the Branch method, which preserves the geometrical volume of the neuron, achieves the best balance among the performance measures of accuracy, simplification, and computational efficiency, and reproduces various phenomena shown in the full-morphology model of Purkinje cells. Altogether, these results suggest that the Branch reduction scheme may provide a general guideline for reducing complex morphologies to a few compartments without loss of the basic firing characteristics of the neuron.
Affiliation(s)
- Lingling An, Yuanhong Tang, Quan Wang, Qingqi Pei, Ran Wei, Huiyuan Duan
- School of Computer Science and Technology, Xidian University, Xi'an, China
- Jian K. Liu
- Department of Neuroscience, Psychology and Behaviour, Centre for Systems Neuroscience, University of Leicester, Leicester, United Kingdom
36
Bossy M, Fontbona J, Olivero H. Synchronization of stochastic mean field networks of Hodgkin-Huxley neurons with noisy channels. J Math Biol 2019; 78:1771-1820. [PMID: 30734076 DOI: 10.1007/s00285-019-01326-7] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2018] [Revised: 12/31/2018] [Indexed: 11/24/2022]
Abstract
In this work we are interested in a mathematical model of the collective behavior of a fully connected network of finitely many neurons, as the number of neurons and time go to infinity. We assume that every neuron follows a stochastic version of the Hodgkin-Huxley model, and that pairs of neurons interact through both electrical and chemical synapses, the global connectivity being of mean-field type. When the leak conductance is strictly positive, we prove that if the initial voltages are uniformly bounded and the electrical interaction between neurons is strong enough, then, uniformly in the number of neurons, the whole system synchronizes exponentially fast as time goes to infinity, up to some error controlled by (and vanishing with) the channel noise level. Moreover, we prove that if the random initial condition is exchangeable, the propagation-of-chaos property for this system holds on every bounded time interval (regardless of the interaction intensities). Combining these results, we deduce that the nonlinear McKean-Vlasov equation describing an infinite network of such neurons concentrates, as time goes to infinity, around the dynamics of a single Hodgkin-Huxley neuron with chemical neurotransmitter channels. Our results are illustrated and complemented with numerical simulations.
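The synchronization-by-electrical-coupling mechanism can be illustrated with two identical deterministic Hodgkin-Huxley point neurons joined by a gap junction. In this Python sketch (standard squid-axon parameters, forward-Euler integration, no channel noise or chemical synapses, unlike the paper's stochastic mean-field setting), the voltage difference between the two cells decays as they lock to a common spike train:

```python
import numpy as np

def vtrap(x, y):
    """x / (1 - exp(-x / y)) with the x -> 0 singularity handled."""
    with np.errstate(divide="ignore", invalid="ignore"):
        out = x / (1.0 - np.exp(-x / y))
    return np.where(np.abs(x / y) < 1e-7, y * (1.0 + x / (2.0 * y)), out)

def run_coupled_hh(g_c=0.5, i_ext=10.0, dt=0.01, t_max=200.0):
    """Two identical HH point neurons (standard squid-axon parameters)
    with gap-junction coupling g_c, the same constant drive, and
    mismatched initial voltages. Returns final V, |V1 - V2| trace, peak V."""
    g_na, e_na = 120.0, 50.0
    g_k, e_k = 36.0, -77.0
    g_l, e_l, c_m = 0.3, -54.4, 1.0
    v = np.array([-65.0, -55.0])                   # 10 mV initial mismatch
    m, h, n = np.full(2, 0.05), np.full(2, 0.6), np.full(2, 0.32)
    n_steps = int(t_max / dt)
    diff = np.empty(n_steps)
    v_peak = -np.inf
    for i in range(n_steps):
        a_m, b_m = 0.1 * vtrap(v + 40.0, 10.0), 4.0 * np.exp(-(v + 65.0) / 18.0)
        a_h, b_h = 0.07 * np.exp(-(v + 65.0) / 20.0), 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
        a_n, b_n = 0.01 * vtrap(v + 55.0, 10.0), 0.125 * np.exp(-(v + 65.0) / 80.0)
        i_ion = (g_na * m**3 * h * (v - e_na) + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        i_gap = g_c * (v[::-1] - v)                # current from the other cell
        v = v + dt * (i_ext - i_ion + i_gap) / c_m
        m = m + dt * (a_m * (1.0 - m) - b_m * m)
        h = h + dt * (a_h * (1.0 - h) - b_h * h)
        n = n + dt * (a_n * (1.0 - n) - b_n * n)
        diff[i] = abs(v[0] - v[1])
        v_peak = max(v_peak, float(v.max()))
    return v, diff, v_peak

v, diff, v_peak = run_coupled_hh()
early = float(diff[:5000].mean())                  # mean mismatch, first 50 ms
late = float(diff[-5000:].mean())                  # mean mismatch, last 50 ms
```

With these placeholder values both cells spike tonically and the late-time voltage mismatch should fall well below the initial 10 mV, the deterministic analogue of the exponential synchronization proved in the paper.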
Affiliation(s)
- Mireille Bossy
- INRIA Sophia Antipolis Méditerranée, Sophia Antipolis, France
- Joaquín Fontbona
- Department of Mathematical Engineering and Center for Mathematical Modeling, UMI(2807) UCHILE-CNRS, University of Chile, Santiago, Chile
- Héctor Olivero
- CIMFAV, Facultad de Ingeniería, Universidad de Valparaíso, Valparaíso, Chile
37
Seifozzakerini S, Yau WY, Mao K, Nejati H. Hough Transform Implementation For Event-Based Systems: Concepts and Challenges. Front Comput Neurosci 2018; 12:103. [PMID: 30622466 PMCID: PMC6308381 DOI: 10.3389/fncom.2018.00103] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2018] [Accepted: 12/05/2018] [Indexed: 11/13/2022] Open
Abstract
The Hough transform (HT) is one of the most well-known techniques in computer vision and has been the basis of many practical image processing algorithms. HT, however, is designed to work for frame-based systems such as conventional digital cameras. Recently, event-based systems such as Dynamic Vision Sensor (DVS) cameras have become popular among researchers. Event-based cameras have a very high temporal resolution (1 μs), but each pixel can only detect change, not color. As such, conventional image processing algorithms cannot be readily applied to event-based output streams, and it is necessary to adapt them for event-based cameras. This paper provides a systematic explanation, starting from extending the conventional HT to a 3D HT, its adaptation to event-based systems, and the implementation of the 3D HT using spiking neural networks (SNNs). Using SNNs enables the proposed solution to be easily realized in hardware using an FPGA, without requiring a CPU or additional memory. In addition, we discuss techniques for an optimal SNN-based implementation that uses an efficient number of neurons for the required accuracy and resolution along each dimension, without increasing the overall computational complexity. We hope that this will help to reduce the gap between event-based and frame-based systems.
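The event-by-event update at the heart of an event-based HT can be sketched without the SNN machinery: each incoming event votes along its sinusoid in the (rho, theta) accumulator, and a multiplicative decay forgets stale events. A simplified Python illustration (2D line detection only; the decay constant and accumulator resolutions are placeholders, not values from the paper):

```python
import numpy as np

def hough_vote(events, n_theta=180, rho_max=128, decay=0.98):
    """Incremental Hough accumulator for a stream of pixel events.
    Each event (x, y) votes for every line rho = x*cos(t) + y*sin(t)
    passing through it; a multiplicative decay forgets old events."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    acc = np.zeros((2 * rho_max, n_theta))
    for x, y in events:
        acc *= decay                               # event-driven forgetting
        rho = np.rint(x * cos_t + y * sin_t).astype(int) + rho_max
        acc[rho, np.arange(n_theta)] += 1.0
    return acc, thetas

# synthetic events along the horizontal line y = 20
events = [(x, 20) for x in range(0, 60, 2)]
acc, thetas = hough_vote(events)
rho_i, theta_i = np.unravel_index(int(acc.argmax()), acc.shape)
# peak at theta index 90 (pi/2) and rho = 20 (offset by rho_max)
```

In the SNN formulation, each accumulator bin corresponds to a spiking neuron whose membrane potential plays the role of the decaying vote count.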
Affiliation(s)
- Sajjad Seifozzakerini
- Institute for Infocomm Research, Agency for Science, Technology and Research (A*STAR), Singapore, Singapore
- School of Electrical and Electronic Engineering, Nanyang Technological University (NTU), Singapore, Singapore
- Wei-Yun Yau
- Institute for Infocomm Research, Agency for Science, Technology and Research (A*STAR), Singapore, Singapore
- Kezhi Mao
- School of Electrical and Electronic Engineering, Nanyang Technological University (NTU), Singapore, Singapore
- Hossein Nejati
- Information Systems Technology and Design (ISTD), Singapore University of Technology and Design (SUTD), Singapore, Singapore
38
Luccioli S, Angulo-Garcia D, Cossart R, Malvache A, Módol L, Sousa VH, Bonifazi P, Torcini A. Modeling driver cells in developing neuronal networks. PLoS Comput Biol 2018; 14:e1006551. [PMID: 30388120 PMCID: PMC6235603 DOI: 10.1371/journal.pcbi.1006551] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/06/2018] [Revised: 11/14/2018] [Accepted: 10/07/2018] [Indexed: 12/17/2022] Open
Abstract
Spontaneous emergence of synchronized population activity is a characteristic feature of developing brain circuits. Recent experiments in the developing neocortex showed the existence of driver cells able to impact the synchronization dynamics when single-handedly stimulated. We have developed a spiking network model capable of reproducing the experimental results, identifying two classes of driver cells: functional hubs and low functionally connected (LC) neurons. The functional hubs, arranged in a clique, orchestrated the synchronization build-up, while the LC drivers were recruited late in the synchronization process or not at all. Nevertheless, the LC drivers were able to alter the network state when stimulated, by modifying the temporal activation of the functional clique or even its composition. Upon stimulation, LC drivers can lead either to higher population synchrony or even to the arrest of population dynamics. Notably, some LC drivers can display both effects depending on the received stimulus. We show that in the model the presence of inhibitory neurons, together with the assumption that younger cells are more excitable and less connected, is crucial for the emergence of LC drivers. These results provide a further understanding of the structural-functional mechanisms underlying synchronized firing in developing circuits, possibly related to the coordinated activity of cell assemblies in the adult brain.
Affiliation(s)
- Stefano Luccioli
- CNR - Consiglio Nazionale delle Ricerche - Istituto dei Sistemi Complessi, Sesto Fiorentino, Italy
- INFN Sez. Firenze, Sesto Fiorentino, Italy
- David Angulo-Garcia
- Grupo de Modelado Computacional - Dinámica y Complejidad de Sistemas, Instituto de Matemáticas Aplicadas, Universidad de Cartagena, Cartagena de Indias, Colombia
- Rosa Cossart
- Aix Marseille Univ, INSERM, INMED, Marseille, France
- Laura Módol
- Aix Marseille Univ, INSERM, INMED, Marseille, France
- Paolo Bonifazi
- Biocruces Health Research Institute, Bilbao, Vizcaya, Spain
- Ikerbasque: The Basque Foundation for Science, Bilbao, Spain
- Alessandro Torcini
- CNR - Consiglio Nazionale delle Ricerche - Istituto dei Sistemi Complessi, Sesto Fiorentino, Italy
- Aix Marseille Univ, INSERM, INMED, Marseille, France
- Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, CNRS, UMR 8089, Cergy-Pontoise, France
39
Heiberg T, Kriener B, Tetzlaff T, Einevoll GT, Plesser HE. Firing-rate models for neurons with a broad repertoire of spiking behaviors. J Comput Neurosci 2018; 45:103-132. [PMID: 30146661 PMCID: PMC6208914 DOI: 10.1007/s10827-018-0693-9] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2018] [Revised: 08/01/2018] [Accepted: 08/02/2018] [Indexed: 11/29/2022]
Abstract
Capturing the response behavior of spiking neuron models with rate-based models facilitates the investigation of neuronal networks using powerful methods for rate-based network dynamics. To this end, we investigate the responses of two widely used neuron model types, the Izhikevich and augmented multi-adaptive threshold (AMAT) models, to a range of spiking inputs, from step responses to natural spike data. We find (i) that linear-nonlinear firing rate models fitted to test data can be used to describe the firing-rate responses of AMAT and Izhikevich spiking neuron models in many cases; (ii) that firing-rate responses are generally too complex to be captured by first-order low-pass filters but require bandpass filters instead; (iii) that linear-nonlinear models capture the response of AMAT models better than that of Izhikevich models; (iv) that the wide range of response types evoked by current-injection experiments collapses to a few response types when neurons are driven by stationary or sinusoidally modulated Poisson input; and (v) that AMAT and Izhikevich models show different responses to spike input despite identical responses to current injections. Together, these findings suggest that rate-based models of network dynamics may capture a wider range of neuronal response properties by incorporating second-order bandpass filters fitted to the responses of spiking model neurons. Such models may help bring rate-based network modeling closer to the reality of biological neuronal networks.
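Finding (ii), that a second-order bandpass filter followed by a static nonlinearity captures the rate response, corresponds to a simple linear-nonlinear cascade. A Python sketch of that model class (the filter parameters, gain, and rectifying nonlinearity are placeholders, not the fitted values from the paper):

```python
import numpy as np

def ln_rate(stim, dt=1e-3, f0=5.0, q=1.0, gain=50.0):
    """Linear-nonlinear cascade: stimulus -> second-order bandpass
    filter (damped oscillator with centre frequency f0 and quality
    factor q; the bandpass output is x') -> rectifying nonlinearity."""
    w0 = 2.0 * np.pi * f0
    x, xdot = 0.0, 0.0
    rate = np.empty_like(stim)
    for i, s in enumerate(stim):
        # x'' + (w0/q) x' + w0^2 x = w0 s
        xdd = w0 * s - (w0 / q) * xdot - w0**2 * x
        xdot += xdd * dt                    # semi-implicit Euler step
        x += xdot * dt
        rate[i] = gain * max(xdot, 0.0)     # rectified output (a.u.)
    return rate

t = np.arange(0.0, 2.0, 1e-3)
rate = ln_rate(np.sin(2.0 * np.pi * 5.0 * t))  # drive at the centre frequency
```

Fitting such a cascade to the spiking model's measured rate response, rather than choosing the parameters by hand as here, is the procedure the paper evaluates.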
Affiliation(s)
- Thomas Heiberg
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Birgit Kriener
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, Jülich, Germany
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Jülich, Germany
- JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Gaute T Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
- Hans E Plesser
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Institute of Neuroscience and Medicine (INM-6), Jülich Research Centre, Jülich, Germany
40
Ge M, Xu Y, Lu L, Zhao Y, Yang L, Zhan X, Gao K, Li A, Jia Y. Effect of external periodic signals and electromagnetic radiation on autaptic regulation of neuronal firing. IET Syst Biol 2018; 12:177-184. [PMID: 33451180 DOI: 10.1049/iet-syb.2017.0069] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2017] [Revised: 03/22/2018] [Accepted: 03/25/2018] [Indexed: 01/20/2023] Open
Abstract
An improved Hindmarsh-Rose (HR) neuron model, where the memristor is a bridge between membrane potential and magnetic flux, can be used to investigate the effect of periodic signals on autaptic regulation of neurons under electromagnetic radiation. Based on the improved HR model driven by periodic high-low-frequency current and electromagnetic radiation, the responses of electrical autaptic regulation with diverse high-low-frequency signals are investigated using bifurcation analysis. It is found that the electrical modes of neurons are determined by the selecting parameters of both periodic high and low-frequency current and electromagnetic radiation, and the Hamiltonian energy depends on the neuronal firing modes. The effects of Gaussian white noise on the membrane potential are discussed using numerical simulations. It is demonstrated that external high-low-frequency stimulus plays a significant role in the autaptic regulation of neural firing mode, and the electrical mode of neurons can be affected by the angular frequency of both high-low-frequency forcing current and electromagnetic radiation. The mechanism of neuronal firing regulated by high-low-frequency signal and electromagnetic radiation discussed here could be applied to research neuronal networks and synchronisation modes.
Affiliation(s)
- Mengyan Ge, Ying Xu, Lulu Lu, Yunjie Zhao, Lijian Yang, Xuan Zhan, Kaifu Gao, Anbang Li, Ya Jia: Department of Physics and Institute of Biophysics, Central China Normal University, Wuhan, 430079, People's Republic of China
41
John RA, Liu F, Chien NA, Kulkarni MR, Zhu C, Fu Q, Basu A, Liu Z, Mathews N. Synergistic Gating of Electro-Iono-Photoactive 2D Chalcogenide Neuristors: Coexistence of Hebbian and Homeostatic Synaptic Metaplasticity. Adv Mater 2018; 30:e1800220. [PMID: 29726076] [DOI: 10.1002/adma.201800220]
Abstract
Emulation of brain-like signal processing with thin-film devices can lay the foundation for building artificially intelligent learning circuitry in the future. Encompassing higher functionalities into single artificial neural elements will allow the development of robust neuromorphic circuitry emulating biological adaptation mechanisms with far fewer neural elements, mitigating the strict process challenges and high circuit density requirements necessary to match the computational complexity of the human brain. Here, 2D transition metal di-chalcogenide (MoS2) neuristors are designed to mimic intracellular ion endocytosis-exocytosis dynamics/neurotransmitter-release in chemical synapses using three approaches: (i) electronic mode: a defect modulation approach where the traps at the semiconductor-dielectric interface are perturbed; (ii) ionotronic mode: where electronic responses are modulated via ionic gating; and (iii) photoactive mode: harnessing persistent photoconductivity or trap-assisted slow recombination mechanisms. Exploiting a novel multigated architecture incorporating electrical and optical biases, this incarnation not only addresses different charge-trapping probabilities to finely modulate the synaptic weights, but also amalgamates neuromodulation schemes to achieve "plasticity of plasticity," i.e., metaplasticity, via dynamic control of Hebbian spike-time-dependent plasticity and homeostatic regulation. Coexistence of such multiple forms of synaptic plasticity increases the efficacy of memory storage and the processing capacity of artificial neuristors, enabling the design of highly efficient novel neural architectures.
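The coexistence of Hebbian STDP and homeostatic regulation that these neuristors emulate can be illustrated in software with a hypothetical pair-based STDP rule plus multiplicative weight normalization; all constants below are assumed for illustration and have nothing to do with the device physics.

```python
import math

# Pair-based STDP combined with multiplicative homeostatic scaling
# (illustrative constants; not parameters from the paper).
A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0  # amplitudes and time constant (ms)

def stdp_dw(delta_t):
    """Weight change for one spike pair; delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:                            # pre before post -> potentiation
        return A_PLUS * math.exp(-delta_t / TAU)
    return -A_MINUS * math.exp(delta_t / TAU)  # post before pre -> depression

weights = [1.0] * 5
target_total = sum(weights)                    # homeostatic set point

for _ in range(10):
    weights[0] += stdp_dw(+5.0)                # synapse 0: repeatedly potentiated
    weights[1] += stdp_dw(-5.0)                # synapse 1: repeatedly depressed

# Homeostatic regulation: rescale so the total input weight is conserved,
# preserving the relative (Hebbian) differences between synapses.
scale = target_total / sum(weights)
weights = [w * scale for w in weights]
```

After scaling, the total weight equals the set point while synapse 0 still outweighs the untouched synapses, which in turn outweigh synapse 1, i.e., the Hebbian ordering survives the homeostatic constraint.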
Affiliation(s)
- Rohit Abraham John, Fucai Liu, Nguyen Anh Chien, Mohit R Kulkarni, Chao Zhu, Qundong Fu, Zheng Liu: School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Arindam Basu: School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798
- Nripan Mathews: School of Materials Science and Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore, 639798; Energy Research Institute @ NTU (ERI@N), Nanyang Technological University, Singapore, 637553
42
Mankin R, Paekivi S. Memory-induced resonancelike suppression of spike generation in a resonate-and-fire neuron model. Phys Rev E 2018; 97:012125. [PMID: 29448468] [DOI: 10.1103/PhysRevE.97.012125]
Abstract
The behavior of a stochastic resonate-and-fire neuron model based on a reduction of a fractional noise-driven generalized Langevin equation (GLE) with a power-law memory kernel is considered. The effect of temporally correlated random activity of synaptic inputs, which arise from other neurons forming local and distant networks, is modeled as an additive fractional Gaussian noise in the GLE. Using a first-passage-time formulation, exact expressions for the output interspike-interval (ISI) density and for the survival probability (the probability that a spike is not generated) are derived in certain system parameter domains, and their dependence on the input parameters, especially on the memory exponent, is analyzed. In the case of external white noise, it is shown that at intermediate values of the memory exponent the survival probability is significantly enhanced in comparison with the cases of strong and weak memory, which causes a resonancelike suppression of the probability of spike generation as a function of the memory exponent. Moreover, an examination of the dependence of multimodality in the ISI distribution on input parameters shows that there exists a critical memory exponent α_c ≈ 0.402, which marks a dynamical transition in the behavior of the system. That phenomenon is illustrated by a phase diagram describing the emergence of three qualitatively different structures of the ISI distribution. Similarities and differences between the behavior of the model under internal and external noise are also discussed.
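For readers unfamiliar with the model class, a minimal deterministic resonate-and-fire neuron (without the memory kernel and fractional noise that are the paper's actual subject) behaves as follows: two input kicks separated by the resonance period sum constructively and trigger a spike, while off-resonance kicks do not. Parameter values are assumptions for illustration.

```python
import math

# Minimal resonate-and-fire neuron: the subthreshold state is a damped
# rotation in the complex plane; a spike fires when Im(z) crosses THETA.
B, W = -0.02, 0.2        # damping (1/ms) and angular frequency (rad/ms)
THETA, DT = 0.45, 0.1    # spike threshold on Im(z); time step (ms)

def count_spikes(interval_ms, kick=0.4, t_end=200.0):
    """Deliver two input kicks separated by interval_ms; count output spikes."""
    z = 0j
    kick_steps = {round(10.0 / DT), round((10.0 + interval_ms) / DT)}
    spikes = 0
    for step in range(int(t_end / DT)):
        z += DT * (B + 1j * W) * z      # damped subthreshold oscillation
        if step in kick_steps:
            z += kick                   # input pulse displaces the state
        if z.imag >= THETA:
            spikes += 1
            z = 0j                      # reset after the spike
    return spikes

resonant = count_spikes(2 * math.pi / W)   # kicks one full period apart
off_resonant = count_spikes(math.pi / W)   # kicks half a period apart
```

This frequency selectivity of the deterministic skeleton is what the memory kernel and correlated noise in the paper act upon.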
Affiliation(s)
- Romi Mankin, Sander Paekivi: School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
43
Ashida G, Tollin DJ, Kretzberg J. Physiological models of the lateral superior olive. PLoS Comput Biol 2017; 13:e1005903. [PMID: 29281618] [PMCID: PMC5744914] [DOI: 10.1371/journal.pcbi.1005903]
Abstract
In computational biology, modeling is a fundamental tool for formulating, analyzing and predicting complex phenomena. Most neuron models, however, are designed to reproduce certain small sets of empirical data. Hence their outcome is usually not compatible or comparable with other models or datasets, making it unclear how widely applicable such models are. In this study, we investigate these aspects of modeling, namely credibility and generalizability, with a specific focus on auditory neurons involved in the localization of sound sources. The primary cues for binaural sound localization are comprised of interaural time and level differences (ITD/ILD), which are the timing and intensity differences of the sound waves arriving at the two ears. The lateral superior olive (LSO) in the auditory brainstem is one of the locations where such acoustic information is first computed. An LSO neuron receives temporally structured excitatory and inhibitory synaptic inputs that are driven by ipsi- and contralateral sound stimuli, respectively, and changes its spike rate according to binaural acoustic differences. Here we examine seven contemporary models of LSO neurons with different levels of biophysical complexity, from predominantly functional ones (‘shot-noise’ models) to those with more detailed physiological components (variations of integrate-and-fire and Hodgkin-Huxley-type). These models, calibrated to reproduce known monaural and binaural characteristics of LSO, generate largely similar results to each other in simulating ITD and ILD coding. Our comparisons of physiological detail, computational efficiency, predictive performances, and further expandability of the models demonstrate (1) that the simplistic, functional LSO models are suitable for applications where low computational costs and mathematical transparency are needed, (2) that more complex models with detailed membrane potential dynamics are necessary for simulation studies where sub-neuronal nonlinear processes play important roles, and (3) that, for general purposes, intermediate models might be a reasonable compromise between simplicity and biological plausibility.

Computational models help our understanding of complex biological systems by identifying their key elements and revealing their operational principles. Close comparisons between model predictions and empirical observations ensure our confidence in a model as a building block for further applications. Most current neuronal models, however, are constructed to replicate only a small specific set of experimental data. Thus, it is usually unclear how these models can be generalized to different datasets and how they compare with each other. In this paper, seven neuronal models are examined that are designed to reproduce known physiological characteristics of auditory neurons involved in the detection of sound source location. Despite their different levels of complexity, the models generate largely similar results when their parameters are tuned with common criteria. Comparisons show that simple models are computationally more efficient and theoretically transparent, and therefore suitable for rigorous mathematical analyses and engineering applications including real-time simulations. In contrast, complex models are necessary for investigating the relationship between underlying biophysical processes and sub- and suprathreshold spiking properties, although they have a large number of unconstrained, unverified parameters. Having identified their advantages and drawbacks, these auditory neuron models may readily be used for future studies and applications.
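As a toy illustration of the intermediate (integrate-and-fire) model class discussed above, the sketch below drives a LIF unit with an ipsilateral excitatory and a contralateral inhibitory Poisson stream, so that the output rate falls as the contralateral level rises. All rates, weights and thresholds are invented for illustration and are not calibrated to LSO data.

```python
import random

# Toy LIF caricature of an LSO cell: ipsilateral excitation and contralateral
# inhibition arrive as Poisson event streams; stronger contralateral input
# (larger ILD toward the inhibitory ear) suppresses the output rate.
def lso_spike_count(exc_hz, inh_hz, t_ms=1000.0, dt=0.1, seed=1):
    rng = random.Random(seed)
    tau, theta = 10.0, 8.0          # membrane time constant (ms), threshold (mV)
    w_exc, w_inh = 0.5, 0.5         # PSP amplitudes (mV), assumed values
    v, spikes = 0.0, 0
    for _ in range(int(t_ms / dt)):
        v += -v * dt / tau                       # leak
        if rng.random() < exc_hz * dt / 1000.0:  # ipsilateral excitatory event
            v += w_exc
        if rng.random() < inh_hz * dt / 1000.0:  # contralateral inhibitory event
            v -= w_inh
        if v >= theta:
            spikes += 1
            v = 0.0
    return spikes

quiet_contra = lso_spike_count(exc_hz=2000.0, inh_hz=0.0)
loud_contra = lso_spike_count(exc_hz=2000.0, inh_hz=1500.0)
```

With a fixed seed, the run with strong contralateral inhibition yields far fewer output spikes, the qualitative ILD dependence that all seven models in the paper reproduce.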
Affiliation(s)
- Go Ashida, Jutta Kretzberg: Cluster of Excellence "Hearing4all", Department of Neuroscience, University of Oldenburg, Oldenburg, Germany
- Daniel J Tollin: Department of Physiology and Biophysics, University of Colorado School of Medicine, Aurora, Colorado, United States of America
44
Shomali SR, Ahmadabadi MN, Shimazaki H, Rasuli SN. How does transient signaling input affect the spike timing of postsynaptic neuron near the threshold regime: an analytical study. J Comput Neurosci 2017; 44:147-171. [PMID: 29192377] [PMCID: PMC5851711] [DOI: 10.1007/s10827-017-0664-6]
Abstract
The noisy threshold regime, where even a small set of presynaptic neurons can significantly affect postsynaptic spike timing, is suggested as a key requisite for computation in neurons with high variability. It has also been proposed that signals under noisy conditions are successfully transferred by a few strong synapses and/or by an assembly of nearly synchronous synaptic activities. We analytically investigate the impact of a transient signaling input on a leaky integrate-and-fire postsynaptic neuron that receives background noise near the threshold regime. The signaling input models a single strong synapse or a set of synchronous synapses, while the background noise represents many weak synapses. We find an analytic solution that explains how the first-passage-time (ISI) density is changed by the transient signaling input. The analysis allows us to connect properties of the signaling input, such as spike timing and amplitude, with the postsynaptic first-passage-time density in a noisy environment. Based on the analytic solution, we calculate the Fisher information with respect to the signaling input's amplitude. For a wide range of amplitudes, we observe non-monotonic behavior of the Fisher information as a function of background noise. Moreover, the Fisher information depends non-trivially on the signaling input's amplitude; varying the amplitude, we observe a single maximum at high levels of background noise, which splits into two maxima in the low-noise regime. This finding demonstrates the benefit of the analytic solution in investigating signal transfer by neurons.
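The setting analyzed above can be caricatured deterministically: a LIF neuron held just below threshold by background drive spikes only when the transient signaling input arrives. The paper's stochastic background is omitted here so the example is reproducible, and all values are illustrative.

```python
# LIF neuron held just below threshold by a constant drive; a single strong
# "signaling" EPSP at t = 50 ms decides whether and when a spike occurs.
def first_spike_time(epsp_mv, epsp_t=50.0, t_ms=200.0, dt=0.01):
    tau, theta, drive = 10.0, 10.0, 0.9   # ms, mV, mV/ms (v_inf = 9 mV < theta)
    v = 0.0
    for step in range(int(t_ms / dt)):
        t = step * dt
        v += dt * (-v / tau + drive)       # leaky integration of background drive
        if abs(t - epsp_t) < dt / 2:
            v += epsp_mv                   # the transient signaling input
        if v >= theta:
            return t                       # first-passage (spike) time
    return None                            # no spike within the window

t_with = first_spike_time(2.0)    # strong synchronous input -> spike
t_without = first_spike_time(0.0) # background drive alone stays subthreshold
```

Near threshold the single strong EPSP fully determines the first-passage time; in the paper, the background noise smears this into the analytically derived ISI density.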
Affiliation(s)
- Safura Rashid Shomali: School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5746 (1954851167), Tehran, Iran
- Majid Nili Ahmadabadi: Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, 14395-515, Iran
- Hideaki Shimazaki: Graduate School of Informatics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto, 606-8501, Japan; Honda Research Institute Japan, Honcho 8-1, Wako-shi, Saitama, 351-0188, Japan
- Seyyed Nader Rasuli: Department of Physics, University of Guilan, Rasht, 41335-1914, Iran; School of Physics, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran, Iran
45
McDonnell MD, Graham BP. Phase changes in neuronal postsynaptic spiking due to short term plasticity. PLoS Comput Biol 2017; 13:e1005634. [PMID: 28937977] [PMCID: PMC5627952] [DOI: 10.1371/journal.pcbi.1005634]
Abstract
In the brain, the postsynaptic response of a neuron to time-varying inputs is determined by the interaction of presynaptic spike times with the short-term dynamics of each synapse. For a neuron driven by stochastic synapses, synaptic depression results in a quite different postsynaptic response to a large population input depending on how correlated in time the spikes across individual synapses are. Here we show using both simulations and mathematical analysis that not only the rate but the phase of the postsynaptic response to a rhythmic population input varies as a function of synaptic dynamics and synaptic configuration. Resultant phase leads may compensate for transmission delays and be predictive of rhythmic changes. This could be particularly important for sensory processing and motor rhythm generation in the nervous system.

The synapses that connect neurons in the brain are far from being simple relay points that pass a signal from one neuron to another. There is now much evidence that long term changes in the strength of such connections, which determines the amplitude of the received signal, underpin learning and memory in the brain. However, signal amplitudes also fluctuate on fast time scales of milliseconds to seconds due to a variety of particular presynaptic mechanisms that regulate the release of neurotransmitter from the presynaptic terminal. Understanding the signal filtering properties of this short-term plasticity (STP) is a challenge and requires theoretical models. Aspects such as rate filtering and information transfer have been studied. Here we explore the effects of STP on the phase of a receiving neuron’s response to oscillating input and show that short-term depression can result in a frequency-dependent phase lead. This may be particularly important in the processing of rhythmic visual and auditory signals and producing rhythmic motor outputs.
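Short-term depression of this kind is commonly captured by a Tsodyks-Markram style resource model. The sketch below, with assumed constants rather than the paper's parameter set, shows the rate-dependent attenuation that, for time-varying rates, produces the phase effects discussed above.

```python
import math

# Tsodyks-Markram style synaptic depression driven by a regular presynaptic
# train: each spike releases a fraction U of the available resources x, which
# then recover with time constant TAU_REC. Parameter values are illustrative.
U, TAU_REC = 0.4, 300.0   # release fraction, recovery time constant (ms)

def psc_amplitudes(rate_hz, n_spikes=30):
    """Relative PSC amplitude for each spike in a regular train."""
    isi = 1000.0 / rate_hz
    x, amps = 1.0, []
    for _ in range(n_spikes):
        amps.append(U * x)                              # release from pool
        x -= U * x                                      # depletion at the spike
        x = 1.0 - (1.0 - x) * math.exp(-isi / TAU_REC)  # recovery until next spike
    return amps

slow = psc_amplitudes(10.0)   # low-rate train: mild depression
fast = psc_amplitudes(50.0)   # high-rate train: strong depression
```

Both trains start from the same first-spike amplitude, but the high-rate train depresses to a much smaller steady state; when the presynaptic rate oscillates, this rate sensitivity is what advances the phase of the postsynaptic response.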
Affiliation(s)
- Mark D. McDonnell: Computational Learning Systems Laboratory, School of Information Technology and Mathematical Sciences, University of South Australia, Mawson Lakes, Australia
- Bruce P. Graham: Computing Science & Mathematics, Faculty of Natural Sciences, University of Stirling, Stirling, United Kingdom
46
Wong-Lin K, Wang DH, Moustafa AA, Cohen JY, Nakamura K. Toward a multiscale modeling framework for understanding serotonergic function. J Psychopharmacol 2017; 31:1121-1136. [PMID: 28417684] [PMCID: PMC5606304] [DOI: 10.1177/0269881117699612]
Abstract
Despite its importance in regulating emotion and mental wellbeing, the complex structure and function of the serotonergic system present formidable challenges to understanding its mechanisms. In this paper, we review studies investigating the interactions between serotonergic and related brain systems and their behavior at multiple scales, with a focus on biologically based computational modeling. We first discuss serotonergic intracellular signaling and neuronal excitability, followed by the neuronal circuit and systems levels. At each level of organization, we discuss the experimental work alongside related computational modeling work. We then suggest that a multiscale modeling approach that integrates the various levels of neurobiological organization could potentially transform the way we understand the complex functions associated with serotonin.
Affiliation(s)
- KongFatt Wong-Lin: Intelligent Systems Research Centre, School of Computing and Intelligent Systems, University of Ulster, Magee Campus, Derry~Londonderry, UK
- Da-Hui Wang: School of Systems Science, and National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- Ahmed A Moustafa: School of Social Sciences and Psychology, and Marcs Institute for Brain and Behaviour, University of Western Sydney, Sydney, Australia
- Jeremiah Y Cohen: Solomon H. Snyder Department of Neuroscience, Brain Science Institute, Johns Hopkins University School of Medicine, Baltimore, USA
- Kae Nakamura: Department of Physiology, Kansai Medical University, Hirakata, Osaka, Japan
47
Tan TL, Cheong SA. Statistical complexity is maximized in a small-world brain. PLoS One 2017; 12:e0183918. [PMID: 28850587] [PMCID: PMC5574548] [DOI: 10.1371/journal.pone.0183918]
Abstract
In this paper, we study a network of Izhikevich neurons to explore what it means for a brain to be at the edge of chaos. To do so, we first constructed the phase diagram of a single Izhikevich excitatory neuron, and identified a small region of the parameter space where we find a large number of phase boundaries to serve as our edge of chaos. We then couple the outputs of these neurons directly to the parameters of other neurons, so that the neuron dynamics can drive transitions from one phase to another on an artificial energy landscape. Finally, we measure the statistical complexity of the parameter time series, while the network is tuned from a regular network to a random network using the Watts-Strogatz rewiring algorithm. We find that the statistical complexity of the parameter dynamics is maximized when the neuron network is most small-world-like. Our results suggest that the small-world architecture of neuron connections in brains is not accidental, but may be related to the information processing that they do.
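A single Izhikevich neuron, the building block of the network above, integrates in a few lines (regular-spiking parameters from the standard formulation of the model; the phase-diagram construction and Watts-Strogatz rewiring of the paper are not reproduced here).

```python
# Single Izhikevich neuron, regular-spiking parameter set, Euler integration.
a, b, c, d = 0.02, 0.2, -65.0, 8.0   # recovery rate, sensitivity, reset, jump
I = 10.0                              # constant driving current (assumed value)
dt, t_ms = 0.25, 1000.0

v, u = -65.0, b * -65.0
n_spikes, v_rec = 0, []
for _ in range(int(t_ms / dt)):
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)  # membrane potential
    u += dt * a * (b * v - u)                           # recovery variable
    if v >= 30.0:
        v_rec.append(30.0)            # clip the spike peak at the cutoff
        v, u = c, u + d               # after-spike reset
        n_spikes += 1
    else:
        v_rec.append(v)
```

In the paper, the parameters (a, b, c, d) of each neuron are themselves driven by the outputs of connected neurons, so that the network wanders over an artificial energy landscape; this sketch only shows the tonic firing of one fixed parameter point.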
Affiliation(s)
- Teck Liang Tan, Siew Ann Cheong: Division of Physics and Applied Physics, School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, Singapore 637371, Republic of Singapore; Complexity Institute, Nanyang Technological University, Block 2 Innovation Centre, Level 2 Unit 245, 18 Nanyang Drive, Singapore 637723, Republic of Singapore
48
A real-time FPGA implementation of a biologically inspired central pattern generator network. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2017.03.028]
49
Einarsson H, Gauy MM, Lengler J, Steger A. A Model of Fast Hebbian Spike Latency Normalization. Front Comput Neurosci 2017; 11:33. [PMID: 28555102] [PMCID: PMC5430963] [DOI: 10.3389/fncom.2017.00033]
Abstract
Hebbian changes of excitatory synapses are driven by and enhance correlations between pre- and postsynaptic neuronal activations, forming a positive feedback loop that can lead to instability in simulated neural networks. Because Hebbian learning may occur on time scales of seconds to minutes, it is conjectured that some form of fast stabilization of neural firing is necessary to avoid runaway excitation, but both the theoretical underpinning and the biological implementation of such a homeostatic mechanism remain to be fully investigated. Supported by analytical and computational arguments, we show that a Hebbian spike-timing-dependent metaplasticity rule accounts for inherently stable, quick tuning of the total input weight of a single neuron in the general scenario of asynchronous neural firing characterized by UP and DOWN states of activity.
Affiliation(s)
- Hafsteinn Einarsson, Marcelo M. Gauy, Johannes Lengler: Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland
- Angelika Steger: Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich, Switzerland; Collegium Helveticum, Zurich, Switzerland
50
Exact firing time statistics of neurons driven by discrete inhibitory noise. Sci Rep 2017; 7:1577. [PMID: 28484244] [PMCID: PMC5431561] [DOI: 10.1038/s41598-017-01658-8]
Abstract
Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory presynaptic neurons, which determines the firing activity of the stimulated neuron. In order to investigate the influence of inhibitory stimulation on the firing time statistics, we consider leaky integrate-and-fire neurons subject to inhibitory instantaneous postsynaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude, but apply as well to finite-amplitude postsynaptic potentials, thus capturing the effect of rare and large spikes. The developed methods are also able to reproduce the average firing properties of heterogeneous neuronal populations.
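A minimal simulation of this setting, a perfect integrator with constant excitatory drift and finite-amplitude inhibitory Poisson kicks (the leak is omitted and all values are assumptions), produces an irregular spike train whose ISI statistics could be compared against exact results of the kind reported above.

```python
import random

# Integrate-and-fire neuron with constant excitatory drift and discrete
# inhibitory post-synaptic potentials arriving as a Poisson process.
random.seed(42)
THETA, MU = 10.0, 0.5        # threshold (mV), excitatory drift (mV/ms)
INH_RATE, IPSP = 0.1, 2.0    # inhibitory events per ms, IPSP amplitude (mV)
DT, T_MS = 0.05, 5000.0

v, last_spike, isis = 0.0, 0.0, []
for step in range(int(T_MS / DT)):
    t = step * DT
    v += MU * DT                          # deterministic excitatory drift
    if random.random() < INH_RATE * DT:   # finite-amplitude inhibitory kick
        v -= IPSP
    if v >= THETA:
        isis.append(t - last_spike)       # record the interspike interval
        last_spike, v = t, 0.0

mean_isi = sum(isis) / len(isis)
var_isi = sum((i - mean_isi) ** 2 for i in isis) / len(isis)
cv = var_isi ** 0.5 / mean_isi            # coefficient of variation
```

The net drift is MU − INH_RATE·IPSP = 0.3 mV/ms, so the mean ISI should sit near THETA/0.3 ≈ 33 ms, with a nonzero CV contributed entirely by the discrete inhibitory shot noise.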