1
Chialva U, González Boscá V, Rotstein HG. Low-dimensional models of single neurons: a review. Biol Cybern 2023; 117:163-183. [PMID: 37060453] [DOI: 10.1007/s00422-023-00960-1]
Abstract
The classical Hodgkin-Huxley (HH) point-neuron model of action potential generation is four-dimensional. It consists of four ordinary differential equations describing the dynamics of the membrane potential and of three gating variables associated with a transient sodium current and a delayed-rectifier potassium current. Conductance-based models of HH type are higher-dimensional extensions of the classical HH model. They include additional state variables associated with other ionic current types and can describe additional phenomena such as subthreshold oscillations, mixed-mode oscillations (subthreshold oscillations interspersed with spikes), clustering and bursting. In this manuscript we discuss biophysically plausible and phenomenological reduced models that preserve the biophysical and/or dynamic description of models of HH type and the ability to produce complex phenomena, while requiring fewer effective dimensions (state variables). We describe several representative models, as well as systematic and heuristic methods for deriving reduced models from models of HH type.
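The four-dimensional structure described above can be sketched directly. The following is a minimal forward-Euler integration of the classical HH equations with the standard squid-axon parameter set; the injected current `I_ext`, step size, and simulation length are illustrative choices, not values taken from the review.

```python
import numpy as np

# Classical 4D Hodgkin-Huxley point-neuron model: membrane potential V plus
# gating variables m, h (transient sodium) and n (delayed-rectifier potassium).
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3        # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4              # reversal potentials, mV

def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T, I_ext = 0.01, 50.0, 10.0               # ms, ms, uA/cm^2 (illustrative)
V, m, h, n = -65.0, 0.05, 0.6, 0.32
Vs = []
for _ in range(int(T / dt)):
    INa = gNa * m**3 * h * (V - ENa)          # transient sodium current
    IK = gK * n**4 * (V - EK)                 # delayed-rectifier potassium current
    IL = gL * (V - EL)                        # leak current
    V += dt * (I_ext - INa - IK - IL) / C
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    Vs.append(V)

spiking = max(Vs) > 0.0                       # sustained input elicits spikes
```

Reduced models of the kind the review surveys replace the (m, h, n) subsystem with one or two effective variables while keeping the same voltage trace phenomenology.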
Affiliation(s)
- Ulises Chialva
- Departamento de Matemática, Universidad Nacional del Sur and CONICET, Bahía Blanca, Buenos Aires, Argentina
- Horacio G Rotstein
- Federated Department of Biological Sciences, New Jersey Institute of Technology and Rutgers University, Newark, New Jersey, USA.
- Behavioral Neurosciences Program, Rutgers University, Newark, NJ, USA.
- Corresponding Investigators Group, CONICET, Buenos Aires, Argentina.
2
Reconstruction of sparse recurrent connectivity and inputs from the nonlinear dynamics of neuronal networks. J Comput Neurosci 2023; 51:43-58. [PMID: 35849304] [DOI: 10.1007/s10827-022-00831-x]
Abstract
Reconstructing the recurrent structural connectivity of neuronal networks is a challenge crucial to address in characterizing neuronal computations. While directly measuring the detailed connectivity structure is generally prohibitive for large networks, we develop a novel framework for reverse-engineering large-scale recurrent network connectivity matrices from neuronal dynamics by utilizing the widespread sparsity of neuronal connections. We derive a linear input-output mapping that underlies the irregular dynamics of a model network composed of both excitatory and inhibitory integrate-and-fire neurons with pulse coupling, thereby relating network inputs to evoked neuronal activity. Using this embedded mapping and experimentally feasible measurements of the firing rate as well as voltage dynamics in response to a relatively small ensemble of random input stimuli, we efficiently reconstruct the recurrent network connectivity via compressive sensing techniques. Through analogous analysis, we then recover high dimensional natural stimuli from evoked neuronal network dynamics over a short time horizon. This work provides a generalizable methodology for rapidly recovering sparse neuronal network data and underlines the natural role of sparsity in facilitating the efficient encoding of network data in neuronal dynamics.
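The recovery step described above, solving an underdetermined linear mapping under a sparsity prior, can be illustrated with a generic compressive-sensing sketch. The measurement matrix `A`, the sparse vector `x_true`, the problem sizes, and the plain ISTA solver below are synthetic stand-ins for the paper's embedded network mapping.

```python
import numpy as np

# Sparse recovery from few linear measurements y = A x via l1-regularized
# least squares, solved with ISTA (gradient step + soft threshold).
rng = np.random.default_rng(0)
n, m, k = 200, 60, 8                       # unknowns, measurements, nonzeros
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.normal(0.0, 1.0, k)      # sparse "connectivity" vector
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))
y = A @ x_true                             # far fewer measurements than unknowns

lam = 0.01                                 # l1 penalty weight (illustrative)
L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    x = x - (A.T @ (A @ x - y)) / L        # gradient step on the data term
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

The paper's contribution is deriving such a linear input-output mapping from the nonlinear spiking dynamics in the first place; once it exists, standard solvers of this kind recover the sparse unknowns from relatively few stimuli.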
3
Barkdoll K, Lu Y, Barranca VJ. New insights into binocular rivalry from the reconstruction of evolving percepts using model network dynamics. Front Comput Neurosci 2023; 17:1137015. [PMID: 37034441] [PMCID: PMC10079880] [DOI: 10.3389/fncom.2023.1137015]
Abstract
When the two eyes are presented with highly distinct stimuli, the resulting visual percept generally switches every few seconds between the two monocular images in an irregular fashion, giving rise to a phenomenon known as binocular rivalry. While a host of theoretical studies have explored potential mechanisms for binocular rivalry in the context of evoked model dynamics in response to simple stimuli, here we investigate binocular rivalry directly through complex stimulus reconstructions based on the activity of a two-layer neuronal network model with competing downstream pools driven by disparate monocular stimuli composed of image pixels. To estimate the dynamic percept, we derive a linear input-output mapping rooted in the non-linear network dynamics and iteratively apply compressive sensing techniques for signal recovery. Utilizing a dominance metric, we identify when percept alternations occur and use data collected during each dominance period to generate a sequence of percept reconstructions. We show that despite the approximate nature of the input-output mapping and the significant reduction in the number of downstream neurons relative to stimulus pixels, the dominant monocular image is well encoded in the network dynamics, and improvements are garnered when realistic spatial receptive field structure is incorporated into the feedforward connectivity. Our model demonstrates gamma-distributed dominance durations and obeys Levelt's four laws governing how dominance durations change with stimulus strength, agreeing with key recurring experimental observations often used to benchmark rivalry models. In light of evidence that individuals with autism exhibit relatively slow percept switching in binocular rivalry, we corroborate the widespread hypothesis that autism manifests from reduced inhibition in the brain by systematically probing our model's alternation rate across choices of inhibition strength. We exhibit sufficient conditions for producing binocular rivalry in the context of natural scene stimuli, opening a clearer window into the dynamic brain computations that vary with the generated percept and a potential path toward further understanding neurological disorders.
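The alternation mechanism itself can be sketched with a standard two-pool rate reduction (not the paper's two-layer spiking network): mutual inhibition produces dominance, and slow adaptation of the dominant pool eventually hands dominance to the other. All parameter values below are illustrative.

```python
import numpy as np

# Two rate units with mutual inhibition (beta) and slow adaptation (g, tau_a).
def f(x):
    return np.maximum(x, 0.0)              # threshold-linear gain

dt, T = 0.5, 20000.0                       # ms
tau, tau_a = 10.0, 1000.0                  # fast rate vs. slow adaptation
beta, g = 2.0, 2.0                         # inhibition and adaptation strength
I1 = I2 = 1.0                              # equal monocular drives
r1, r2, a1, a2 = 0.1, 0.0, 0.0, 0.0        # small asymmetry breaks the tie
dominant = []
for _ in range(int(T / dt)):
    dr1 = (-r1 + f(I1 - beta * r2 - g * a1)) / tau
    dr2 = (-r2 + f(I2 - beta * r1 - g * a2)) / tau
    da1 = (r1 - a1) / tau_a                # adaptation slowly tracks each rate
    da2 = (r2 - a2) / tau_a
    r1, r2 = r1 + dt * dr1, r2 + dt * dr2
    a1, a2 = a1 + dt * da1, a2 + dt * da2
    dominant.append(1 if r1 > r2 else 2)   # crude dominance metric

switches = sum(u != v for u, v in zip(dominant, dominant[1:]))
```

Weakening `beta` in this reduction slows or abolishes alternations, the same direction of effect the paper probes when relating inhibition strength to switching rate.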
4
Wang J, Xu J, Wu J, Xu Q. Geometric characterization of dynamical structure for neural firing activities induced by inhibitory pulse. Cogn Neurodyn 2022; 16:1505-1524. [PMID: 36408077] [PMCID: PMC9666638] [DOI: 10.1007/s11571-022-09799-x]
Abstract
Inhibitory stimuli are generally thought to suppress neuronal firing, but they can sometimes enhance it, as in the post-inhibitory rebound spike (PIR spike) and post-inhibitory facilitation (PIF) phenomena, which play an important role in human neuronal activity. We study responses to an inhibitory pulse in a classical neuron model (the quartic adaptive integrate-and-fire model) that is well known to reproduce a number of biologically realistic behaviors. The three phenomena that we study are PIR, in which a neuron fires after an inhibitory pulse; PIF, in which a subthreshold excitatory input can induce a spike if it is applied with proper timing after an inhibitory pulse; and periodic firing after an inhibitory pulse. When the system has two equilibria, a focus and a saddle, all three phenomena occur under the inhibitory pulse, whereas none of them is induced when the two equilibria are a node and a saddle. Using dynamical systems theory, we explain the threshold mechanism by which an inhibitory pulse enhances the neural firing response and analyze the origin of these phenomena in terms of several factors. We also give a geometric characterization of the dynamical structures underlying the three phenomena. This study therefore enriches the catalog of paradoxical phenomena induced by inhibitory input and advances our understanding of its role.
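Post-inhibitory rebound can be demonstrated in a close relative of the quartic model studied here: Izhikevich's adaptive quadratic integrate-and-fire neuron with its standard "rebound spike" parameter set, for which the resting state is a focus coexisting with a saddle, the regime the paper associates with PIR. The pulse timing and amplitude below are illustrative.

```python
# Adaptive quadratic integrate-and-fire (Izhikevich model), rebound-spike
# parameters: a purely inhibitory pulse is followed by a spike at its offset.
a, b, c, d = 0.03, 0.25, -60.0, 4.0        # standard rebound parameter set
dt, T = 0.05, 200.0                        # ms
V, u = -64.0, 0.25 * -64.0                 # start near the resting focus
spike_times = []
for i in range(int(T / dt)):
    t = i * dt
    I = -15.0 if 20.0 <= t < 25.0 else 0.0 # inhibitory pulse, no excitation
    V += dt * (0.04 * V**2 + 5.0 * V + 140.0 - u + I)
    u += dt * a * (b * V - u)              # slow recovery variable
    if V >= 30.0:                          # spike: reset V, bump adaptation
        V, u = c, u + d
        spike_times.append(t)

rebound = [t for t in spike_times if t >= 25.0]  # spikes after pulse offset
```

During the pulse the recovery variable `u` relaxes downward; on release the membrane recovers faster than `u`, carrying the trajectory past the saddle threshold and producing a spike with no excitatory input at all.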
Affiliation(s)
- Junjie Wang
- School of Mathematics and Information Science, Guangxi University, Nanning, 530004 China
- Jieqiong Xu
- School of Mathematics and Information Science, Guangxi University, Nanning, 530004 China
- Scientific Research Center of Engineering Mechanics, Guangxi University, Nanning, 530004 China
- Jianmei Wu
- School of Mathematics and Information Science, Guangxi University, Nanning, 530004 China
- Qixiang Xu
- School of Mathematics and Information Science, Guangxi University, Nanning, 530004 China
5
Bou A, Bisquert J. Impedance Spectroscopy Dynamics of Biological Neural Elements: From Memristors to Neurons and Synapses. J Phys Chem B 2021; 125:9934-9949. [PMID: 34436891] [DOI: 10.1021/acs.jpcb.1c03905]
Abstract
Understanding the operation of neurons and synapses is essential to reproducing biological computation. Building artificial neuromorphic networks opens the door to a new generation of faster, low-energy-consuming electronic circuits for computation. The main candidates for imitating natural biocomputation processes, such as the generation of action potentials and spiking, are memristors. Generally, the performance of material neuromorphic elements is studied by analyzing transient time-domain signals. Here, we present an analysis of neural systems in the frequency domain by small-amplitude ac impedance spectroscopy. We start from the constitutive equations for the conductance and memory effect, and we derive and classify the impedance spectroscopy spectra. We first provide a general analysis of a memristor and demonstrate that this element can be expressed as a combination of simple parts. In particular, we derive a basic equivalent circuit where the memory effect is represented by an RL branch. We show that this ac model is quite general and describes the inductive/negative-capacitance response in many systems such as halide perovskites and organic LEDs. Thereafter, we derive the impedance response of the adaptive exponential integrate-and-fire neuron model, which introduces a negative differential resistance and a richer set of spectra. On the basis of these insights, we provide an interpretation of the varied spectra that appear in the more general Hodgkin-Huxley neuron model. Our work provides important criteria for determining the properties that must be found in material realizations of neuronal elements. This approach has the great advantage that the analysis of highly complex phenomena can be based purely on the shape of experimental impedance spectra, avoiding the need for specific modeling of the rather involved material processes that produce the required response.
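The signature of a slow memory variable in the impedance spectrum can be sketched with a generic small-signal circuit in the spirit of the one derived here: a capacitor and resistor in parallel with an R-L branch standing in for the memory effect. The component values below are illustrative, not fitted to any device.

```python
import numpy as np

# Equivalent-circuit impedance: C and R1 in parallel with an (R2 + L) branch.
# The inductive branch models the slow (memory) variable's contribution.
C, R1, R2, L = 1e-6, 1e3, 1e2, 10.0        # F, ohm, ohm, H (illustrative)
w = np.logspace(-2, 6, 400)                # angular frequency, rad/s
Y = 1j * w * C + 1.0 / R1 + 1.0 / (R2 + 1j * w * L)
Z = 1.0 / Y                                # total small-signal impedance

inductive_at_low = Z[0].imag > 0           # Im(Z) > 0: low-frequency inductive loop
capacitive_at_high = Z[-1].imag < 0        # Im(Z) < 0: high-frequency capacitive arc
```

The crossover from a capacitive arc at high frequency to an inductive (negative-capacitance) loop at low frequency is exactly the qualitative fingerprint the paper uses to identify memristive and neuron-like elements from experimental spectra.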
Affiliation(s)
- Agustín Bou
- Institute of Advanced Materials (INAM), Universitat Jaume I, 12006 Castelló, Spain
- Juan Bisquert
- Institute of Advanced Materials (INAM), Universitat Jaume I, 12006 Castelló, Spain
6
Cheng H, Cai D, Zhou D. The extended Granger causality analysis for Hodgkin-Huxley neuronal models. Chaos 2020; 30:103102. [PMID: 33138445] [DOI: 10.1063/5.0006349]
Abstract
How to extract directions of information flow in dynamical systems from empirical data remains a key challenge. Granger causality (GC) analysis has been identified as a powerful method for achieving this. However, the framework of GC theory requires that the dynamics of the investigated system can be statistically linearized, i.e., that the dynamics can be effectively modeled by linear regressive processes. Under such conditions, the causal connectivity can be directly mapped to the structural connectivity that mediates physical interactions within the system. For nonlinear dynamical systems such as the Hodgkin-Huxley (HH) neuronal circuit, however, the validity of GC analysis has not yet been addressed; namely, whether the constructed causal connectivity is still identical to the synaptic connectivity between neurons remains unknown. In this work, we apply the nonlinear extension of GC analysis, i.e., the extended GC analysis, to the voltage time series obtained by evolving an HH neuronal network. In addition, we add a certain amount of measurement or observational noise to the time series to account for realistic experimental data acquisition. Our numerical results indicate that the causal connectivity obtained through the extended GC analysis is consistent with the underlying synaptic connectivity of the system. This consistency is also insensitive to the dynamical regime, e.g., chaotic or non-chaotic. Since the extended GC analysis can in principle be applied to any nonlinear dynamical system as long as its attractor is low dimensional, our results may potentially be extended to GC analysis in other settings.
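The linear GC baseline that the extended analysis generalizes can be sketched on synthetic data: in a bivariate autoregressive process where x drives y, the causal direction is read off by comparing prediction-error variances with and without the other series' past. The coefficients and lag-1 model below are illustrative, not the paper's setup.

```python
import numpy as np

# Synthetic VAR(1) with a one-way coupling x -> y, then pairwise linear GC.
rng = np.random.default_rng(1)
N = 20000
x = np.zeros(N)
y = np.zeros(N)
for t in range(1, N):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()  # x drives y

def gc(source, target):
    # F = log(restricted residual variance / full residual variance)
    tgt, s1, t1 = target[1:], source[:-1], target[:-1]
    b = np.linalg.lstsq(t1[:, None], tgt, rcond=None)[0]   # own past only
    var_r = np.var(tgt - t1 * b[0])
    Xf = np.column_stack([t1, s1])                         # plus source past
    bf = np.linalg.lstsq(Xf, tgt, rcond=None)[0]
    var_f = np.var(tgt - Xf @ bf)
    return np.log(var_r / var_f)

gc_xy, gc_yx = gc(x, y), gc(y, x)     # expect gc_xy >> gc_yx ~ 0
```

The paper's point is that for HH networks this linear recipe is not guaranteed to apply; the extended (nonlinear) GC replaces the global linear regressions with local ones on the low-dimensional attractor, while the residual-variance comparison above stays the same in spirit.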
Affiliation(s)
- Hong Cheng
- School of Statistics and Mathematics, Shanghai Lixin University of Accounting and Finance, Shanghai 201209, China
- David Cai
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
- Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
7
Marín M, Sáez-Lara MJ, Ros E, Garrido JA. Optimization of Efficient Neuron Models With Realistic Firing Dynamics. The Case of the Cerebellar Granule Cell. Front Cell Neurosci 2020; 14:161. [PMID: 32765220] [PMCID: PMC7381211] [DOI: 10.3389/fncel.2020.00161]
Abstract
Biologically relevant large-scale computational models currently represent one of the main methods in neuroscience for studying the information-processing primitives of brain areas. However, biologically realistic neuron models tend to be computationally heavy, which prevents them from being used in brain-area models comprising thousands or even millions of neurons. The cerebellar input layer represents a canonical example of such large-scale networks. In particular, the cerebellar granule cells, the most numerous cells in the whole mammalian brain, have been proposed to play a pivotal role in the creation of somatosensory information representations. Enhanced burst frequency (spiking resonance) in the granule cells has been proposed to facilitate input signal transmission in the theta-frequency band (4–12 Hz), but the functional role of this cell feature in the operation of the granular layer remains largely unclear. This study aims to develop a methodological pipeline for creating neuron models that maintain biological realism and computational efficiency whilst capturing essential aspects of single-neuron processing. We therefore selected a computationally light neuron model template (the adaptive exponential integrate-and-fire, AdEx, model), whose parameters were progressively refined by automatic parameter tuning with evolutionary algorithms (EAs). The resulting point-neuron models reproduce the main firing properties of a realistic granule cell from electrophysiological measurements, including spiking resonance in the theta-frequency band, repetitive firing according to a specified intensity-frequency (I-F) curve, and delayed firing under current-pulse stimulation. Interestingly, the proposed model also reproduced some other emergent properties (namely, silence at rest, a realistic rheobase, and negligible adaptation under depolarizing currents) even though these properties were not set as targets in the EA fitness function (FF), proving that these features are compatible even with computationally simple models. The proposed methodology represents a valuable tool for adjusting AdEx models according to a FF defined in the spiking regime and based on biological data. These models are appropriate for future research on the functional implication of bursting resonance in the theta band in large-scale granular-layer network models.
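The AdEx template being tuned can be sketched directly. This uses the standard Brette-Gerstner parameter set rather than the paper's fitted granule-cell values, with illustrative step currents: a suprathreshold step elicits adapting repetitive firing, a subthreshold one none, which are exactly the kinds of features an EA fitness function scores.

```python
import numpy as np

# Adaptive exponential integrate-and-fire (AdEx) neuron, SI units.
C, gL, EL = 281e-12, 30e-9, -70.6e-3        # F, S, V
VT, DT, Vr = -50.4e-3, 2e-3, -70.6e-3       # threshold, slope factor, reset
tau_w, a, b = 144e-3, 4e-9, 80.5e-12        # adaptation time const, coupling, jump

def simulate(I, T=0.5, dt=1e-5):
    V, w, spikes = EL, 0.0, []
    for i in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V > 0.0:                          # spike: reset V, increment w
            V = Vr
            w += b
            spikes.append(i * dt)
    return spikes

sub, supra = simulate(0.2e-9), simulate(1.0e-9)   # illustrative step currents
isis = np.diff(supra)
adapting = len(isis) > 2 and isis[-1] > isis[0]   # ISIs lengthen over the train
```

An EA in the paper's pipeline would mutate (C, gL, VT, DT, tau_w, a, b) and score each candidate against measured I-F curves and resonance, rather than using these textbook values.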
Affiliation(s)
- Milagros Marín
- Department of Computer Architecture and Technology-CITIC, University of Granada, Granada, Spain
- Department of Biochemistry and Molecular Biology I, University of Granada, Granada, Spain
- María José Sáez-Lara
- Department of Biochemistry and Molecular Biology I, University of Granada, Granada, Spain
- Eduardo Ros
- Department of Computer Architecture and Technology-CITIC, University of Granada, Granada, Spain
- Jesús A Garrido
- Department of Computer Architecture and Technology-CITIC, University of Granada, Granada, Spain
8
Barranca VJ, Zhou D. Compressive Sensing Inference of Neuronal Network Connectivity in Balanced Neuronal Dynamics. Front Neurosci 2019; 13:1101. [PMID: 31680835] [PMCID: PMC6811502] [DOI: 10.3389/fnins.2019.01101]
Abstract
Determining the structure of a network is of central importance to understanding its function in both neuroscience and applied mathematics. However, recovering the structural connectivity of neuronal networks remains a fundamental challenge both theoretically and experimentally. While neuronal networks operate in certain dynamical regimes, which may influence their connectivity reconstruction, there is widespread experimental evidence of a balanced neuronal operating state in which strong excitatory and inhibitory inputs are dynamically adjusted such that neuronal voltages primarily remain near resting potential. Utilizing the dynamics of model neurons in such a balanced regime in conjunction with the ubiquitous sparse connectivity structure of neuronal networks, we develop a compressive sensing theoretical framework for efficiently reconstructing network connections by measuring individual neuronal activity in response to a relatively small ensemble of random stimuli injected over a short time scale. By tuning the network dynamical regime, we determine that the highest fidelity reconstructions are achievable in the balanced state. We hypothesize the balanced dynamics observed in vivo may therefore be a result of evolutionary selection for optimal information encoding and expect the methodology developed to be generalizable for alternative model networks as well as experimental paradigms.
Affiliation(s)
- Victor J Barranca
- Department of Mathematics and Statistics, Swarthmore College, Swarthmore, PA, United States
- Douglas Zhou
- School of Mathematical Sciences, Shanghai Jiao Tong University, Shanghai, China
- Ministry of Education Key Laboratory of Scientific and Engineering Computing, Shanghai Jiao Tong University, Shanghai, China
- Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
9
Crodelle J, Zhou D, Kovačič G, Cai D. A Role for Electrotonic Coupling Between Cortical Pyramidal Cells. Front Comput Neurosci 2019; 13:33. [PMID: 31191280] [PMCID: PMC6546902] [DOI: 10.3389/fncom.2019.00033]
Abstract
Many brain regions communicate information through synchronized network activity. Electrical coupling among the dendrites of interneurons in the cortex has been implicated in forming and sustaining such activity in the cortex. Evidence for the existence of electrical coupling among cortical pyramidal cells, however, has been largely absent. A recent experimental study measured properties of electrical connections between pyramidal cells in the cortex deemed “electrotonic couplings.” These junctions were seen to occur pair-wise, sparsely, and often coexist with electrically-coupled interneurons. Here, we construct a network model to investigate possible roles for these rare, electrotonically-coupled pyramidal-cell pairs. Through simulations, we show that electrical coupling among pyramidal-cell pairs significantly enhances coincidence-detection capabilities and increases network spike-timing precision. Further, a network containing multiple pairs exhibits large variability in its firing pattern, possessing a rich coding structure.
Affiliation(s)
- Jennifer Crodelle
- Courant Institute of Mathematical Sciences, New York University, New York, NY, United States
- Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- Gregor Kovačič
- Department of Mathematical Sciences, Rensselaer Polytechnic Institute, Troy, NY, United States
- David Cai
- Courant Institute of Mathematical Sciences, New York University, New York, NY, United States
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
10
Barranca VJ, Zhu XG. A computational study of the role of spatial receptive field structure in processing natural and non-natural scenes. J Theor Biol 2018; 454:268-277. [PMID: 29908188] [DOI: 10.1016/j.jtbi.2018.06.011]
Abstract
The center-surround receptive field structure, ubiquitous in the visual system, is hypothesized to be evolutionarily advantageous in image processing tasks. We address the potential functional benefits and shortcomings of spatial localization and center-surround antagonism in the context of an integrate-and-fire neuronal network model with image-based forcing. Utilizing the sparsity of natural scenes, we derive a compressive-sensing framework for input image reconstruction utilizing evoked neuronal firing rates. We investigate how the accuracy of input encoding depends on the receptive field architecture, and demonstrate that spatial localization in visual stimulus sampling facilitates marked improvements in natural scene processing beyond uniformly-random excitatory connectivity. However, for specific classes of images, we show that spatial localization inherent in physiological receptive fields combined with information loss through nonlinear neuronal network dynamics may underlie common optical illusions, giving a novel explanation for their manifestation. In the context of signal processing, we expect this work may suggest new sampling protocols useful for extending conventional compressive sensing theory.
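The center-surround architecture at the heart of this study can be sketched as a difference-of-Gaussians (DoG) mask; a narrow excitatory center minus a broader inhibitory surround responds to contrast but not to uniform luminance. The kernel widths, synthetic image, and probe locations below are illustrative, not the paper's network parameters.

```python
import numpy as np

# Difference-of-Gaussians receptive field applied to a synthetic luminance edge.
def gaussian_kernel(size, sigma):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return g / g.sum()                       # normalize to unit mass

size = 21
dog = gaussian_kernel(size, 1.5) - gaussian_kernel(size, 4.0)  # center - surround

def response(img, r, c):
    # Receptive-field response at pixel (r, c): dot product with the DoG mask.
    h = size // 2
    patch = img[r - h:r + h + 1, c - h:c + h + 1]
    return float(np.sum(patch * dog))

img = np.zeros((64, 64))
img[:, 32:] = 1.0                            # vertical luminance edge

flat = abs(response(img, 32, 15))            # deep inside the uniform region
edge = abs(response(img, 32, 31))            # right at the edge
```

Because both Gaussians are normalized, the mask sums to zero and uniform regions produce no response; only spatial contrast does, which is the sampling property whose benefits and illusion-producing side effects the paper analyzes.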
Affiliation(s)
- Xiuqi George Zhu
- Swarthmore College, 500 College Avenue, Swarthmore, PA 19081, USA
11
Spike-Conducting Integrate-and-Fire Model. eNeuro 2018; 5:eN-TNC-0112-18. [PMID: 30225348] [PMCID: PMC6140110] [DOI: 10.1523/eneuro.0112-18.2018]
Abstract
Modeling is a useful tool for investigating various biophysical characteristics of neurons. Recent simulation studies of propagating action potentials (spike conduction) along axons include the investigation of neuronal activity evoked by electrical stimulation from implantable prosthetic devices. In contrast to point-neuron simulations, where a large variety of models are readily available, Hodgkin–Huxley-type conductance-based models have been almost the only option for simulating axonal spike conduction, as simpler models cannot faithfully replicate the waveforms of propagating spikes. Since the amount of available physiological data, especially in humans, is usually limited, calibration and justification of the large number of parameters of a complex model is generally difficult. In addition, not all simulation studies of axons require detailed descriptions of nonlinear ionic dynamics. In this study, we construct a simple model of spike generation and conduction based on the exponential integrate-and-fire model, which can simulate the rapid growth of the membrane potential at spike initiation. In terms of the number of parameters and equations, this model is much more compact than conventional models, but can still reliably simulate spike conduction along myelinated and unmyelinated axons that are stimulated intracellularly or extracellularly. Our simulations of auditory nerve fibers with this new model suggest that, because of differences in intrinsic membrane properties, axonal spike conduction is faster in high-frequency nerve fibers than in low-frequency fibers. The simple model developed in this study can serve as a computationally efficient alternative to more complex models in future studies, including simulations of neuroprosthetic devices.
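A toy version of the conduction idea, not the paper's calibrated model: a chain of exponential integrate-and-fire compartments coupled by an axial conductance, with each spike briefly held at its peak so it can charge the next compartment. All parameter values, the sealed-end boundary, and the clamp-based spike shape are illustrative assumptions.

```python
import numpy as np

# Chain of EIF compartments with nearest-neighbor axial coupling.
N = 30                                     # compartments along the axon
C, gL, EL = 1.0, 0.1, -65.0
VT, DT = -50.0, 2.0                        # spike-initiation threshold, slope
V_peak, V_reset = 0.0, -65.0
g_ax = 0.3                                 # axial coupling conductance
width = 2.0                                # ms each spike is clamped at its peak

dt, T = 0.01, 150.0
V = np.full(N, EL)
hold = np.zeros(N)                         # ms left at the clamped peak
first_spike = np.full(N, np.nan)
for i in range(int(T / dt)):
    t = i * dt
    I = np.zeros(N)
    if t < 5.0:
        I[0] = 10.0                        # brief injection at one end only
    V_left = np.concatenate(([V[0]], V[:-1]))   # sealed ends via mirroring
    V_right = np.concatenate((V[1:], [V[-1]]))
    axial = g_ax * (V_left - 2.0 * V + V_right)
    expo = np.minimum((V - VT) / DT, 20.0)      # cap exponent for stability
    V = V + dt * (-gL * (V - EL) + gL * DT * np.exp(expo) + axial + I) / C
    spiking = hold > 0.0
    V[spiking] = V_peak                    # hold ongoing spikes at the peak
    hold[spiking] -= dt
    V[spiking & (hold <= 0.0)] = V_reset   # then reset
    crossed = (~spiking) & (V >= V_peak)
    hold[crossed] = width
    V[crossed] = V_peak
    first_spike[crossed & np.isnan(first_spike)] = t

conducted = not np.isnan(first_spike[-1])
delay = first_spike[-1] - first_spike[0]   # conduction takes finite time
```

Stimulating only the first compartment produces a wave of first-spike times that increases along the chain, a crude stand-in for the propagating spike waveform the paper reproduces with its calibrated model.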
12
The impact of spike-frequency adaptation on balanced network dynamics. Cogn Neurodyn 2018; 13:105-120. [PMID: 30728874] [DOI: 10.1007/s11571-018-9504-2]
Abstract
A dynamic balance between strong excitatory and inhibitory neuronal inputs is hypothesized to play a pivotal role in information processing in the brain. While there is evidence of the existence of a balanced operating regime in several cortical areas and idealized neuronal network models, it is important for the theory of balanced networks to be reconciled with more physiological neuronal modeling assumptions. In this work, we examine the impact of spike-frequency adaptation, observed widely across neurons in the brain, on balanced dynamics. We incorporate adaptation into binary and integrate-and-fire neuronal network models, analyzing the theoretical effect of adaptation in the large network limit and performing an extensive numerical investigation of the model adaptation parameter space. Our analysis demonstrates that balance is well preserved for moderate adaptation strength even if the entire network exhibits adaptation. In the common physiological case in which only excitatory neurons undergo adaptation, we show that the balanced operating regime in fact widens relative to the non-adaptive case. We hypothesize that spike-frequency adaptation may have been selected through evolution to robustly facilitate balanced dynamics across diverse cognitive operating states.
13
Barranca VJ, Zhou D, Cai D. Compressive sensing reconstruction of feed-forward connectivity in pulse-coupled nonlinear networks. Phys Rev E 2016; 93:060201. [PMID: 27415190] [DOI: 10.1103/physreve.93.060201]
Abstract
Utilizing the sparsity ubiquitous in real-world network connectivity, we develop a theoretical framework for efficiently reconstructing sparse feed-forward connections in a pulse-coupled nonlinear network through its output activities. Using only a small ensemble of random inputs, we solve this inverse problem through the compressive sensing theory based on a hidden linear structure intrinsic to the nonlinear network dynamics. The accuracy of the reconstruction is further verified by the fact that complex inputs can be well recovered using the reconstructed connectivity. We expect this Rapid Communication provides a new perspective for understanding the structure-function relationship as well as compressive sensing principle in nonlinear network dynamics.
Affiliation(s)
- Victor J Barranca
- Department of Mathematics and Statistics, Swarthmore College, Swarthmore, Pennsylvania 19081, USA
- Douglas Zhou
- Department of Mathematics, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
- David Cai
- Department of Mathematics, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai 200240, China
- Courant Institute of Mathematical Sciences and Center for Neural Science, New York University, New York, New York 10012, USA
- NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
14
Barranca VJ, Kovačič G, Zhou D, Cai D. Efficient image processing via compressive sensing of integrate-and-fire neuronal network dynamics. Neurocomputing 2016. [DOI: 10.1016/j.neucom.2015.07.067]
15
Steimer A, Schindler K. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States. PLoS One 2015. [PMID: 26203657] [PMCID: PMC4512685] [DOI: 10.1371/journal.pone.0132906]
Abstract
Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains, on a detailed mathematical level, the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and its input current. Approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is raised toward threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimates of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be used by any algorithmic procedure based on random sampling, such as Markov chain Monte Carlo or message-passing methods.
Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation.
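The sampling setup can be sketched with a noise-driven EIF neuron whose ISIs serve as samples: raising the voltage baseline toward threshold, as in an UP state, yields far more samples per unit time. The parameters, noise level, and baseline currents below are illustrative, not the paper's fitted values.

```python
import numpy as np

# Noise-driven EIF neuron; each interspike interval is one "sample".
def eif_isis(mu, sigma=2.0, T=5000.0, dt=0.05, seed=0):
    rng = np.random.default_rng(seed)
    gL, C, EL = 0.1, 1.0, -65.0
    VT, DT, Vth, Vr = -55.0, 2.0, 0.0, -65.0
    V, last, isis = EL, None, []
    for i in range(int(T / dt)):
        drift = (-gL * (V - EL) + gL * DT * np.exp(min((V - VT) / DT, 20.0)) + mu) / C
        V += dt * drift + sigma * np.sqrt(dt) * rng.normal()
        if V >= Vth:                       # spike: record ISI and reset
            t = i * dt
            if last is not None:
                isis.append(t - last)
            last = t
            V = Vr
    return isis

isis_down = eif_isis(mu=0.3)    # baseline well below threshold (DOWN-like)
isis_up = eif_isis(mu=0.9)      # baseline near threshold (UP-state-like)
```

The higher baseline both speeds up sampling and, per the paper's analysis, tightens the approximate link between the ISI distribution and the input current, which is what makes the UP state computationally favorable.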
Affiliation(s)
- Andreas Steimer
- Department of Neurology, Inselspital\Bern University Hospital\University Bern, Bern, Switzerland
- Kaspar Schindler
- Department of Neurology, Inselspital\Bern University Hospital\University Bern, Bern, Switzerland
16
Barranca VJ, Kovačič G, Zhou D, Cai D. Network dynamics for optimal compressive-sensing input-signal recovery. Phys Rev E 2014; 90:042908. [PMID: 25375568] [DOI: 10.1103/physreve.90.042908]
Abstract
By using compressive sensing (CS) theory, a broad class of static signals can be reconstructed through a sequence of very few measurements in the framework of a linear system. For networks with nonlinear and time-evolving dynamics, is it similarly possible to recover an unknown input signal from only a small number of network output measurements? We address this question for pulse-coupled networks and investigate the network dynamics necessary for successful input signal recovery. Determining the specific network characteristics that correspond to a minimal input reconstruction error, we are able to achieve high-quality signal reconstructions with few measurements of network output. Using various measures to characterize dynamical properties of network output, we determine that networks with highly variable and aperiodic output can successfully encode network input information with high fidelity and achieve the most accurate CS input reconstructions. For time-varying inputs, we also find that high-quality reconstructions are achievable by measuring network output over a relatively short time window. Even when network inputs change with time, the same optimal choice of network characteristics and corresponding dynamics apply as in the case of static inputs.
Affiliation(s)
- Victor J Barranca
- Courant Institute of Mathematical Sciences & Center for Neural Science, New York University, New York, New York 10012, USA
- NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Gregor Kovačič
- Mathematical Sciences Department, Rensselaer Polytechnic Institute, Troy, New York 12180, USA
- Douglas Zhou
- Department of Mathematics, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- David Cai
- Courant Institute of Mathematical Sciences & Center for Neural Science, New York University, New York, New York 10012, USA
- NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Department of Mathematics, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China