1
Kraikivski P. A Mechanistic Model of Perceptual Binding Predicts That Binding Mechanism Is Robust against Noise. Entropy (Basel) 2024; 26:133. [PMID: 38392388] [PMCID: PMC10888151] [DOI: 10.3390/e26020133]
Abstract
The concept of the brain's own time and space is central to many models and theories that aim to explain how the brain generates consciousness. For example, the temporo-spatial theory of consciousness postulates that the brain implements its own inner time and space for conscious processing of the outside world. Furthermore, our perception and cognition of time and space can be different from actual time and space. This study presents a mechanistic model of mutually connected processes that encode phenomenal representations of space and time. The model is used to elaborate the binding mechanism between two sets of processes representing internal space and time, respectively. Further, a stochastic version of the model is developed to investigate the interplay between binding strength and noise. Spectral entropy is used to characterize noise effects on the systems of interacting processes when the binding strength between them is varied. The stochastic modeling results reveal that the spectral entropy values for strongly bound systems are similar to those for weakly bound or even decoupled systems. Thus, the analysis performed in this study allows us to conclude that the binding mechanism is noise-resilient.
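Spectral entropy, the noise measure used here, can be reproduced in a few lines: it is the Shannon entropy of the normalized power spectrum, low for narrowband dynamics and near one for broadband noise. A minimal sketch (our own illustration, not the paper's code; signal lengths and frequencies are arbitrary choices):

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum, scaled to [0, 1]."""
    psd = np.abs(np.fft.rfft(x - np.mean(x)))**2
    p = psd / psd.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return float(-np.sum(p * np.log2(p)) / np.log2(len(psd)))

rng = np.random.default_rng(0)
t = np.arange(2048) / 2048.0
se_sine = spectral_entropy(np.sin(2*np.pi*50*t))        # narrowband: low entropy
se_noise = spectral_entropy(rng.standard_normal(2048))  # broadband: high entropy
```

A strongly bound (ordered) system concentrates spectral power and lowers this value; noise-dominated dynamics push it toward one.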
Affiliation(s)
- Pavel Kraikivski
- Division of Systems Biology, Academy of Integrated Science, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061, USA
2
Noda T, Takahashi H. Stochastic resonance in sparse neuronal network: functional role of ongoing activity to detect weak sensory input in awake auditory cortex of rat. Cereb Cortex 2024; 34:bhad428. [PMID: 37955660] [PMCID: PMC10793590] [DOI: 10.1093/cercor/bhad428]
Abstract
The awake cortex is characterized by a higher level of ongoing spontaneous activity than the anesthetized cortex, yet it detects weak sensory inputs better. The computational mechanism underlying this paradoxical nature of awake neuronal activity remains to be elucidated. Here, we propose a hypothetical stochastic resonance that improves the signal-to-noise ratio (SNR) of weak sensory inputs through nonlinear relations between ongoing spontaneous activity and sensory-evoked activity. Prestimulus and tone-evoked activities were investigated via in vivo extracellular recording with a dense microelectrode array covering the entire auditory cortex in rats in both awake and anesthetized states. We found that tone-evoked activity increased supralinearly with the prestimulus activity level in the awake state and that the SNR of weak stimulus representation was optimized at an intermediate level of prestimulus ongoing activity. Furthermore, the temporally intermittent firing pattern, but not the trial-by-trial reliability or the fluctuation of the local field potential, was identified as a relevant factor for the SNR improvement. Since ongoing activity differs among neurons, this hypothetical stochastic resonance, or "sparse network stochastic resonance," might offer SNR improvement at the single-neuron level, which is compatible with sparse representation in the sensory cortex.
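The classic single-unit picture behind this "sparse network stochastic resonance" can be sketched with a threshold detector driven by a subthreshold sine plus Gaussian noise: the output tracks the hidden signal best at an intermediate noise level. All parameters below are our illustrative assumptions, not the paper's:

```python
import numpy as np

def detection_quality(sigma, seed=0):
    """Correlation between a threshold unit's output and a hidden subthreshold signal."""
    rng = np.random.default_rng(seed)
    t = np.arange(20000)
    signal = 0.5*np.sin(2*np.pi*t/200.0)        # peak 0.5, below threshold 1.0
    x = signal + sigma*rng.standard_normal(t.size)
    spikes = (x > 1.0).astype(float)
    if not spikes.any():                        # a silent detector carries no information
        return 0.0
    return float(np.corrcoef(spikes, signal)[0, 1])

# none / moderate / excessive noise
curve = {s: detection_quality(s) for s in (0.0, 0.5, 5.0)}
```

With no noise the subthreshold signal is invisible; with excessive noise the output decorrelates again — the hallmark non-monotonic stochastic resonance curve.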
Affiliation(s)
- Takahiro Noda
- Department of Mechano-informatics, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
- Hirokazu Takahashi
- Department of Mechano-informatics, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
3
Haufler D, Ito S, Koch C, Arkhipov A. Simulations of cortical networks using spatially extended conductance-based neuronal models. J Physiol 2023; 601:3123-3139. [PMID: 36567262] [PMCID: PMC10290729] [DOI: 10.1113/jp284030]
Abstract
The Hodgkin-Huxley model of action potential generation and propagation, published in the Journal of Physiology in 1952, initiated the field of biophysically detailed computational modelling in neuroscience, which has expanded to encompass a variety of species and components of the nervous system. Here we review the developments in this area with a focus on efforts in the community towards modelling the mammalian neocortex using spatially extended conductance-based neuronal models. The Hodgkin-Huxley formalism and related foundational contributions, such as Rall's cable theory, remain widely used in these efforts to the current day. We argue that at present the field is undergoing a qualitative change due to new very rich datasets describing the composition, connectivity and functional activity of cortical circuits, which are being integrated systematically into large-scale network models. This trend, combined with the accelerating development of convenient software tools supporting such complex modelling projects, is giving rise to highly detailed models of the cortex that are extensively constrained by the data, enabling computational investigation of a multitude of questions about cortical structure and function.
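As a concrete entry point to the formalism this review surveys, the original point-neuron Hodgkin-Huxley equations can be integrated in a few lines. A sketch with the standard squid-axon parameters and simple forward-Euler stepping (adequate for illustration; the production simulators discussed in the review use spatial compartments and better integrators):

```python
import numpy as np

def simulate_hh(i_ext, t_max=50.0, dt=0.01):
    """Forward-Euler Hodgkin-Huxley point neuron (units: mV, ms, uA/cm^2)."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3          # maximal conductances (mS/cm^2)
    e_na, e_k, e_l = 50.0, -77.0, -54.4        # reversal potentials (mV)
    v, m, h, n = -65.0, 0.053, 0.596, 0.317    # approximate resting steady state
    trace = np.empty(int(t_max/dt))
    for i in range(trace.size):
        a_m = 0.1*(v + 40.0)/(1.0 - np.exp(-(v + 40.0)/10.0))
        b_m = 4.0*np.exp(-(v + 65.0)/18.0)
        a_h = 0.07*np.exp(-(v + 65.0)/20.0)
        b_h = 1.0/(1.0 + np.exp(-(v + 35.0)/10.0))
        a_n = 0.01*(v + 55.0)/(1.0 - np.exp(-(v + 55.0)/10.0))
        b_n = 0.125*np.exp(-(v + 65.0)/80.0)
        i_ion = g_na*m**3*h*(v - e_na) + g_k*n**4*(v - e_k) + g_l*(v - e_l)
        v += dt*(i_ext - i_ion)                # membrane capacitance 1 uF/cm^2
        m += dt*(a_m*(1.0 - m) - b_m*m)
        h += dt*(a_h*(1.0 - h) - b_h*h)
        n += dt*(a_n*(1.0 - n) - b_n*n)
        trace[i] = v
    return trace

v_driven = simulate_hh(10.0)   # suprathreshold current: the neuron spikes
v_rest = simulate_hh(0.0)      # no input: membrane stays near rest
```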
Affiliation(s)
- Shinya Ito
- Mindscope Program, Allen Institute, Seattle, WA 98109, USA
4
Padamsey Z, Katsanevaki D, Dupuy N, Rochefort NL. Neocortex saves energy by reducing coding precision during food scarcity. Neuron 2022; 110:280-296.e10. [PMID: 34741806] [PMCID: PMC8788933] [DOI: 10.1016/j.neuron.2021.10.024]
Abstract
Information processing is energetically expensive. In the mammalian brain, it is unclear how information coding and energy use are regulated during food scarcity. Using whole-cell recordings and two-photon imaging in layer 2/3 mouse visual cortex, we found that food restriction reduced AMPA receptor conductance, reducing synaptic ATP use by 29%. Neuronal excitability was nonetheless preserved by a compensatory increase in input resistance and a depolarized resting potential. Consequently, neurons spiked at similar rates as controls but spent less ATP on underlying excitatory currents. This energy-saving strategy had a cost because it amplified the variability of visually-evoked subthreshold responses, leading to a 32% broadening of orientation tuning and impaired fine visual discrimination. This reduction in coding precision was associated with reduced levels of the fat mass-regulated hormone leptin and was restored by exogenous leptin supplementation. Our findings reveal that metabolic state dynamically regulates the energy spent on coding precision in neocortex.
Affiliation(s)
- Zahid Padamsey
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh EH8 9XD, UK
- Danai Katsanevaki
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh EH8 9XD, UK
- Nathalie Dupuy
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh EH8 9XD, UK
- Nathalie L Rochefort
- Centre for Discovery Brain Sciences, School of Biomedical Sciences, University of Edinburgh, Edinburgh EH8 9XD, UK; Simons Initiative for the Developing Brain, University of Edinburgh, Edinburgh EH8 9XD, UK
5
Knoll G, Lindner B. Recurrence-mediated suprathreshold stochastic resonance. J Comput Neurosci 2021; 49:407-418. [PMID: 34003421] [PMCID: PMC8556192] [DOI: 10.1007/s10827-021-00788-3]
Abstract
It has previously been shown that the encoding of time-dependent signals by feedforward networks (FFNs) of processing units exhibits suprathreshold stochastic resonance (SSR): signal transmission is optimal at a finite level of independent stochasticity in the single units. In this study, a recurrent spiking network is simulated to demonstrate that SSR can also be caused by network noise in place of intrinsic noise. The level of autonomously generated fluctuations in the network can be controlled by the strength of the synapses, and hence the coding fraction (our measure of information transmission) exhibits a maximum as a function of the synaptic coupling strength. The presence of a coding peak at an optimal coupling strength is robust over a wide range of individual, network, and signal parameters, although the optimal strength and peak magnitude depend on the parameter being varied. We also perform control experiments with an FFN, illustrating that the optimized coding fraction is due to the change in noise level and not to other effects entailed by changing the coupling strength. These results also indicate that non-white (temporally correlated) network noise generally provides an extra boost to encoding performance compared with an FFN driven by intrinsic white-noise fluctuations.
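The feedforward reference case (Stocks-style suprathreshold stochastic resonance) is easy to reproduce: a population of identical threshold units, each adding independent noise to a shared Gaussian signal, transmits best at a nonzero noise level. Our toy sketch, with a correlation-based stand-in for the coding fraction (the paper's recurrent network generates this noise internally instead):

```python
import numpy as np

def coding_quality(noise_sigma, n_units=10, n_samples=20000, seed=4):
    """Correlation between a shared signal and the population-averaged threshold output."""
    rng = np.random.default_rng(seed)
    signal = rng.standard_normal(n_samples)
    noise = noise_sigma*rng.standard_normal((n_units, n_samples))
    pop_out = (signal[None, :] + noise > 0.0).mean(axis=0)   # fraction of units firing
    return float(np.corrcoef(pop_out, signal)[0, 1])

c_silent = coding_quality(0.0)     # all units redundant: a 1-bit quantizer
c_optimal = coding_quality(1.0)    # independent noise de-synchronizes the thresholds
c_excess = coding_quality(20.0)    # too much noise swamps the signal
```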
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany; Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
6
Implications of Noise on Neural Correlates of Consciousness: A Computational Analysis of Stochastic Systems of Mutually Connected Processes. Entropy (Basel) 2021; 23:e23050583. [PMID: 34066824] [PMCID: PMC8151615] [DOI: 10.3390/e23050583]
Abstract
Random fluctuations in neuronal processes may contribute to variability in perception and increase the information capacity of neuronal networks. Various sources of random processes have been characterized in the nervous system on different levels. However, in the context of neural correlates of consciousness, the robustness of mechanisms of conscious perception against inherent noise in neural dynamical systems is poorly understood. In this paper, a stochastic model is developed to study the implications of noise on dynamical systems that mimic neural correlates of consciousness. We computed power spectral densities and spectral entropy values for dynamical systems that contain a number of mutually connected processes. Interestingly, we found that spectral entropy decreases linearly as the number of processes within the system doubles. Further, power spectral density frequencies shift to higher values as system size increases, revealing an increasing impact of negative feedback loops and regulations on the dynamics of larger systems. Overall, our stochastic modeling and analysis results reveal that large dynamical systems of mutually connected and negatively regulated processes are more robust against inherent noise than small systems.
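The building block of such systems — a noisy process held in check by negative regulation — can be sketched with Euler-Maruyama integration of a Langevin equation, dx = -kx dt + σ dW, whose stationary variance σ²/2k shrinks as the negative feedback k grows. Parameters below are illustrative assumptions, not the paper's:

```python
import numpy as np

def simulate_process(feedback, sigma=0.5, dt=0.01, n_steps=20000, seed=1):
    """Euler-Maruyama integration of dx = -feedback*x dt + sigma dW."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_steps)
    for i in range(1, n_steps):
        drift = -feedback*x[i-1]*dt
        x[i] = x[i-1] + drift + sigma*np.sqrt(dt)*rng.standard_normal()
    return x

weak = simulate_process(0.5)    # weak negative regulation: larger fluctuations
strong = simulate_process(2.0)  # strong negative regulation: damped fluctuations
```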
7
Lu P, Veletić M, Bergsland J, Balasingham I. Theoretical Aspects of Resting-State Cardiomyocyte Communication for Multi-Nodal Nano-Actuator Pacemakers. Sensors (Basel) 2020; 20:E2792. [PMID: 32422981] [PMCID: PMC7285237] [DOI: 10.3390/s20102792]
Abstract
The heart consists of billions of cardiac muscle cells (cardiomyocytes) that work in a coordinated fashion to supply oxygen and nutrients to the body. Interconnected specialized cardiomyocytes form signaling channels through which electrical signals propagate throughout the heart, controlling the beat-to-beat function of the other cardiac cells. In this paper, we study to what extent ordinary cardiomyocytes can be used as communication channels between components of a recently proposed multi-nodal leadless pacemaker, transmitting data encoded in subthreshold membrane potentials. We analyze signal propagation in the cardiac infrastructure, accounting for noise in the communication channel, by performing numerical simulations based on the Luo-Rudy computational model. The Luo-Rudy model is an action potential model, but it describes how the membrane potential changes with time through both its subthreshold and action potential stages, separated by a thresholding mechanism. Demonstrating system performance, we show that cardiomyocytes can be used to establish an artificial communication system in which data are reliably transmitted across tens of cells. The proposed subthreshold cardiac communication lays the foundation for a new intra-cardiac communication technique.
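The Luo-Rudy model itself is too large to reproduce here, but the subthreshold relay idea can be caricatured with a chain of passive, leaky compartments coupled by gap-junction-like conductances: a pulse injected at one end arrives downstream attenuated but nonzero. All conductances below are arbitrary illustrative values, not the paper's:

```python
import numpy as np

def relay_chain(n_cells=10, g_gap=0.5, g_leak=0.1, dt=0.01, steps=4000):
    """Peak subthreshold deflection in each cell of a passively coupled chain."""
    v = np.zeros(n_cells)
    peak = np.zeros(n_cells)
    for s in range(steps):
        coupling = np.zeros(n_cells)
        coupling[1:] += g_gap*(v[:-1] - v[1:])     # current from the left neighbour
        coupling[:-1] += g_gap*(v[1:] - v[:-1])    # current from the right neighbour
        dv = -g_leak*v + coupling
        if s*dt < 5.0:
            dv[0] += 1.0                           # brief stimulus at the first cell
        v = v + dt*dv
        peak = np.maximum(peak, v)
    return peak

peak = relay_chain()
```

The amplitude decays roughly exponentially with distance (length constant about sqrt(g_gap/g_leak) cells), which is why reliable transmission is quantified only across tens of cells.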
Affiliation(s)
- Pengfei Lu
- The Intervention Centre, Oslo University Hospital, 0372 Oslo, Norway
- Computer College, Weinan Normal University, Weinan 714099, China
- Faculty of Medicine, University of Oslo, 0315 Oslo, Norway
- Mladen Veletić
- The Intervention Centre, Oslo University Hospital, 0372 Oslo, Norway
- Faculty of Electrical Engineering, University of Banja Luka, 78000 Banja Luka, Bosnia and Herzegovina
- Jacob Bergsland
- The Intervention Centre, Oslo University Hospital, 0372 Oslo, Norway
- Ilangko Balasingham
- The Intervention Centre, Oslo University Hospital, 0372 Oslo, Norway
- Department of Electronic Systems, Norwegian University of Science and Technology, 7491 Trondheim, Norway
8
Galán-Prado F, Morán A, Font J, Roca M, Rosselló JL. Compact Hardware Synthesis of Stochastic Spiking Neural Networks. Int J Neural Syst 2019; 29:1950004. [DOI: 10.1142/s0129065719500047]
Abstract
Spiking neural networks (SNNs) are able to emulate real neural behavior with high fidelity due to their bio-inspired nature. Many designs have been proposed for the implementation of SNNs in hardware, although the realization of high-density and biologically inspired SNNs remains a complex challenge of high scientific and technical interest. In this work, we propose a compact digital design for the implementation of high-volume SNNs that incorporates the intrinsic stochastic processes present in biological neurons and enables high-density hardware implementation. The proposed stochastic SNN model (SSNN) is compared with previous SSNN models, achieving a higher processing speed. We also show how the proposed model can be scaled to high-volume neural networks trained by backpropagation and applied to a pattern-classification task. The proposed model achieves better results than other recently published SNN models configured with unsupervised STDP learning.
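The arithmetic trick underlying most stochastic spiking hardware of this kind is that multiplication reduces to a bitwise AND of Bernoulli bitstreams, which is why the designs are so compact. A software sketch of the principle (not the authors' circuit):

```python
import numpy as np

def stochastic_multiply(a, b, n_bits=100_000, seed=0):
    """Stochastic computing: encode a and b as Bernoulli bitstreams; AND multiplies them."""
    rng = np.random.default_rng(seed)
    stream_a = rng.random(n_bits) < a
    stream_b = rng.random(n_bits) < b
    return float(np.mean(stream_a & stream_b))   # decoded product estimate

p_est = stochastic_multiply(0.6, 0.5)            # close to 0.6 * 0.5 = 0.3
```

Accuracy improves with stream length at the cost of latency, the central trade-off in stochastic hardware design.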
Affiliation(s)
- Fabio Galán-Prado
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Ctra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
- Alejandro Morán
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Ctra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
- Joan Font
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Ctra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
- Miquel Roca
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Ctra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
- Josep L. Rosselló
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Ctra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
9
Dold D, Bytschok I, Kungl AF, Baumbach A, Breitwieser O, Senn W, Schemmel J, Meier K, Petrovici MA. Stochasticity from function - Why the Bayesian brain may need no noise. Neural Netw 2019; 119:200-213. [PMID: 31450073] [DOI: 10.1016/j.neunet.2019.08.002]
Abstract
An increasing body of evidence suggests that the trial-to-trial variability of spiking activity in the brain is not mere noise, but rather the reflection of a sampling-based encoding scheme for probabilistic computing. Since the precise statistical properties of neural activity are important in this context, many models assume an ad-hoc source of well-behaved, explicit noise, either on the input or on the output side of single neuron dynamics, most often assuming an independent Poisson process in either case. However, these assumptions are somewhat problematic: neighboring neurons tend to share receptive fields, rendering both their input and their output correlated; at the same time, neurons are known to behave largely deterministically, as a function of their membrane potential and conductance. We suggest that spiking neural networks may have no need for noise to perform sampling-based Bayesian inference. We study analytically the effect of auto- and cross-correlations in functional Bayesian spiking networks and demonstrate how their effect translates to synaptic interaction strengths, rendering them controllable through synaptic plasticity. This allows even small ensembles of interconnected deterministic spiking networks to simultaneously and co-dependently shape their output activity through learning, enabling them to perform complex Bayesian computation without any need for noise, which we demonstrate in silico, both in classical simulation and in neuromorphic emulation. These results close a gap between the abstract models and the biology of functionally Bayesian spiking networks, effectively reducing the architectural constraints imposed on physical neural substrates required to perform probabilistic computing, be they biological or artificial.
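The sampling framework the paper starts from can be made concrete with ordinary Gibbs sampling over binary units z with P(z) proportional to exp(b·z + z·W·z/2), each unit firing with logistic probability of its "membrane potential". The paper's contribution is showing that deterministic networks can emulate such updates without explicit noise; the sketch below uses an explicit random number generator for clarity and is our illustration, not the authors' network:

```python
import numpy as np

def gibbs_marginals(W, b, n_steps=60000, seed=5):
    """Estimate marginal firing probabilities of binary units z ~ exp(b.z + z.W.z/2)."""
    rng = np.random.default_rng(seed)
    n = len(b)
    z = np.zeros(n)
    counts = np.zeros(n)
    for step in range(n_steps):
        i = step % n                                # cycle through the units
        u = W[i] @ z + b[i]                         # abstract membrane potential
        z[i] = float(rng.random() < 1.0/(1.0 + np.exp(-u)))
        counts += z
    return counts / n_steps

# two uncoupled units: marginals should match the logistic function of their biases
marg = gibbs_marginals(np.zeros((2, 2)), np.array([2.0, 0.0]))
```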
Affiliation(s)
- Dominik Dold
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany; Department of Physiology, University of Bern, Bühlplatz 5, CH-3012 Bern, Switzerland
- Ilja Bytschok
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany
- Akos F Kungl
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany; Department of Physiology, University of Bern, Bühlplatz 5, CH-3012 Bern, Switzerland
- Andreas Baumbach
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany
- Oliver Breitwieser
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany
- Walter Senn
- Department of Physiology, University of Bern, Bühlplatz 5, CH-3012 Bern, Switzerland
- Johannes Schemmel
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany
- Karlheinz Meier
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany
- Mihai A Petrovici
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany; Department of Physiology, University of Bern, Bühlplatz 5, CH-3012 Bern, Switzerland
10
Nolte M, Reimann MW, King JG, Markram H, Muller EB. Cortical reliability amid noise and chaos. Nat Commun 2019; 10:3792. [PMID: 31439838] [PMCID: PMC6706377] [DOI: 10.1038/s41467-019-11633-8]
Abstract
Typical responses of cortical neurons to identical sensory stimuli appear highly variable. It has thus been proposed that the cortex primarily uses a rate code. However, other studies have argued for spike-time coding under certain conditions. The potential role of spike-time coding is directly limited by the internally generated variability of cortical circuits, which remains largely unexplored. Here, we quantify this internally generated variability using a biophysical model of rat neocortical microcircuitry with biologically realistic noise sources. We find that stochastic neurotransmitter release is a critical component of internally generated variability, causing rapidly diverging, chaotic recurrent network dynamics. Surprisingly, the same nonlinear recurrent network dynamics can transiently overcome the chaos in response to weak feed-forward thalamocortical inputs, and support reliable spike times with millisecond precision. Our model shows that the noisy and chaotic network dynamics of recurrent cortical microcircuitry are compatible with stimulus-evoked, millisecond spike-time reliability, resolving a long-standing debate.
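The chaotic divergence the authors quantify in a large biophysical model can be illustrated at toy scale by estimating a Lyapunov exponent, here for the logistic map rather than a neural circuit (purely a didactic stand-in, not the paper's method): a positive exponent means perturbations grow exponentially, a negative one means they are forgotten.

```python
import numpy as np

def lyapunov_logistic(r, x0=0.2, n=5000):
    """Average log-derivative along a logistic-map orbit x -> r*x*(1-x)."""
    x, acc = x0, 0.0
    for _ in range(n):
        x = r*x*(1.0 - x)
        acc += np.log(abs(r*(1.0 - 2.0*x)))
    return acc/n

lam_chaotic = lyapunov_logistic(4.0)   # chaotic regime: positive exponent (~ln 2)
lam_stable = lyapunov_logistic(2.5)    # stable fixed point: negative exponent
```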
Affiliation(s)
- Max Nolte
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Michael W Reimann
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- James G King
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Henry Markram
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Laboratory of Neural Microcircuitry, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
- Eilif B Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
11
Dvir H, Kantelhardt JW, Zinkhan M, Pillmann F, Szentkiralyi A, Obst A, Ahrens W, Bartsch RP. A Biased Diffusion Approach to Sleep Dynamics Reveals Neuronal Characteristics. Biophys J 2019; 117:987-997. [PMID: 31422824] [DOI: 10.1016/j.bpj.2019.07.032]
Abstract
We propose a biased diffusion model of accumulated subthreshold voltage fluctuations in wake-promoting neurons to account for stochasticity in sleep dynamics and to explain the occurrence of brief arousals during sleep. Utilizing this model, we derive four neurophysiological parameters related to neuronal noise level, excitability threshold, deep-sleep threshold, and sleep inertia. We provide the first analytic expressions for these parameters, and we show that there is good agreement between empirical findings from sleep recordings and our model simulation results. Our work suggests that these four parameters can be of clinical importance because we find them to be significantly altered in elderly subjects and in children with autism.
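The model's core mechanism — accumulated subthreshold fluctuations drifting against a bias until they cross an excitability threshold — can be sketched as a biased random walk with a reflecting deep-sleep boundary; a stronger bias (deeper sleep drive) yields fewer arousals. Parameters are our illustrative choices, not the fitted neurophysiological values of the paper:

```python
import numpy as np

def count_arousals(bias, sigma=1.0, threshold=5.0, n_steps=100_000, seed=3):
    """Arousals = crossings of the excitability threshold by a biased random walk."""
    rng = np.random.default_rng(seed)
    x, arousals = 0.0, 0
    for _ in range(n_steps):
        x += -bias + sigma*rng.standard_normal()
        if x < 0.0:
            x = 0.0            # reflecting boundary: deep-sleep floor
        elif x > threshold:
            arousals += 1
            x = 0.0            # reset after an arousal
    return arousals

a_shallow = count_arousals(0.1)   # weak sleep drive: frequent arousals
a_deep = count_arousals(0.3)      # strong sleep drive: rarer arousals
```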
Affiliation(s)
- Hila Dvir
- Department of Physics, Bar-Ilan University, Ramat-Gan, Israel
- Jan W Kantelhardt
- Institute of Physics, Martin-Luther-University Halle-Wittenberg, Halle, Germany
- Melanie Zinkhan
- Institute of Clinical Epidemiology, Martin-Luther-University Halle-Wittenberg, Halle, Germany
- Frank Pillmann
- Department of Psychiatry and Psychotherapy, Martin-Luther-University Halle-Wittenberg, Halle, Germany
- Andras Szentkiralyi
- Institute of Epidemiology and Social Medicine, University of Münster, Münster, Germany
- Anne Obst
- Department of Internal Medicine B, Ernst-Moritz-Arndt University Greifswald, Greifswald, Germany
- Wolfgang Ahrens
- Leibniz Institute for Prevention Research and Epidemiology, Bremen, Germany
- Ronny P Bartsch
- Department of Physics, Bar-Ilan University, Ramat-Gan, Israel
12
Melanson A, Longtin A. Data-driven inference for stationary jump-diffusion processes with application to membrane voltage fluctuations in pyramidal neurons. J Math Neurosci 2019; 9:6. [PMID: 31350644] [PMCID: PMC6660545] [DOI: 10.1186/s13408-019-0074-3]
Abstract
The emergent activity of biological systems can often be represented as low-dimensional, Langevin-type stochastic differential equations. In certain systems, however, large and abrupt events occur and violate the assumptions of this approach. We address this situation here by providing a novel method that reconstructs a jump-diffusion stochastic process based solely on the statistics of the original data. Our method assumes that these data are stationary, that diffusive noise is additive, and that jumps are Poisson. We use threshold-crossing of the increments to detect jumps in the time series. This is followed by an iterative scheme that compensates for the presence of diffusive fluctuations that are falsely detected as jumps. Our approach is based on probabilistic calculations associated with these fluctuations and on the use of the Fokker-Planck and the differential Chapman-Kolmogorov equations. After some validation cases, we apply this method to recordings of membrane noise in pyramidal neurons of the electrosensory lateral line lobe of weakly electric fish. These recordings display large, jump-like depolarization events that occur at random times, the biophysics of which is unknown. We find that some pyramidal cells increase their jump rate and noise intensity as the membrane potential approaches spike threshold, while their drift function and jump amplitude distribution remain unchanged. As our method is fully data-driven, it provides a valuable means to further investigate the functional role of these jump-like events without relying on unconstrained biophysical models.
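The first stage of the method — detecting jumps by threshold-crossing of the increments — is easy to illustrate on synthetic data: diffusive increments scale with sqrt(dt), so jumps of fixed size stand far outside the diffusive band. A sketch with made-up parameters (the full method adds the iterative false-positive compensation described above):

```python
import numpy as np

rng = np.random.default_rng(7)
n, dt = 50_000, 0.001
sigma, jump_size, jump_rate = 0.5, 2.0, 5.0   # diffusion strength, jump amplitude, jumps per unit time

# increments of a jump-diffusion: Gaussian diffusive part plus rare Poisson jumps
jumps = rng.random(n) < jump_rate*dt
dx = sigma*np.sqrt(dt)*rng.standard_normal(n) + jump_size*jumps

# detect jumps as increments far outside the diffusive scale
diffusive_std = np.std(dx[np.abs(dx) < 0.5])  # crude trim: exclude obvious jumps
detected = np.abs(dx) > 6.0*diffusive_std
```

At this separation of scales the detector recovers the planted jumps; real membrane noise blurs the boundary, which is exactly what the paper's iterative correction addresses.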
Affiliation(s)
- Alexandre Melanson
- Department of Physics, University of Ottawa, Ottawa, Canada
- Département de physique et d'astronomie, Université de Moncton, Moncton, Canada
- André Longtin
- Department of Physics, University of Ottawa, Ottawa, Canada
- Centre for Neural Dynamics, University of Ottawa, Ottawa, Canada
- Brain and Mind Research Institute, University of Ottawa, Ottawa, Canada
13
Correlation Transfer by Layer 5 Cortical Neurons Under Recreated Synaptic Inputs In Vitro. J Neurosci 2019; 39:7648-7663. [PMID: 31346031] [DOI: 10.1523/jneurosci.3169-18.2019]
Abstract
Correlated electrical activity in neurons is a prominent characteristic of cortical microcircuits. Despite a growing amount of evidence concerning both spike-count and subthreshold membrane potential pairwise correlations, little is known about how different types of cortical neurons convert correlated inputs into correlated outputs. We studied pyramidal neurons and two classes of GABAergic interneurons of layer 5 in neocortical brain slices obtained from rats of both sexes, and we stimulated them with biophysically realistic correlated inputs, generated using dynamic clamp. We found that the physiological differences between cell types manifested unique features in their capacity to transfer correlated inputs. We used linear response theory and computational modeling to gain clear insights into how cellular properties determine both the gain and timescale of correlation transfer, thus tying single-cell features to network interactions. Our results provide further ground for the functionally distinct roles played by various types of neuronal cells in the cortical microcircuit. SIGNIFICANCE STATEMENT: No matter how we probe the brain, we find correlated neuronal activity over a variety of spatial and temporal scales. For the cerebral cortex, significant evidence has accumulated on trial-to-trial covariability in synaptic input activation, subthreshold membrane potential fluctuations, and output spike trains. Although we do not yet fully understand their origin and whether they are detrimental or beneficial for information processing, we believe that clarifying how correlations emerge is pivotal for understanding large-scale neuronal network dynamics and computation. Here, we report quantitative differences between excitatory and inhibitory cells as they relay input correlations into output correlations. We explain this heterogeneity with simple biophysical models and provide the most experimentally validated test of a theory for the emergence of correlations.
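The basic quantity at stake — how much input correlation survives the spike threshold — can be sketched with two threshold units sharing part of their Gaussian input: for high thresholds the output correlation falls well below the input correlation. The numbers below are illustrative assumptions, not the paper's recreated synaptic inputs:

```python
import numpy as np

rng = np.random.default_rng(2)
n, rho_in, thresh = 100_000, 0.5, 1.5

shared = rng.standard_normal(n)                     # common input to both cells
x1 = np.sqrt(rho_in)*shared + np.sqrt(1 - rho_in)*rng.standard_normal(n)
x2 = np.sqrt(rho_in)*shared + np.sqrt(1 - rho_in)*rng.standard_normal(n)

s1 = (x1 > thresh).astype(float)                    # binarized 'spike' outputs
s2 = (x2 > thresh).astype(float)
rho_out = float(np.corrcoef(s1, s2)[0, 1])          # output correlation < rho_in
```

The gain and timescale of this transfer depend on cell-intrinsic properties, which is the heterogeneity the paper quantifies across cell types.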
14
Morro A, Canals V, Oliver A, Alomar ML, Galán-Prado F, Ballester PJ, Rosselló JL. A Stochastic Spiking Neural Network for Virtual Screening. IEEE Trans Neural Netw Learn Syst 2018; 29:1371-1375. [PMID: 28186913] [DOI: 10.1109/tnnls.2017.2657601]
Abstract
Virtual screening (VS) has become a key computational tool in early drug design, and screening performance is highly relevant because of the large volume of data that must be processed to identify molecules with the sought activity-related pattern. At the same time, hardware implementations of spiking neural networks (SNNs) are emerging as a computing technique that can parallelize processes that normally carry a high cost in computing time and power. Consequently, SNNs represent an attractive alternative for time-consuming processing tasks such as VS. In this brief, we present a stochastic spiking neural architecture that implements the ultrafast shape recognition (USR) algorithm, achieving two orders of magnitude of speed improvement with respect to USR software implementations. The neural system is implemented in hardware using field-programmable gate arrays, allowing a highly parallelized USR implementation. The results show that, owing to the high parallelization of the system, millions of compounds can be checked in reasonable times. From these results, we can state that the proposed architecture is a feasible methodology to efficiently enhance time-consuming data-mining processes such as 3-D molecular similarity search.
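For orientation, the USR descriptor accelerated here is itself simple: the first three moments of the atomic distance distributions to four reference points (centroid, atom closest to it, atom farthest from it, atom farthest from that one), compared by an inverse-L1 similarity. Our reading of the standard algorithm, sketched on random coordinates standing in for a conformer:

```python
import numpy as np

def usr_descriptor(coords):
    """12-dimensional ultrafast shape recognition descriptor of a conformer."""
    ctd = coords.mean(axis=0)                              # centroid
    d_ctd = np.linalg.norm(coords - ctd, axis=1)
    cst = coords[np.argmin(d_ctd)]                         # closest atom to centroid
    fct = coords[np.argmax(d_ctd)]                         # farthest atom from centroid
    ftf = coords[np.argmax(np.linalg.norm(coords - fct, axis=1))]  # farthest from fct
    desc = []
    for ref in (ctd, cst, fct, ftf):
        d = np.linalg.norm(coords - ref, axis=1)
        mu, sd = d.mean(), d.std()
        skew = ((d - mu)**3).mean()/sd**3 if sd > 0 else 0.0
        desc += [mu, sd, skew]                             # mean, spread, asymmetry
    return np.array(desc)

def usr_similarity(a, b):
    """1 for identical shapes, decreasing toward 0 as descriptors diverge."""
    return float(1.0/(1.0 + np.abs(usr_descriptor(a) - usr_descriptor(b)).mean()))

mol = np.random.default_rng(0).standard_normal((25, 3))   # stand-in 'molecule'
sim_self = usr_similarity(mol, mol)
sim_scaled = usr_similarity(mol, 2.0*mol)                 # inflated copy: different shape
```

Because the descriptor is only 12 numbers per molecule and the comparison is a cheap L1 distance, the per-compound work maps naturally onto the massively parallel FPGA fabric the brief describes.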
Collapse
|
15
|
Dvir H, Elbaz I, Havlin S, Appelbaum L, Ivanov PC, Bartsch RP. Neuronal noise as an origin of sleep arousals and its role in sudden infant death syndrome. SCIENCE ADVANCES 2018; 4:eaar6277. [PMID: 29707639 PMCID: PMC5916514 DOI: 10.1126/sciadv.aar6277] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 11/30/2017] [Accepted: 03/08/2018] [Indexed: 06/08/2023]
Abstract
In addition to regular sleep/wake cycles, humans and animals exhibit brief arousals from sleep. Although much is known about consolidated sleep and wakefulness, the mechanism that triggers arousals remains enigmatic. Here, we argue that arousals are caused by the intrinsic neuronal noise of wake-promoting neurons. We propose a model that simulates the superposition of the noise from a group of neurons, and show that, occasionally, the superposed noise exceeds the excitability threshold and provokes an arousal. Because neuronal noise decreases with increasing temperature, our model predicts arousal frequency to decrease as well. To test this prediction, we perform experiments on the sleep/wake behavior of zebrafish larvae and find that increasing water temperatures lead to fewer and shorter arousals, as predicted by our analytic derivations and model simulations. Our findings indicate a previously unrecognized neurophysiological mechanism that links sleep arousals with temperature regulation, and may explain the origin of the clinically observed higher risk for sudden infant death syndrome with increased ambient temperature.
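The proposed mechanism (superposed neuronal noise occasionally exceeding an excitability threshold, with noise amplitude falling as temperature rises) can be sketched in a few lines. The neuron count, threshold, and noise amplitudes below are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_arousals(n_neurons=100, n_steps=20_000, noise_sd=1.0, threshold=25.0):
    """Count threshold crossings of the superposed noise of a neuronal group.

    Each time step, the independent Gaussian noise of n_neurons cells is
    summed; an 'arousal' is scored at each upward crossing of the
    excitability threshold.
    """
    summed = rng.normal(0.0, noise_sd, size=(n_steps, n_neurons)).sum(axis=1)
    above = summed > threshold
    return int(np.sum(above[1:] & ~above[:-1]))  # rising edges only

# Higher temperature is modeled as lower neuronal noise amplitude,
# so the model predicts fewer arousals when the water is warmer.
cool = count_arousals(noise_sd=1.0)
warm = count_arousals(noise_sd=0.85)
```

Even a modest reduction in per-neuron noise strongly suppresses the rate of threshold crossings, because crossings live in the tail of the summed-noise distribution, which matches the paper's prediction of fewer, shorter arousals at higher temperature.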
Collapse
Affiliation(s)
- Hila Dvir
- Department of Physics, Bar-Ilan University, Ramat Gan, Israel
| | - Idan Elbaz
- The Mina and Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
| | - Shlomo Havlin
- Department of Physics, Bar-Ilan University, Ramat Gan, Israel
| | - Lior Appelbaum
- The Mina and Everard Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
| | - Plamen Ch. Ivanov
- Keck Laboratory for Network Physiology, Department of Physics, Boston University, Boston, MA 02215, USA
- Harvard Medical School and Division of Sleep Medicine, Brigham and Women’s Hospital, Boston, MA 02115, USA
- Institute of Solid State Physics, Bulgarian Academy of Sciences, Sofia, Bulgaria
| | | |
Collapse
|
16
|
Liu Y, Yue Y, Yu Y, Liu L, Yu L. Effects of channel blocking on information transmission and energy efficiency in squid giant axons. J Comput Neurosci 2018; 44:219-231. [PMID: 29327161 DOI: 10.1007/s10827-017-0676-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2016] [Revised: 11/18/2017] [Accepted: 12/11/2017] [Indexed: 11/25/2022]
Abstract
Action potentials are the information carriers of neural systems. The generation of action potentials involves the cooperative opening and closing of sodium and potassium channels. This process is metabolically expensive because the ions flowing through open channels need to be restored to maintain concentration gradients of these ions. Toxins like tetraethylammonium can block working ion channels, thus affecting the function and energy cost of neurons. In this paper, by computer simulation of the Hodgkin-Huxley neuron model, we studied the effects of channel blocking with toxins on the information transmission and energy efficiency in squid giant axons. We found that gradually blocking sodium channels will sequentially maximize the information transmission and energy efficiency of the axons, whereas moderate blocking of potassium channels will have little impact on the information transmission and will decrease the energy efficiency. Heavy blocking of potassium channels will cause self-sustained oscillation of membrane potentials. Simultaneously blocking sodium and potassium channels with the same ratio increases both information transmission and energy efficiency. Our results are in line with previous studies suggesting that information processing capacity and energy efficiency can be maximized by regulating the number of active ion channels, and this indicates a viable avenue for future experimentation.
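As a rough illustration of the blocking experiments, here is a minimal deterministic Hodgkin-Huxley simulation (forward Euler, standard squid-axon parameters) in which toxin block is modeled by scaling the maximal conductances. This is only a sketch of the setup; the paper's stochastic analysis of information rates and energy efficiency is much richer.

```python
import numpy as np

# Standard Hodgkin-Huxley squid-axon parameters (mS/cm^2, mV, uF/cm^2).
C, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.4

def rates(V):
    """Voltage-dependent opening/closing rates (1/ms) of the n, m, h gates."""
    an = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(V + 65.0) / 80.0)
    am = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

def spike_count(block_na=0.0, block_k=0.0, I=10.0, T=200.0, dt=0.01):
    """Spikes fired in T ms with fractions block_na / block_k of the Na and K
    channels blocked, modeled by scaling the maximal conductances."""
    g_na, g_k = G_NA * (1.0 - block_na), G_K * (1.0 - block_k)
    V, n, m, h = -65.0, 0.3177, 0.0529, 0.5961  # resting state
    spikes, in_spike = 0, False
    for _ in range(int(T / dt)):
        an, bn, am, bm, ah, bh = rates(V)
        n += dt * (an * (1.0 - n) - bn * n)
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        I_ion = (g_na * m**3 * h * (V - E_NA) + g_k * n**4 * (V - E_K)
                 + G_L * (V - E_L))
        V += dt * (I - I_ion) / C
        if V > 0.0 and not in_spike:
            spikes, in_spike = spikes + 1, True
        elif V < -40.0:
            in_spike = False
    return spikes
```

With the illustrative drive `I=10.0`, the unblocked model fires tonically, while heavy sodium block (e.g. `spike_count(block_na=0.95)`) essentially silences it, the qualitative starting point for the trade-offs the paper quantifies.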
Collapse
Affiliation(s)
- Yujiang Liu
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, 730000, China
| | - Yuan Yue
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, 730000, China
- College of Electrical Engineering, Northwest University for Nationalities, Lanzhou, 730070, China
| | - Yuguo Yu
- School of Life Science and the Collaborative Innovation Center for Brain Science, Center for Computational Systems Biology, Fudan University, Shanghai Shi, 200433, China
| | - Liwei Liu
- College of Electrical Engineering, Northwest University for Nationalities, Lanzhou, 730070, China
| | - Lianchun Yu
- Institute of Theoretical Physics, Lanzhou University, Lanzhou, 730000, China.
| |
Collapse
|
17
|
Beiran M, Kruscha A, Benda J, Lindner B. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations. J Comput Neurosci 2017; 44:189-202. [PMID: 29222729 DOI: 10.1007/s10827-017-0674-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Revised: 11/08/2017] [Accepted: 11/12/2017] [Indexed: 11/29/2022]
Abstract
We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: i) a homogeneous population whose units receive independent noise and ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons unveil that a non-zero amount of noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can be as beneficial for neuronal populations as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus, such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions of the coherence function for the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
Collapse
Affiliation(s)
- Manuel Beiran
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Group for Neural Theory, Laboratoire de Neurosciences Cognitives, Département Études Cognitives, École Normale Supérieure, INSERM, PSL Research University, Paris, France
| | - Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Jan Benda
- Institute for Neurobiology, Eberhard Karls Universität, Tübingen, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
| |
Collapse
|
18
|
Yu L, Yu Y. Energy-efficient neural information processing in individual neurons and neuronal networks. J Neurosci Res 2017; 95:2253-2266. [PMID: 28833444 DOI: 10.1002/jnr.24131] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2016] [Revised: 07/07/2017] [Accepted: 07/10/2017] [Indexed: 12/22/2022]
Abstract
Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved energy-efficient neural codes, from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to an energy-efficient neural code for processing input signals. These factors include ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy and space costs. Individual neurons within a network may encode independent stimulus components, allowing a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.
Collapse
Affiliation(s)
- Lianchun Yu
- Institute of Theoretical Physics, Key Laboratory for Magnetism and Magnetic Materials of the Ministry of Education, Lanzhou University, Lanzhou, China
| | - Yuguo Yu
- School of Life Science and the State Key Laboratory of Medical Neurobiology, Institutes of Brain Science and the Collaborative Innovation Center for Brain Science, Fudan University, Shanghai, China
| |
Collapse
|
19
|
Singh C, Levy WB. A consensus layer V pyramidal neuron can sustain interpulse-interval coding. PLoS One 2017; 12:e0180839. [PMID: 28704450 PMCID: PMC5509228 DOI: 10.1371/journal.pone.0180839] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2017] [Accepted: 06/22/2017] [Indexed: 11/19/2022] Open
Abstract
In terms of a single neuron's long-distance communication, interpulse intervals (IPIs) are an attractive alternative to rate and binary codes. As a proxy for an IPI, a neuron's time-to-spike can be found in the biophysical and experimental intracellular literature. Using the current, consensus layer V pyramidal neuron, the present study examines the feasibility of IPI coding and the noise sources that limit the information rate of such an encoding. In descending order of importance, the noise sources are (i) synaptic variability, (ii) sodium channel shot-noise, followed by (iii) thermal noise. The biophysical simulations allow the calculation of mutual information, which is about 3.0 bits/spike. More importantly, while, by any conventional definition, the biophysical model is highly nonlinear, the underlying function that relates input intensity to the defined output variable is linear. When one assumes the perspective of a neuron coding via first hitting-time, this result justifies a pervasive and simplifying assumption of computational modelers: that a class of cortical neurons can be treated as linearly additive, computational devices.
Collapse
Affiliation(s)
- Chandan Singh
- Departments of Neurosurgery and of Psychology, University of Virginia, Charlottesville, VA, United States of America
| | - William B. Levy
- Departments of Neurosurgery and of Psychology, University of Virginia, Charlottesville, VA, United States of America
| |
Collapse
|
20
|
Pakdaman K, Thieullen M, Wainrib G. Fluid limit theorems for stochastic hybrid systems with application to neuron models. ADV APPL PROBAB 2016. [DOI: 10.1239/aap/1282924062] [Citation(s) in RCA: 59] [Impact Index Per Article: 7.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
In this paper we establish limit theorems for a class of stochastic hybrid systems (continuous deterministic dynamics coupled with jump Markov processes) in the fluid limit (small jumps at high frequency), thus extending known results for jump Markov processes. We prove a functional law of large numbers with exponential convergence speed, derive a diffusion approximation, and establish a functional central limit theorem. We apply these results to neuron models with stochastic ion channels, as the number of channels goes to infinity, estimating the convergence to the deterministic model. In terms of neural coding, we apply our central limit theorems to numerically estimate the impact of channel noise both on frequency and spike timing coding.
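For a single gating variable with generic opening and closing rates (schematic notation, not the paper's own formalism), the diffusion approximation the abstract refers to takes the familiar Langevin form:

```latex
\frac{dx}{dt} = \alpha(V)\,(1-x) - \beta(V)\,x
  + \frac{1}{\sqrt{N}}\,\sqrt{\alpha(V)\,(1-x) + \beta(V)\,x}\;\xi(t),
```

where $N$ is the number of channels, $x$ the open fraction, and $\xi(t)$ Gaussian white noise. As $N \to \infty$ the noise term vanishes and the deterministic kinetics are recovered (the functional law of large numbers), while the $1/\sqrt{N}$ fluctuations correspond to the central-limit regime used to quantify channel-noise effects on frequency and spike-timing coding.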
Collapse
|
22
|
Rosselló JL, Alomar ML, Morro A, Oliver A, Canals V. High-Density Liquid-State Machine Circuitry for Time-Series Forecasting. Int J Neural Syst 2016; 26:1550036. [DOI: 10.1142/s0129065715500367] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
Spiking neural networks (SNN) are the latest generation of neural networks, designed to mimic the real behavior of biological neurons. Although most research in this area is done through software applications, it is in hardware implementations that the intrinsic parallelism of these computing systems is exploited most efficiently. Liquid state machines (LSM) have arisen as a strategic technique to implement recurrent designs of SNN with a simple learning methodology. In this work, we show a new low-cost methodology to implement high-density LSM by using Boolean gates. The proposed method is based on the use of probabilistic computing concepts to reduce hardware requirements, thus considerably increasing the neuron count per chip. The result is a highly functional system that is applied to high-speed time series forecasting.
Collapse
Affiliation(s)
- Josep L. Rosselló
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Cra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
| | - Miquel L. Alomar
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Cra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
| | - Antoni Morro
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Cra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
| | - Antoni Oliver
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Cra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
| | - Vincent Canals
- Electronics Engineering Group, Physics Department, Universitat de les Illes Balears, Mateu Orfila Building, Cra. Valldemossa km. 7.5, Palma de Mallorca, Balears 07122, Spain
| |
Collapse
|
23
|
Ling A, Huang Y, Shuai J, Lan Y. Channel based generating function approach to the stochastic Hodgkin-Huxley neuronal system. Sci Rep 2016; 6:22662. [PMID: 26940002 PMCID: PMC4778126 DOI: 10.1038/srep22662] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/08/2015] [Accepted: 02/09/2016] [Indexed: 11/09/2022] Open
Abstract
Internal and external fluctuations, such as channel noise and synaptic noise, contribute to the generation of spontaneous action potentials in neurons. Many different Langevin approaches have been proposed to speed up the computation but with waning accuracy especially at small channel numbers. We apply a generating function approach to the master equation for the ion channel dynamics and further propose two accelerating algorithms, with an accuracy close to the Gillespie algorithm but with much higher efficiency, opening the door for expedited simulation of noisy action potential propagating along axons or other types of noisy signal transduction.
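For reference, the exact Gillespie scheme that such accelerated algorithms are benchmarked against can be written compactly for a population of two-state channels. The rates and channel count below are illustrative, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

def gillespie_two_state(N=20, alpha=0.5, beta=0.2, t_end=50.0):
    """Exact (Gillespie) simulation of N independent two-state ion channels.

    Channels open at rate alpha and close at rate beta (1/ms). Returns
    event times and the number of open channels after each event. This is
    the reference against which Langevin-type approximations are compared.
    """
    t, n_open = 0.0, 0
    times, opens = [0.0], [0]
    while True:
        r_open = alpha * (N - n_open)   # propensity: closed -> open
        r_close = beta * n_open         # propensity: open -> closed
        r_tot = r_open + r_close
        t += rng.exponential(1.0 / r_tot)  # exponential waiting time
        if t > t_end:
            break
        n_open += 1 if rng.random() * r_tot < r_open else -1
        times.append(t)
        opens.append(n_open)
    return np.array(times), np.array(opens)

times, opens = gillespie_two_state()
# At stationarity the mean open count approaches N * alpha / (alpha + beta).
```

Event-driven exactness is also what makes Gillespie slow for large channel numbers: the event count grows with N, which is the cost that approaches like the generating-function method aim to avoid.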
Collapse
Affiliation(s)
- Anqi Ling
- Department of Physics, Tsinghua University, Beijing 100084, China; Collaborative Innovation Center of Quantum Matter, Beijing 100084, China
| | - Yandong Huang
- Department of Physics and Institute of Theoretical Physics and Astrophysics, Xiamen University, Xiamen 361005, China
| | - Jianwei Shuai
- Department of Physics and Institute of Theoretical Physics and Astrophysics, Xiamen University, Xiamen 361005, China
| | - Yueheng Lan
- Department of Physics, Tsinghua University, Beijing 100084, China; Collaborative Innovation Center of Quantum Matter, Beijing 100084, China
| |
Collapse
|
24
|
Brooks HA, Bressloff PC. Quasicycles in the stochastic hybrid Morris-Lecar neural model. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:012704. [PMID: 26274200 DOI: 10.1103/physreve.92.012704] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/02/2015] [Indexed: 06/04/2023]
Abstract
Intrinsic noise arising from the stochastic opening and closing of voltage-gated ion channels has been shown experimentally and mathematically to have important effects on a neuron's function. Study of classical neuron models with stochastic ion channels is becoming increasingly important, especially in understanding a cell's ability to produce subthreshold oscillations and to respond to weak periodic stimuli. While it is known that stochastic models can produce oscillations (quasicycles) in parameter regimes where the corresponding deterministic model has only a stable fixed point, little analytical work has been done to explore these connections within the context of channel noise. Using a stochastic hybrid Morris-Lecar (ML) model, we combine a system-size expansion in K+ and a quasi-steady-state (QSS) approximation in persistent Na+ in order to derive an effective Langevin equation that preserves the low-dimensional (planar) structure of the underlying deterministic ML model. (The QSS analysis exploits the fact that persistent Na+ channels are fast.) By calculating the corresponding power spectrum, we determine analytically how noise significantly extends the parameter regime in which subthreshold oscillations occur.
Collapse
Affiliation(s)
- Heather A Brooks
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| | - Paul C Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| |
Collapse
|
25
|
Eberhard MJB, Schleimer JH, Schreiber S, Ronacher B. A temperature rise reduces trial-to-trial variability of locust auditory neuron responses. J Neurophysiol 2015; 114:1424-37. [PMID: 26041833 DOI: 10.1152/jn.00980.2014] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2014] [Accepted: 06/03/2015] [Indexed: 11/22/2022] Open
Abstract
The neurophysiology of ectothermic animals, such as insects, is affected by environmental temperature, as their body temperature fluctuates with ambient conditions. Changes in temperature alter properties of neurons and, consequently, have an impact on the processing of information. Nevertheless, nervous system function is often maintained over a broad temperature range, exhibiting a surprising robustness to variations in temperature. A special problem arises for acoustically communicating insects, as in these animals mate recognition and mate localization typically rely on the decoding of fast amplitude modulations in calling and courtship songs. In the auditory periphery, however, temporal resolution is constrained by intrinsic neuronal noise. Such noise predominantly arises from the stochasticity of ion channel gating and potentially impairs the processing of sensory signals. On the basis of intracellular recordings of locust auditory neurons, we show that intrinsic neuronal variability on the level of spikes is reduced with increasing temperature. We use a detailed mathematical model including stochastic ion channel gating to shed light on the underlying biophysical mechanisms in auditory receptor neurons: because of a redistribution of channel-induced current noise toward higher frequencies and specifics of the temperature dependence of the membrane impedance, membrane potential noise is indeed reduced at higher temperatures. This finding holds under generic conditions and physiologically plausible assumptions on the temperature dependence of the channels' kinetics and peak conductances. We demonstrate that the identified mechanism also can explain the experimentally observed reduction of spike timing variability at higher temperatures.
Collapse
Affiliation(s)
- Monika J B Eberhard
- Department of Biology, Behavioural Physiology Group, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Jan-Hendrik Schleimer
- Department of Biology, Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Berlin, Germany; and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Susanne Schreiber
- Department of Biology, Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Berlin, Germany; and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Bernhard Ronacher
- Department of Biology, Behavioural Physiology Group, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| |
Collapse
|
26
|
Paffi A, Camera F, Apollonio F, d'Inzeo G, Liberti M. Restoring the encoding properties of a stochastic neuron model by an exogenous noise. Front Comput Neurosci 2015; 9:42. [PMID: 25999845 PMCID: PMC4422033 DOI: 10.3389/fncom.2015.00042] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2014] [Accepted: 03/19/2015] [Indexed: 11/13/2022] Open
Abstract
Here we evaluate the possibility of improving the encoding properties of an impaired neuronal system by superimposing an exogenous noise on an external electric stimulation signal. The approach is based on mathematical neuron models consisting of a stochastic Hodgkin-Huxley-like circuit, where the impairment of the endogenous presynaptic inputs is described as a subthreshold injected current and the exogenous stimulation signal is a sinusoidal voltage perturbation across the membrane. Our results indicate that a correlated Gaussian noise, added to the sinusoidal signal, can significantly increase the encoding properties of the impaired system through the Stochastic Resonance (SR) phenomenon. These results suggest that an exogenous noise, suitably tailored, could improve the efficacy of stimulation techniques used in neuronal systems where the presynaptic sensory neurons are impaired and have to be artificially bypassed.
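The stochastic-resonance effect exploited here can be reproduced with a far cruder detector than the paper's HH-like circuit: a subthreshold sinusoid plus Gaussian noise feeding a hard threshold. All parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def encoding_score(noise_sd, threshold=1.0, amp=0.6, n=50_000):
    """Correlation between a subthreshold sinusoid and the binary output of
    a hard-threshold detector driven by signal + Gaussian noise."""
    t = np.arange(n) * 0.01
    signal = amp * np.sin(2 * np.pi * 0.5 * t)  # subthreshold: amp < threshold
    out = (signal + rng.normal(0.0, noise_sd, n)) > threshold
    if out.all() or not out.any():
        return 0.0  # output carries no information about the signal
    return float(np.corrcoef(out.astype(float), signal)[0, 1])

scores = {sd: encoding_score(sd) for sd in (0.05, 0.4, 3.0)}
# Too little noise: the threshold is never reached; too much: the output
# is dominated by noise. An intermediate level maximizes signal transfer,
# the signature of stochastic resonance.
```

The non-monotonic dependence of the score on noise amplitude is the generic SR curve; the paper's contribution is showing that a suitably correlated exogenous noise can move an impaired system toward that optimum.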
Collapse
Affiliation(s)
- Alessandra Paffi
- Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Rome, Italy; Italian Inter-University Center for the Study of Electromagnetic Fields and Biological Systems, Genova, Italy
| | - Francesca Camera
- Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Rome, Italy; Italian Inter-University Center for the Study of Electromagnetic Fields and Biological Systems, Genova, Italy
| | - Francesca Apollonio
- Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Rome, Italy; Italian Inter-University Center for the Study of Electromagnetic Fields and Biological Systems, Genova, Italy
| | - Guglielmo d'Inzeo
- Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Rome, Italy; Italian Inter-University Center for the Study of Electromagnetic Fields and Biological Systems, Genova, Italy
| | - Micaela Liberti
- Department of Information Engineering, Electronics and Telecommunications, Sapienza University of Rome, Rome, Italy; Italian Inter-University Center for the Study of Electromagnetic Fields and Biological Systems, Genova, Italy
| |
Collapse
|
27
|
Kuenzel T, Nerlich J, Wagner H, Rübsamen R, Milenkovic I. Inhibitory properties underlying non-monotonic input-output relationship in low-frequency spherical bushy neurons of the gerbil. Front Neural Circuits 2015; 9:14. [PMID: 25873864 PMCID: PMC4379913 DOI: 10.3389/fncir.2015.00014] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2014] [Accepted: 03/11/2015] [Indexed: 02/03/2023] Open
Abstract
Spherical bushy cells (SBCs) of the anteroventral cochlear nucleus (AVCN) receive input from large excitatory auditory nerve (AN) terminals, the endbulbs of Held, and mixed glycinergic/GABAergic inhibitory inputs. The latter have sufficient potency to block action potential firing in vivo and in slice recordings. However, it is not clear how well the data from slice recordings match the inhibition in the intact brain and how it contributes to complex phenomena such as non-monotonic rate-level functions (RLF). Therefore, we determined the input-output relationship of a model SBC with simulated endbulb inputs and a dynamic inhibitory conductance constrained by recordings in brain slice preparations of hearing gerbils. Event arrival times from in vivo single-unit recordings in gerbils, where 70% of SBC showed non-monotonic RLF, were used as input for the model. Model output RLFs systematically changed from monotonic to non-monotonic shape with increasing strength of tonic inhibition. A limited range of inhibitory synaptic properties consistent with the slice data generated a good match between the model and recorded RLF. Moreover, tonic inhibition elevated the action potentials (AP) threshold and improved the temporal precision of output functions in a SBC model with phase-dependent input conductance. We conclude that activity-dependent, summating inhibition contributes to high temporal precision of SBC spiking by filtering out weak and poorly timed EPSP. Moreover, inhibitory parameters determined in slice recordings provide a good estimate of inhibitory mechanisms apparently active in vivo.
Collapse
Affiliation(s)
- Thomas Kuenzel
- Department of Zoology/Animal Physiology, Institute of Biology II, RWTH Aachen University, Aachen, Germany
| | - Jana Nerlich
- Faculty of Biosciences, Pharmacy and Psychology, Institute of Biology, University of Leipzig, Leipzig, Germany
| | - Hermann Wagner
- Department of Zoology/Animal Physiology, Institute of Biology II, RWTH Aachen University, Aachen, Germany
| | - Rudolf Rübsamen
- Faculty of Biosciences, Pharmacy and Psychology, Institute of Biology, University of Leipzig, Leipzig, Germany
| | - Ivan Milenkovic
- Faculty of Medicine, Carl Ludwig Institute for Physiology, University of Leipzig, Leipzig, Germany
| |
Collapse
|
28
|
O'Donnell C, van Rossum MCW. Systematic analysis of the contributions of stochastic voltage gated channels to neuronal noise. Front Comput Neurosci 2014; 8:105. [PMID: 25360105 PMCID: PMC4199219 DOI: 10.3389/fncom.2014.00105] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2014] [Accepted: 08/17/2014] [Indexed: 11/22/2022] Open
Abstract
Electrical signaling in neurons is mediated by the opening and closing of large numbers of individual ion channels. The ion channels' state transitions are stochastic and introduce fluctuations in the macroscopic current through ion channel populations. This creates an unavoidable source of intrinsic electrical noise for the neuron, leading to fluctuations in the membrane potential and spontaneous spikes. While this effect is well known, the impact of channel noise on single neuron dynamics remains poorly understood, and most results are based on numerical simulations. There is no agreement, even in theoretical studies, on which ion channel type is the dominant noise source, nor on how the inclusion of additional ion channel types affects voltage noise. Here we describe a framework to calculate voltage noise directly from an arbitrary set of ion channel models, and discuss how this can be used to estimate spontaneous spike rates.
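As a concrete example of the kind of quantity such a framework produces, the current-noise spectrum of a population of independent two-state channels is the classic Lorentzian. This is a textbook result, not the paper's general formula, and the parameter values below are illustrative.

```python
import numpy as np

def channel_noise_psd(f, n_channels, g_single, p_open, tau, driving):
    """One-sided Lorentzian power spectral density of the current noise from
    n_channels independent two-state channels.

    p_open  : stationary open probability
    tau     : channel relaxation time, 1/(alpha + beta), in seconds
    g_single, driving : single-channel conductance (S) and driving force V - E (V)
    """
    var_i = n_channels * (g_single * driving) ** 2 * p_open * (1.0 - p_open)
    f = np.asarray(f, dtype=float)
    return 4.0 * var_i * tau / (1.0 + (2.0 * np.pi * f * tau) ** 2)

# Example: 1000 channels, 20 pS each, p_open = 0.3, tau = 1 ms, 50 mV drive.
psd = channel_noise_psd([0.0, 100.0, 1000.0], 1000, 20e-12, 0.3, 1e-3, 0.05)
```

The spectrum is flat up to the corner frequency 1/(2*pi*tau) and falls as 1/f^2 beyond it; filtering such spectra through the membrane impedance is one route to the voltage-noise estimates the abstract describes.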
Collapse
Affiliation(s)
- Cian O'Donnell
- Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA, USA; School of Informatics, Institute for Adaptive and Neural Computation, University of Edinburgh, Edinburgh, UK
| | - Mark C W van Rossum
- School of Informatics, Institute for Adaptive and Neural Computation, University of Edinburgh, Edinburgh, UK
| |
Collapse
|
29
|
Rosselló JL, Canals V, Oliver A, Morro A. Studying the role of synchronized and chaotic spiking neural ensembles in neural information processing. Int J Neural Syst 2014; 24:1430003. [PMID: 24875785 DOI: 10.1142/s0129065714300034] [Citation(s) in RCA: 43] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
The brain is characterized by performing many diverse processing tasks, ranging from elaborate processes such as pattern recognition, memory or decision making to simpler functionalities such as linear filtering in image processing. Understanding the mechanisms by which the brain is able to produce such a diverse range of cortical operations remains a fundamental problem in neuroscience. Here we present a study of which processes are related to chaotic and synchronized states, based on in-silico implementations of Stochastic Spiking Neural Networks (SSNN). The measurements obtained reveal that chaotic neural ensembles are excellent transmission and convolution systems, since mutual information between signals is minimized. At the same time, synchronized cells (which can be understood as ordered states of the brain) can be associated with more complex nonlinear computations. In this sense, we experimentally show that complex and quick pattern recognition processes arise when both synchronized and chaotic states are mixed. These measurements are in accordance with in vivo observations related to the role of neural synchrony in pattern recognition and to the speed of the real biological process. We also suggest that the high-level adaptive mechanisms of the brain, such as the Hebbian and non-Hebbian learning rules, can be understood as processes devoted to generating the appropriate clustering of both synchronized and chaotic ensembles. The measurements obtained from the hardware implementation of different types of neural systems suggest that brain processing can be governed by the superposition of these two complementary states with complementary functionalities (nonlinear processing for synchronized states, and information convolution and parallelization for chaotic ones).
Affiliation(s)
- Josep L. Rosselló
- Physics Department, University of Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Majorca, 07122, Spain
- Vicens Canals
- Physics Department, University of Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Majorca, 07122, Spain
- Antoni Oliver
- Physics Department, University of Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Majorca, 07122, Spain
- Antoni Morro
- Physics Department, University of Balearic Islands, Cra. de Valldemossa, km 7.5, Palma de Majorca, 07122, Spain
30
A novel model incorporating two variability sources for describing motor evoked potentials. Brain Stimul 2014; 7:541-52. [PMID: 24794287 DOI: 10.1016/j.brs.2014.03.002] [Citation(s) in RCA: 42] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/17/2013] [Revised: 02/04/2014] [Accepted: 03/03/2014] [Indexed: 11/21/2022] Open
Abstract
OBJECTIVE Motor evoked potentials (MEPs) play a pivotal role in transcranial magnetic stimulation (TMS), e.g., for determining the motor threshold and probing cortical excitability. Sampled across the range of stimulation strengths, MEPs outline an input-output (IO) curve, which is often used to characterize the corticospinal tract. More detailed understanding of the signal generation and variability of MEPs would provide insight into the underlying physiology and aid correct statistical treatment of MEP data. METHODS A novel regression model is tested using measured IO data of twelve subjects. The model splits MEP variability into two independent contributions, acting on both sides of a strong sigmoidal nonlinearity that represents neural recruitment. Traditional sigmoidal regression with a single variability source after the nonlinearity is used for comparison. RESULTS The distribution of MEP amplitudes varied across different stimulation strengths, violating statistical assumptions in traditional regression models. In contrast to the conventional regression model, the dual variability source model better described the IO characteristics including phenomena such as changing distribution spread and skewness along the IO curve. CONCLUSIONS MEP variability is best described by two sources that most likely separate variability in the initial excitation process from effects occurring later on. The new model enables more accurate and sensitive estimation of the IO curve characteristics, enhancing its power as a detection tool, and may apply to other brain stimulation modalities. Furthermore, it extracts new information from the IO data concerning the neural variability-information that has previously been treated as noise.
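The dual-variability-source structure described above can be sketched in a few lines. This is a toy illustration only: the function and parameter names (`io_curve_sample`, `x_sd`, `y_sd`, and so on) and all numerical values are assumptions of mine, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def io_curve_sample(stim, n_trials, x_sd=0.05, y_sd=0.1,
                    slope=20.0, midpoint=0.5, vmax=5.0):
    """Draw simulated MEP amplitudes at one stimulation strength.

    Noise x acts *before* the sigmoidal recruitment nonlinearity,
    noise y acts *after* it (all names and values are illustrative).
    """
    x = rng.normal(0.0, x_sd, n_trials)            # excitation-side variability
    recruit = vmax / (1.0 + np.exp(-slope * (stim + x - midpoint)))
    y = rng.lognormal(0.0, y_sd, n_trials)         # output-side variability
    return recruit * y

# Near the steep midpoint of the sigmoid the input-side noise is
# amplified, so trial-to-trial spread (and skewness) is larger than
# on the saturated plateau -- the changing-distribution effect a
# single post-nonlinearity noise source cannot reproduce.
mep_mid = io_curve_sample(0.5, 5000)
mep_top = io_curve_sample(1.0, 5000)
```

Comparing `mep_mid` and `mep_top` shows why a single fixed-shape noise term violates the distributional assumptions of traditional sigmoidal regression.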
31
Krieg D, Triesch J. A unifying theory of synaptic long-term plasticity based on a sparse distribution of synaptic strength. Front Synaptic Neurosci 2014; 6:3. [PMID: 24624080 PMCID: PMC3941589 DOI: 10.3389/fnsyn.2014.00003] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2013] [Accepted: 02/13/2014] [Indexed: 11/30/2022] Open
Abstract
Long-term synaptic plasticity is fundamental to learning and network function. It has been studied under various induction protocols and depends on firing rates, membrane voltage, and the precise timing of action potentials. These protocols show different facets of a common underlying mechanism, but they are mostly modeled as distinct phenomena. Here, we show that all of these different dependencies can be explained from a single computational principle. The objective is a sparse distribution of excitatory synaptic strength, which may help to reduce the metabolic costs associated with synaptic transmission. Based on this objective we derive a stochastic gradient ascent learning rule which is of differential-Hebbian type. It is formulated in biophysical quantities and can be related to current mechanistic theories of synaptic plasticity. The learning rule accounts for experimental findings from all major induction protocols and explains a classic phenomenon of metaplasticity. Furthermore, our model predicts the existence of metaplasticity for spike-timing-dependent plasticity. Thus, we provide a theory of long-term synaptic plasticity that unifies different induction protocols and provides a connection between functional and mechanistic levels of description.
Affiliation(s)
- Daniel Krieg
- Frankfurt Institute for Advanced Studies, Goethe University Frankfurt, Germany
- Jochen Triesch
- Frankfurt Institute for Advanced Studies, Goethe University Frankfurt, Germany
32
Yu L, Liu L. Optimal size of stochastic Hodgkin-Huxley neuronal systems for maximal energy efficiency in coding pulse signals. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 89:032725. [PMID: 24730892 DOI: 10.1103/physreve.89.032725] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/11/2013] [Indexed: 06/03/2023]
Abstract
The generation and conduction of action potentials (APs) represent a fundamental means of communication in the nervous system and are a metabolically expensive process. In this paper, we investigate the energy efficiency of neural systems in transferring pulse signals with APs. By analytically solving a bistable neuron model that mimics AP generation with a particle crossing the barrier of a double well, we find the optimal number of ion channels that maximizes the energy efficiency of a neuron. We also investigate the energy efficiency of a neuron population in which the input pulse signals are represented with synchronized spikes and read out with a downstream coincidence detector neuron. We find an optimal number of neurons in the population, as well as a number of ion channels in each neuron, that maximizes the energy efficiency. The energy efficiency also depends on the characteristics of the input signals, e.g., the pulse strength and the interpulse intervals. These results are confirmed by computer simulations of the stochastic Hodgkin-Huxley model with a detailed description of ion channel random gating. We argue that the tradeoff between signal transmission reliability and energy cost may influence the size of neural systems when energy use is constrained.
Affiliation(s)
- Lianchun Yu
- Institute of Theoretical Physics, Lanzhou University, Lanzhou 730000, China and Key Laboratory for Magnetism and Magnetic Materials of the Ministry of Education, Lanzhou University, Lanzhou 730000, China
- Liwei Liu
- College of Electrical Engineering, Northwest University for Nationalities, Lanzhou 730070, China
33
Sengupta B, Laughlin SB, Niven JE. Consequences of converting graded to action potentials upon neural information coding and energy efficiency. PLoS Comput Biol 2014; 10:e1003439. [PMID: 24465197 PMCID: PMC3900385 DOI: 10.1371/journal.pcbi.1003439] [Citation(s) in RCA: 36] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2013] [Accepted: 12/02/2013] [Indexed: 11/18/2022] Open
Abstract
Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na(+) and K(+) channels, with generator potential and graded potential models lacking voltage-gated Na(+) channels. We identify three causes of information loss in the generator potential that are the by-product of action potential generation: (1) the voltage-gated Na(+) channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce information rates by ∼50% in generator potentials, to ∼3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of the lower information rates of generator potentials they are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital: information loss and cost inflation.
Affiliation(s)
- Biswa Sengupta
- Wellcome Trust Centre for Neuroimaging, University College London, London, United Kingdom
- Centre for Neuroscience, Indian Institute of Science, Bangalore, India
- Jeremy Edward Niven
- School of Life Sciences and Centre for Computational Neuroscience and Robotics, University of Sussex, Falmer, Brighton, United Kingdom
34
A theory for how sensorimotor skills are learned and retained in noisy and nonstationary neural circuits. Proc Natl Acad Sci U S A 2013; 110:E5078-87. [PMID: 24324147 DOI: 10.1073/pnas.1320116110] [Citation(s) in RCA: 50] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/28/2023] Open
Abstract
During the process of skill learning, synaptic connections in our brains are modified to form motor memories of learned sensorimotor acts. The more plastic the adult brain is, the easier it is to learn new skills or adapt to neurological injury. However, if the brain is too plastic and the pattern of synaptic connectivity is constantly changing, new memories will overwrite old memories, and learning becomes unstable. This trade-off is known as the stability-plasticity dilemma. Here a theory of sensorimotor learning and memory is developed whereby synaptic strengths are perpetually fluctuating without causing instability in motor memory recall, as long as the underlying neural networks are sufficiently noisy and massively redundant. The theory implies two distinct stages of learning, preasymptotic and postasymptotic, because once the error drops to a level comparable to that of the noise-induced error, further error reduction requires altered network dynamics. A key behavioral prediction derived from this analysis is tested in a visuomotor adaptation experiment, and the resultant learning curves are modeled with a nonstationary neural network. Next, the theory is used to model two-photon microscopy data that show, in animals, high rates of dendritic spine turnover, even in the absence of overt behavioral learning. Finally, the theory predicts enhanced task selectivity in the responses of individual motor cortical neurons as the level of task expertise increases. From these considerations, a unique interpretation of sensorimotor memory is proposed: memories are defined not by fixed patterns of synaptic weights but, rather, by nonstationary synaptic patterns that fluctuate coherently.
35
Abstract
Bursts of dendritic calcium spikes play an important role in excitability and synaptic plasticity in many types of neurons. In single Purkinje cells, spontaneous and synaptically evoked dendritic calcium bursts come in a variety of shapes with a variable number of spikes. The mechanisms causing this variability have never been investigated thoroughly. In this study, a detailed computational model using novel simulation routines is applied to identify the roles that stochastic ion channels, spatial arrangements of ion channels, and stochastic intracellular calcium play in producing calcium burst variability. Consistent with experimental recordings from rats, strong variability in the burst shape is observed in simulations. This variability persists in large model sizes in contrast to models containing only voltage-gated channels, where variability reduces quickly with increase of system size. Phase plane analysis of Hodgkin-Huxley spikes and of calcium bursts identifies fluctuation in phase space around probabilistic phase boundaries as the mechanism determining the dependence of variability on model size. Stochastic calcium dynamics are the main cause of calcium burst fluctuations, specifically the calcium activation of mslo/BK-type and SK2 channels. Local variability of calcium concentration has a significant effect at larger model sizes. Simulations of both spontaneous and synaptically evoked calcium bursts in a reconstructed dendrite show, in addition, strong spatial and temporal variability of voltage and calcium, depending on morphological properties of the dendrite. Our findings suggest that stochastic intracellular calcium mechanisms play a crucial role in dendritic calcium spike generation and are therefore an essential consideration in studies of neuronal excitability and plasticity.
36
Paffi A, Apollonio F, d'Inzeo G, Liberti M. Stochastic resonance induced by exogenous noise in a model of a neuronal network. NETWORK (BRISTOL, ENGLAND) 2013; 24:99-113. [PMID: 23654221 DOI: 10.3109/0954898x.2013.793849] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
This study investigates the possibility of using exogenous noise to restore the processing performance of neuronal systems in which the endogenous noise is reduced due to ageing or degenerative diseases. This idea is based on the assumption, supported by theoretical studies, that endogenous noise plays a positive role in neuronal signal detection and that its reduction impairs system function. Results, obtained on a two-layer feedforward network, show the onset of Stochastic Resonance (SR) behavior, as long as the exogenous noise is properly tailored and filtered. The amount of noise to be supplied from outside to optimize system performance depends on the residual level of endogenous noise, indicating that both kinds of noise cooperate in signal detection. These results support potentially new bioengineering applications in which exogenous noise is supplied to enhance signal detectability.
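The stochastic resonance behavior reported here can be demonstrated with a minimal threshold-detector model rather than the two-layer network used in the paper; all names and values below are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def detection_corr(noise_sd, n=20000, threshold=1.0, amp=0.6):
    """Correlation between a subthreshold sinusoid and the binary
    output of a simple threshold detector, at a given noise level."""
    t = np.arange(n)
    signal = amp * np.sin(2 * np.pi * t / 100.0)   # amp < threshold: never fires alone
    fired = (signal + rng.normal(0.0, noise_sd, n)) > threshold
    return np.corrcoef(signal, fired.astype(float))[0, 1]

# Too little noise: almost no threshold crossings; too much noise:
# crossings become signal-independent. An intermediate level maximizes
# the signal-output correlation -- the stochastic resonance peak.
corr_low = detection_corr(0.2)
corr_mid = detection_corr(0.4)
corr_high = detection_corr(5.0)
```

The non-monotonic dependence of `corr_*` on noise amplitude is the SR signature; in the paper's setting the optimal amount of exogenous noise additionally depends on how much endogenous noise remains.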
Affiliation(s)
- Alessandra Paffi
- Sapienza University of Rome, Department of Information Engineering, Electronics and Telecommunication, Via Eudossiana 18, 0184 Rome, Italy.
37
Comparison of models for IP3 receptor kinetics using stochastic simulations. PLoS One 2013; 8:e59618. [PMID: 23630568 PMCID: PMC3629942 DOI: 10.1371/journal.pone.0059618] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2012] [Accepted: 02/15/2013] [Indexed: 12/07/2022] Open
Abstract
Inositol 1,4,5-trisphosphate receptor (IP3R) is a ubiquitous intracellular calcium (Ca2+) channel which has a major role in controlling Ca2+ levels in neurons. A variety of computational models have been developed to describe the kinetic function of IP3R under different conditions. In the field of computational neuroscience, it is of great interest to apply the existing models of IP3R when modeling local Ca2+ transients in dendrites or overall Ca2+ dynamics in large neuronal models. The goal of this study was to evaluate existing IP3R models based on electrophysiological data, in order to suggest suitable models for neuronal modeling. Altogether four models (Othmer and Tang, 1993; Dawson et al., 2003; Fraiman and Dawson, 2004; Doi et al., 2005) were selected for a more detailed comparison. The selection was based on the computational efficiency of the models and the type of experimental data that was used in developing each model. The kinetics of all four models were simulated stochastically using the simulation software STEPS, which implements the Gillespie stochastic simulation algorithm. The results show major differences in the statistical properties of model functionality. Of the four compared models, the one by Fraiman and Dawson (2004) proved most satisfactory in reproducing the specific features of experimental findings reported in the literature. To our knowledge, the present study is the first detailed evaluation of IP3R models using stochastic simulation methods, thus providing an important setting for constructing a new, realistic model of IP3R channel kinetics for compartmental modeling of neuronal functions. We conclude that the kinetics of IP3R at different concentrations of Ca2+ and IP3 should be more carefully addressed when new models for IP3R are developed.
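The Gillespie stochastic simulation algorithm that STEPS implements can be illustrated on a deliberately simple two-state channel; the IP3R schemes compared in the paper have many more states, and the rate constants below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def gillespie_two_state(k_open=2.0, k_close=1.0, n_channels=100, t_end=50.0):
    """Direct-method Gillespie simulation of a population of two-state
    (closed <-> open) channels; a toy stand-in for multi-state IP3R
    kinetic schemes."""
    n_open, t = 0, 0.0
    times, opens = [0.0], [0]
    while t < t_end:
        a_open = k_open * (n_channels - n_open)   # propensity C -> O
        a_close = k_close * n_open                # propensity O -> C
        a_total = a_open + a_close
        t += rng.exponential(1.0 / a_total)       # exponential waiting time
        if rng.random() * a_total < a_open:       # pick the next reaction
            n_open += 1
        else:
            n_open -= 1
        times.append(t)
        opens.append(n_open)
    return np.array(times), np.array(opens)

times, opens = gillespie_two_state()
# At equilibrium the open count fluctuates around
# n_channels * k_open / (k_open + k_close) = 100 * 2/3.
```

Each step draws an exponential waiting time from the total propensity and then selects one reaction with probability proportional to its propensity, which is exactly the exact-sampling property that makes the direct method a reference for the model comparisons above.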
38
Serletis D, Carlen PL, Valiante TA, Bardakjian BL. Phase synchronization of neuronal noise in mouse hippocampal epileptiform dynamics. Int J Neural Syst 2012; 23:1250033. [DOI: 10.1142/s0129065712500335] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
Organized brain activity is the result of dynamical, segregated neuronal signals that may be used to investigate synchronization effects using sophisticated neuroengineering techniques. Phase synchrony analysis, in particular, has emerged as a promising methodology to study transient and frequency-specific coupling effects across multi-site signals. In this study, we investigated phase synchronization in intracellular recordings of interictal and ictal epileptiform events recorded from pairs of cells in the whole (intact) mouse hippocampus. In particular, we focused our analysis on the background noise-like activity (NLA), previously reported to exhibit complex neurodynamical properties. Our results show evidence for increased linear and nonlinear phase coupling in NLA across three frequency bands [theta (4–10 Hz), beta (12–30 Hz) and gamma (30–80 Hz)] in the ictal compared to interictal state dynamics. We also present qualitative and statistical evidence for increased phase synchronization in the theta, beta and gamma frequency bands from paired recordings of ictal NLA. Overall, our results validate the use of background NLA in the neurodynamical study of epileptiform transitions and suggest that what is considered "neuronal noise" is amenable to synchronization effects in the spatiotemporal domain.
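Phase synchrony analysis of this kind typically extracts instantaneous phases from the analytic signal and summarizes coupling as a phase-locking value (PLV). Below is a minimal sketch with synthetic signals; it is not the authors' pipeline (which also band-pass filters into theta, beta and gamma bands), and all signal names are mine.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (the standard Hilbert construction)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0       # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0           # keep the Nyquist bin
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV: magnitude of the mean phase-difference vector.
    1 = perfectly locked phases, ~0 = independent phases."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return abs(np.mean(np.exp(1j * dphi)))

rng = np.random.default_rng(3)
t = np.arange(4096) / 1000.0                       # 1 kHz sampling
x = np.sin(2 * np.pi * 6.0 * t) + 0.3 * rng.standard_normal(len(t))
y = np.cos(2 * np.pi * 6.0 * t) + 0.3 * rng.standard_normal(len(t))  # shared rhythm, 90 deg shift
z = rng.standard_normal(len(t))                    # unrelated noise

plv_shared = phase_locking_value(x, y)
plv_unrelated = phase_locking_value(x, z)
```

Note that PLV rewards a *consistent* phase difference, not a zero one, which is why `x` and its 90-degree-shifted copy still lock strongly.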
Affiliation(s)
- Demitre Serletis
- Neurological Institute, Epilepsy Center, Cleveland Clinic, Ohio 44195, USA
- Peter L. Carlen
- Division of Neurology, Toronto Western Hospital, Ontario M5T 2S8, Canada
- Department of Physiology, University of Toronto, Ontario M5S 1A8, Canada
- Institute of Biomaterials and Biomedical Engineering, University of Toronto, Ontario M5S 3G9, Canada
- Taufik A. Valiante
- Division of Neurosurgery, Toronto Western Hospital, Ontario M5T 2S8, Canada
- Berj L. Bardakjian
- Institute of Biomaterials and Biomedical Engineering, University of Toronto, Ontario M5S 3G9, Canada
39
Orio P, Soudry D. Simple, fast and accurate implementation of the diffusion approximation algorithm for stochastic ion channels with multiple states. PLoS One 2012; 7:e36670. [PMID: 22629320 PMCID: PMC3358312 DOI: 10.1371/journal.pone.0036670] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2011] [Accepted: 04/11/2012] [Indexed: 11/18/2022] Open
Abstract
BACKGROUND The phenomena that emerge from the interaction of the stochastic opening and closing of ion channels (channel noise) with the non-linear neural dynamics are essential to our understanding of the operation of the nervous system. The effects that channel noise can have on neural dynamics are generally studied using numerical simulations of stochastic models. Algorithms based on discrete Markov Chains (MC) seem to be the most reliable and trustworthy, but even optimized algorithms come with a non-negligible computational cost. Diffusion Approximation (DA) methods use Stochastic Differential Equations (SDE) to approximate the behavior of a number of MCs, considerably speeding up simulation times. However, model comparisons have suggested that DA methods did not lead to the same results as MC modeling in terms of channel noise statistics and effects on excitability. Recently, it was shown that the difference arose because MCs were modeled with coupled gating particles, while the DA was modeled using uncoupled gating particles. Implementations of DA with coupled particles, in the context of a specific kinetic scheme, yielded similar results to MC. However, it remained unclear how to generalize these implementations to different kinetic schemes, or whether they were faster than MC algorithms. Additionally, a steady state approximation was used for the stochastic terms, which, as we show here, can introduce significant inaccuracies. MAIN CONTRIBUTIONS We derived the SDE explicitly for any given ion channel kinetic scheme. The resulting generic equations were surprisingly simple and interpretable, allowing an easy, transparent and efficient DA implementation, avoiding unnecessary approximations. The algorithm was tested in a voltage clamp simulation and in two different current clamp simulations, yielding the same results as MC modeling.
The DA method was also considerably more efficient than MC methods, except when short time steps or low channel numbers were used.
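The diffusion-approximation idea, reduced to its simplest instance, replaces the Markov chain by a chemical-Langevin SDE whose noise term scales as 1/sqrt(N). The sketch below uses a single two-state conductance, not the general multi-state scheme derived in the paper, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_open_fraction(alpha=2.0, beta=1.0, n_channels=1000,
                           dt=1e-3, n_steps=50000):
    """Euler-Maruyama integration of the chemical-Langevin equation for
    the open fraction f of a two-state channel population:
        df = [alpha(1-f) - beta f] dt
             + sqrt([alpha(1-f) + beta f] / N) dW
    (a minimal sketch of the DA idea, not the paper's scheme)."""
    noise = rng.standard_normal(n_steps)
    f = np.empty(n_steps)
    f[0] = alpha / (alpha + beta)                 # start at equilibrium
    for i in range(1, n_steps):
        drift = alpha * (1.0 - f[i - 1]) - beta * f[i - 1]
        var = (alpha * (1.0 - f[i - 1]) + beta * f[i - 1]) / n_channels
        f[i] = f[i - 1] + drift * dt + np.sqrt(var * dt) * noise[i]
        f[i] = min(1.0, max(0.0, f[i]))           # keep the fraction in [0, 1]
    return f

f = simulate_open_fraction()
# The stationary mean sits near alpha/(alpha+beta) = 2/3, and the
# fluctuations shrink as n_channels grows (deterministic limit).
```

Note how the noise variance is state-dependent rather than fixed at a steady-state value; freezing it is exactly the kind of approximation the abstract warns can introduce inaccuracies.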
Affiliation(s)
- Patricio Orio
- Centro Interdisciplinario de Neurociencia de Valparaíso, Facultad de Ciencias, Universidad de Valparaíso, Valparaíso, Chile.
40
Soudry D, Meir R. Conductance-based neuron models and the slow dynamics of excitability. Front Comput Neurosci 2012; 6:4. [PMID: 22355288 PMCID: PMC3280430 DOI: 10.3389/fncom.2012.00004] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2011] [Accepted: 01/11/2012] [Indexed: 12/03/2022] Open
Abstract
In recent experiments, synaptically isolated neurons from rat cortical culture were stimulated with periodic extracellular fixed-amplitude current pulses for extended durations of days. The neuron’s response depended on its own history, as well as on the history of the input, and was classified into several modes. Interestingly, in one of the modes the neuron behaved intermittently, exhibiting irregular firing patterns changing in a complex and variable manner over the entire range of experimental timescales, from seconds to days. With the aim of developing a minimal biophysical explanation for these results, we propose a general scheme that, given a few assumptions (mainly, a timescale separation in kinetics), closely describes the response of deterministic conductance-based neuron models under pulse stimulation, using a discrete time piecewise linear mapping, which is amenable to detailed mathematical analysis. Using this method we reproduce the basic modes exhibited by the neuron experimentally, as well as the mean response in each mode. Specifically, we derive precise closed-form input-output expressions for the transient timescale and firing rates, which are expressed in terms of experimentally measurable variables, and conform with the experimental results. However, the mathematical analysis shows that the resulting firing patterns in these deterministic models are always regular and repeatable (i.e., no chaos), in contrast to the irregular and variable behavior displayed by the neuron in certain regimes. This fact, and the sensitive near-threshold dynamics of the model, indicate that intrinsic ion channel noise has a significant impact on the neuronal response, and may help reproduce the experimentally observed variability, as we also demonstrate numerically. In a companion paper, we extend our analysis to stochastic conductance-based models, and show how these can be used to reproduce the details of the observed irregular and variable neuronal response.
Affiliation(s)
- Daniel Soudry
- Department of Electrical Engineering, The Laboratory for Network Biology Research, Technion, Haifa, Israel
41
Raja Beharelle A, Kovačević N, McIntosh AR, Levine B. Brain signal variability relates to stability of behavior after recovery from diffuse brain injury. Neuroimage 2012; 60:1528-37. [PMID: 22261371 DOI: 10.1016/j.neuroimage.2012.01.037] [Citation(s) in RCA: 52] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2011] [Revised: 12/19/2011] [Accepted: 01/03/2012] [Indexed: 01/21/2023] Open
Abstract
Variability or noise is an unmistakable feature of neural signals; however, such fluctuations have been regarded as not carrying meaningful information or as detrimental to neural processes. Recent empirical and computational work has shown that neural systems with a greater capacity for information processing are able to explore a more varied dynamic repertoire, and the hallmark of this is increased irregularity or variability in the neural signal. How this variability in neural dynamics affects behavior remains unclear. Here, we investigated the role of variability of magnetoencephalography signals in supporting healthy cognitive functioning, measured by performance on an attention task, in healthy adults and in patients with traumatic brain injury. As an index of variability, we calculated multiscale entropy, which quantifies the temporal predictability of a time series across progressively more coarse time scales. We found lower variability in traumatic brain injury patients compared to controls, arguing against the idea that greater variability reflects dysfunctional neural processing. Furthermore, higher brain signal variability indicated improved behavioral performance for all participants. This relationship was statistically stronger for people with brain injury, demonstrating that those with higher brain signal variability were also those who had recovered the most cognitive ability. Rather than impede neural processing, cortical signal variability within an optimal range enables the exploration of diverse functional configurations, and may therefore play a vital role in healthy brain function.
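The multiscale entropy measure used here coarse-grains the signal and computes sample entropy at each scale, with the tolerance r fixed from the original series, in the style of Costa et al.'s procedure. The sketch below makes simplifying choices (window counting conventions, parameter values) that are mine, not the study's.

```python
import numpy as np

def sample_entropy(x, r, m=2):
    """SampEn(m, r): -log of the conditional probability that runs
    matching for m points (Chebyshev distance < r) also match for m+1."""
    def match_count(mm):
        w = np.lib.stride_tricks.sliding_window_view(x, mm)
        c = 0
        for i in range(len(w) - 1):   # count template pairs within tolerance
            c += int(np.sum(np.max(np.abs(w[i + 1:] - w[i]), axis=1) < r))
        return c
    return -np.log(match_count(m + 1) / match_count(m))

def multiscale_entropy(x, scales=(1, 2, 5), r_factor=0.15):
    """Costa-style MSE: fix r from the original series, then compute
    SampEn on non-overlapping coarse-grained averages at each scale."""
    r = r_factor * np.std(x)
    out = []
    for s in scales:
        n = (len(x) // s) * s
        out.append(sample_entropy(x[:n].reshape(-1, s).mean(axis=1), r))
    return out

rng = np.random.default_rng(5)
mse_white = multiscale_entropy(rng.standard_normal(3000))
# White noise loses entropy under coarse-graining, whereas long-range
# correlated ("complex") signals hold their entropy across scales --
# the contrast MSE exploits to separate complexity from mere randomness.
```

Keeping r fixed across scales is what produces the decreasing profile for uncorrelated noise; recomputing r per scale would hide it.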
42
Abstract
The processing speed of the brain depends on the ability of neurons to rapidly relay input changes. Previous theoretical and experimental studies of the timescale of population firing rate responses arrived at controversial conclusions, some advocating an ultrafast response scale but others arguing for an inherent disadvantage of mean encoded signals for rapid detection of the stimulus onset. Here we assessed the timescale of population firing rate responses of neocortical neurons in experiments performed in the time domain and the frequency domain in vitro and in vivo. We show that populations of neocortical neurons can alter their firing rate within 1 ms in response to somatically delivered weak current signals presented on a fluctuating background. Signals with amplitudes of miniature postsynaptic currents can be robustly and rapidly detected in the population firing. We further show that population firing rate of neurons of rat visual cortex in vitro and cat visual cortex in vivo can reliably encode weak signals varying at frequencies up to ∼200-300 Hz, or ∼50 times faster than the firing rate of individual neurons. These results provide coherent evidence for the ultrafast, millisecond timescale of cortical population responses. Notably, fast responses to weak stimuli are limited to the mean encoding. Rapid detection of current variance changes requires extraordinarily large signal amplitudes. Our study presents conclusive evidence showing that cortical neurons are capable of rapidly relaying subtle mean current signals. This provides a vital mechanism for the propagation of rate-coded information within and across brain areas.
43
Reduction of stochastic conductance-based neuron models with time-scales separation. J Comput Neurosci 2011; 32:327-46. [DOI: 10.1007/s10827-011-0355-7] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2011] [Revised: 06/13/2011] [Accepted: 07/19/2011] [Indexed: 10/17/2022]
44
Accurate and fast simulation of channel noise in conductance-based model neurons by diffusion approximation. PLoS Comput Biol 2011; 7:e1001102. [PMID: 21423712 PMCID: PMC3053314 DOI: 10.1371/journal.pcbi.1001102] [Citation(s) in RCA: 66] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2010] [Accepted: 01/28/2011] [Indexed: 11/19/2022] Open
Abstract
Stochastic channel gating is the major source of intrinsic neuronal noise whose functional consequences at the microcircuit- and network-levels have been only partly explored. A systematic study of this channel noise in large ensembles of biophysically detailed model neurons calls for the availability of fast numerical methods. In fact, exact techniques employ the microscopic simulation of the random opening and closing of individual ion channels, usually based on Markov models, whose computational loads are prohibitive for next generation massive computer models of the brain. In this work, we operatively define a procedure for translating any Markov model describing voltage- or ligand-gated membrane ion-conductances into an effective stochastic version, whose computer simulation is efficient, without compromising accuracy. Our approximation is based on an improved Langevin-like approach, which employs stochastic differential equations and no Monte Carlo methods. As opposed to an earlier proposal recently debated in the literature, our approximation reproduces accurately the statistical properties of the exact microscopic simulations, under a variety of conditions, from spontaneous to evoked response features. In addition, our method is not restricted to the Hodgkin-Huxley sodium and potassium currents and is general for a variety of voltage- and ligand-gated ion currents. As a by-product, the analysis of the properties emerging in exact Markov schemes by standard probability calculus enables us for the first time to analytically identify the sources of inaccuracy of the previous proposal, while providing solid ground for the modification and improvement we present here. A possible approach to understanding the neuronal bases of the computational properties of the nervous system consists of modelling its basic building blocks, neurons and synapses, and then simulating their collective activity emerging in large networks.
In developing such models, a satisfactory description level must be chosen as a compromise between simplicity and faithfulness in reproducing experimental data. Deterministic neuron models – i.e., models that upon repeated simulation with fixed parameter values provide the same results – are usually made up of ordinary differential equations and allow for relatively fast simulation times. By contrast, they do not describe accurately the underlying stochastic response properties arising from the microscopical correlate of neuronal excitability. Stochastic models are usually based on mathematical descriptions of individual ion channels, or on an effective macroscopic account of their random opening and closing. In this contribution we describe a general method to transform any deterministic neuron model into its effective stochastic version that accurately replicates the statistical properties of the random kinetics of ion channels.
|
45
|
Complexity in neuronal noise depends on network interconnectivity. Ann Biomed Eng 2011; 39:1768-78. [PMID: 21347547 DOI: 10.1007/s10439-011-0281-x] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2010] [Accepted: 02/13/2011] [Indexed: 12/31/2022]
Abstract
"Noise," or noise-like activity (NLA), defines background electrical membrane potential fluctuations at the cellular level of the nervous system, comprising an important aspect of brain dynamics. Using whole-cell voltage recordings from fast-spiking stratum oriens interneurons and stratum pyramidale neurons located in the CA3 region of the intact mouse hippocampus, we applied complexity measures from dynamical systems theory (i.e., 1/f(γ) noise and correlation dimension) and found evidence for complexity in neuronal NLA, ranging from high- to low-complexity dynamics. Importantly, these high- and low-complexity signal features were largely dependent on gap junction and chemical synaptic transmission. Progressive neuronal isolation from the surrounding local network via gap junction blockade (abolishing gap junction-dependent spikelets) and then chemical synaptic blockade (abolishing excitatory and inhibitory post-synaptic potentials), or the reverse order of these treatments, resulted in emergence of high-complexity NLA dynamics. Restoring local network interconnectivity via blockade washout resulted in resolution to low-complexity behavior. These results suggest that the observed increase in background NLA complexity is the result of reduced network interconnectivity, thereby highlighting the potential importance of the NLA signal to the study of network state transitions arising in normal and abnormal brain dynamics (such as in epilepsy, for example).
|
46
|
Lin RJ, Jaeger D. Using computer simulations to determine the limitations of dynamic clamp stimuli applied at the soma in mimicking distributed conductance sources. J Neurophysiol 2011; 105:2610-24. [PMID: 21325676 DOI: 10.1152/jn.00968.2010] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
In previous studies we used the technique of dynamic clamp to study how temporal modulation of inhibitory and excitatory inputs control the frequency and precise timing of spikes in neurons of the deep cerebellar nuclei (DCN). Although this technique is now widely used, it is limited to interpreting conductance inputs as being location independent; i.e., all inputs that are biologically distributed across the dendritic tree are applied to the soma. We used computer simulations of a morphologically realistic model of DCN neurons to compare the effects of purely somatic vs. distributed dendritic inputs in this cell type. We applied the same conductance stimuli used in our published experiments to the model. To simulate variability in neuronal responses to repeated stimuli, we added a somatic white current noise to reproduce subthreshold fluctuations in the membrane potential. We were able to replicate our dynamic clamp results with respect to spike rates and spike precision for different patterns of background synaptic activity. We found only minor differences in the spike pattern generation between focal or distributed input in this cell type even when strong inhibitory or excitatory bursts were applied. However, the location dependence of dynamic clamp stimuli is likely to be different for each cell type examined, and the simulation approach developed in the present study will allow a careful assessment of location dependence in all cell types.
Affiliation(s)
- Risa J Lin
- Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA 30322, USA
|
47
|
Nesse WH, Clark GA. Relative spike timing in stochastic oscillator networks of the Hermissenda eye. BIOLOGICAL CYBERNETICS 2010; 102:389-412. [PMID: 20237937 DOI: 10.1007/s00422-010-0374-x] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/06/2009] [Accepted: 02/18/2010] [Indexed: 05/28/2023]
Abstract
The role of relative spike timing on sensory coding and stochastic dynamics of small pulse-coupled oscillator networks is investigated physiologically and mathematically, based on the small biological eye network of the marine invertebrate Hermissenda. Without network interactions, the five inhibitory photoreceptors of the eye network exhibit quasi-regular rhythmic spiking; in contrast, within the active network, they display more irregular spiking but collective network rhythmicity. We investigate the source of this emergent network behavior, first by analyzing the role of relative input to spike-timing relationships in individual cells. We use a stochastic phase oscillator equation to model photoreceptor spike sequences in response to sequences of inhibitory current pulses. Although spike sequences can be complex and irregular in response to inputs, we show that spike timing is better predicted if the relative timing of spikes to inputs is accounted for in the model. Further, we establish that greater noise levels in the model serve to destroy network phase-locked states that induce non-monotonic stimulus rate-coding, as predicted in Butson and Clark (J Neurophysiol 99:146-154, 2008a; J Neurophysiol 99:155-165, 2008b). Hence, rate-coding can function better in noisy spiking cells relative to non-noisy cells. We then study how the relative input to spike-timing dynamics of single oscillators contributes to network-level dynamics. Relative timing interactions in the network sharpen the stimulus window that can trigger a spike, affecting stimulus encoding. Also, we derive analytical inter-spike interval distributions of cells in the model network, revealing that irregular Poisson-like spike emission and collective network rhythmicity are emergent properties of network dynamics, consistent with experimental observations. Our theoretical results generate experimental predictions about the nature of spike patterns in the Hermissenda eye.
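A stochastic phase oscillator driven by inhibitory pulses, of the general kind this abstract describes, can be sketched as follows. The constant phase-response offset, the noise level, and all parameter values here are illustrative assumptions, not the published Hermissenda model: the phase advances at rate ω, is jittered by white noise, each inhibitory pulse pushes the phase back, and a spike is emitted whenever the phase completes a cycle.

```python
import numpy as np

def run_oscillator(omega=2 * np.pi, noise=0.2, input_times=(), delay=1.0,
                   dt=1e-3, t_end=10.0, seed=0):
    """Stochastic phase oscillator: d(theta) = omega dt + noise dW.
    Each inhibitory pulse subtracts a fixed phase offset (a crude
    phase-response kernel); a spike is emitted when theta crosses 2*pi."""
    rng = np.random.default_rng(seed)
    inputs = sorted(input_times)
    theta, t, spikes, k = 0.0, 0.0, [], 0
    while t < t_end:
        theta += omega * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if k < len(inputs) and t >= inputs[k]:   # inhibitory pulse arrives
            theta = max(theta - delay, 0.0)      # push the phase back
            k += 1
        if theta >= 2 * np.pi:                   # spike and phase reset
            spikes.append(t)
            theta -= 2 * np.pi
    return spikes
```

With ω = 2π the free-running cell fires roughly once per second; adding a regular train of inhibitory pulses delays spikes and lowers the count, the single-cell input-to-spike-timing relationship the study builds its network analysis on.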
Affiliation(s)
- William H Nesse
- Department of Cellular and Molecular Medicine, University of Ottawa, 451 Smyth Road, Ottawa, ON, K1H 8M5, Canada.
|
48
|
Network-state modulation of power-law frequency-scaling in visual cortical neurons. PLoS Comput Biol 2009; 5:e1000519. [PMID: 19779556 PMCID: PMC2740863 DOI: 10.1371/journal.pcbi.1000519] [Citation(s) in RCA: 59] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2009] [Accepted: 08/25/2009] [Indexed: 11/19/2022] Open
Abstract
Various types of neural-based signals, such as EEG, local field potentials and intracellular synaptic potentials, integrate multiple sources of activity distributed across large assemblies. They have in common a power-law frequency-scaling structure at high frequencies, but it is still unclear whether this scaling property is dominated by intrinsic neuronal properties or by network activity. The latter case is particularly interesting because if frequency-scaling reflects the network state it could be used to characterize the functional impact of the connectivity. In intracellularly recorded neurons of cat primary visual cortex in vivo, the power spectral density of Vm activity displays a power-law structure at high frequencies with a fractional scaling exponent. We show that this exponent is not constant, but depends on the visual statistics used to drive the network. To investigate the determinants of this frequency-scaling, we considered a generic recurrent model of cortex receiving a retinotopically organized external input. Similarly to the in vivo case, our in computo simulations show that the scaling exponent reflects the correlation level imposed in the input. This systematic dependence was also replicated at the single cell level, by controlling independently, in a parametric way, the strength and the temporal decay of the pairwise correlation between presynaptic inputs. This last model was implemented in vitro by imposing the correlation control in artificial presynaptic spike trains through dynamic-clamp techniques. These in vitro manipulations induced a modulation of the scaling exponent, similar to that observed in vivo and predicted in computo. We conclude that the frequency-scaling exponent of the Vm reflects stimulus-driven correlations in the cortical network activity. 
Therefore, we propose that the scaling exponent could be used to read out the “effective” connectivity responsible for the dynamical signature of the population signals measured at different integration levels, from Vm to LFP, EEG and fMRI.
Intracellular recording of neocortical neurons provides an opportunity to characterize the statistical signature of the synaptic bombardment to which a cell is subjected. Indeed, the membrane potential displays intense fluctuations which reflect the cumulative activity of thousands of input neurons. In sensory cortical areas, this measure could be used to estimate the correlational structure of the external drive. We show that changes in the statistical properties of network activity, namely the local correlation between neurons, can be detected by analyzing the power spectral density (PSD) of the subthreshold membrane potential. These PSDs can be fitted by a power-law function 1/f^α in the upper temporal frequency range. In vivo recordings in primary visual cortex show that the α exponent varies with the statistics of the sensory input. Most remarkably, the exponent observed in the ongoing activity is indistinguishable from that evoked by natural visual statistics. These results are emulated by models which demonstrate that the exponent α is determined by the local level of correlation imposed in the recurrent network activity. Similar relationships are also reproduced in cortical neurons recorded in vitro with artificial synaptic inputs by controlling in computo the level of correlation in real time.
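Controlling the pairwise correlation of artificial presynaptic spike trains, as in the dynamic-clamp manipulations above, is commonly implemented by thinning a shared "mother" Poisson train. This is a generic sketch of that construction, not the authors' code; the rate, correlation value, and bin width are assumptions. Each child train keeps every mother spike independently with probability c, which yields child trains of the target rate with pairwise spike-count correlation close to c.

```python
import numpy as np

def correlated_trains(rate=10.0, c=0.5, n_trains=2, t_end=100.0, seed=0):
    """Poisson spike trains with pairwise count correlation ~ c:
    draw a mother Poisson train at rate/c, then let each child train
    keep every mother spike independently with probability c."""
    rng = np.random.default_rng(seed)
    n_mother = rng.poisson(rate / c * t_end)
    mother = np.sort(rng.uniform(0.0, t_end, n_mother))
    return [mother[rng.random(n_mother) < c] for _ in range(n_trains)]

# Empirical pairwise correlation of spike counts in 100 ms bins
a, b = correlated_trains()
bins = np.arange(0.0, 100.0 + 0.1, 0.1)
ca, cb = np.histogram(a, bins)[0], np.histogram(b, bins)[0]
rho = np.corrcoef(ca, cb)[0, 1]
```

Sweeping c parametrically, as the in vitro experiments did with their correlation control, lets one probe how the imposed input correlation shapes the postsynaptic Vm power spectrum.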
|
49
|
Numerical exploration of the influence of neural noise on the psychometric function at low stimulation intensity levels. J Biosci 2008; 33:743-753. [DOI: 10.1007/s12038-008-0094-8] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
|
50
|
Massobrio P, Martinoia S. Modelling small-patterned neuronal networks coupled to microelectrode arrays. J Neural Eng 2008; 5:350-9. [PMID: 18756034 DOI: 10.1088/1741-2560/5/3/008] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Cultured neurons coupled to planar substrates which exhibit 'well-defined' two-dimensional network architectures can provide valuable insights into cell-to-cell communication, network dynamics versus topology, and basic mechanisms of synaptic plasticity and learning. Several approaches have been presented in the literature to direct neuronal growth, such as surface modification by silane chemistry, photolithographic techniques, microcontact printing, microfluidic channel flow patterning, microdrop patterning, etc. This work presents a computational model suited to reproducing and explaining the dynamics exhibited by small-patterned neuronal networks coupled to microelectrode arrays (MEAs). The model is based on the concept of the meta-neuron, i.e., a small spatially confined number of actual neurons which perform single macroscopic functions. Each meta-neuron is characterized by a detailed morphology, and the membrane channels are modelled by simple Hodgkin-Huxley and passive kinetics. The two main findings that emerge from the simulations can be summarized as follows: (i) the increasing complexity of meta-neuron morphology reflects the variations of the network dynamics as a function of network development; (ii) the dynamics displayed by the patterned neuronal networks considered can be explained by hypothesizing the presence of several short- and a few long-distance interactions among small assemblies of neurons (i.e., meta-neurons).
Affiliation(s)
- Paolo Massobrio
- Neuroengineering and Bio-Nano Technology Group, Department of Biophysical and Electronic Engineering (DIBE), University of Genova, Via Opera Pia 11a, 16145 Genova, Italy.
|