1
Haggard M, Chacron MJ. Nonresponsive Neurons Improve Population Coding of Object Location. J Neurosci 2025; 45:e1068242024. PMID: 39542727; PMCID: PMC11735655; DOI: 10.1523/jneurosci.1068-24.2024.
Abstract
Understanding how heterogeneous neural populations represent sensory input to give rise to behavior remains a central problem in systems neuroscience. Here we investigated how midbrain neurons within the electrosensory system of Apteronotus leptorhynchus code for object location in space. In vivo simultaneous recordings were achieved via Neuropixels probes, high-density electrode arrays, with the stimulus positioned at different locations relative to the animal. Midbrain neurons exhibited heterogeneous response profiles, with a significant proportion (65%) seemingly nonresponsive to moving stimuli. Remarkably, we found that nonresponsive neurons increased population coding of object location through synergistic interactions with responsive neurons by effectively reducing noise. Mathematical modeling demonstrated that increased response heterogeneity together with the experimentally observed correlations was sufficient to give rise to independent encoding by responsive neurons. Furthermore, the addition of nonresponsive neurons in the model gave rise to synergistic population coding. Taken together, our findings reveal that nonresponsive neurons, which are frequently excluded from analysis, can significantly improve population coding of object location through synergistic interactions with responsive neurons. Combinations of responsive and nonresponsive neurons have been observed in sensory systems across taxa; it is likely that similar synergistic interactions improve population coding across modalities and behavioral tasks.
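A minimal sketch (Python/NumPy; not the analysis pipeline of the study above) of the general principle that an untuned ("nonresponsive") neuron can improve a population estimate when it shares noise with a tuned neuron; all parameter values and variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated object locations (the quantity to decode)
n_trials = 20000
s = rng.uniform(-1.0, 1.0, n_trials)

# A fluctuation shared by both neurons, plus private noise
shared = rng.normal(0.0, 1.0, n_trials)
r_resp = 2.0 * s + shared + rng.normal(0.0, 0.5, n_trials)   # tuned to s
r_nonresp = shared + rng.normal(0.0, 0.5, n_trials)          # untuned ("nonresponsive")

def linear_decode_mse(X, y):
    """Ordinary least-squares decoder evaluated on held-out trials."""
    n = len(y) // 2
    Xtr = np.column_stack([X[:n], np.ones(n)])
    Xte = np.column_stack([X[n:], np.ones(len(y) - n)])
    w, *_ = np.linalg.lstsq(Xtr, y[:n], rcond=None)
    return np.mean((Xte @ w - y[n:]) ** 2)

mse_resp_only = linear_decode_mse(r_resp[:, None], s)
mse_joint = linear_decode_mse(np.column_stack([r_resp, r_nonresp]), s)

print(f"decoding MSE, responsive neuron alone   : {mse_resp_only:.4f}")
print(f"decoding MSE, with nonresponsive neuron : {mse_joint:.4f}")
# The untuned neuron carries no information about s on its own, but because it
# shares noise with the tuned neuron, the joint decoder can subtract that noise,
# improving the population estimate (a synergistic, noise-cancelling effect).
```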
Affiliation(s)
- Myriah Haggard
- Quantitative Life Sciences, McGill University, Montreal, Quebec H3G 1Y6, Canada
- Maurice J Chacron
- Department of Physiology, McGill University, Montreal, Quebec H3G 1Y6, Canada
2
Haggard M, Chacron MJ. Coding of object location by heterogeneous neural populations with spatially dependent correlations in weakly electric fish. PLoS Comput Biol 2023; 19:e1010938. PMID: 36867650; PMCID: PMC10016687; DOI: 10.1371/journal.pcbi.1010938.
Abstract
Understanding how neural populations encode sensory stimuli remains a central problem in neuroscience. Here we performed multi-unit recordings from sensory neural populations in the electrosensory system of the weakly electric fish Apteronotus leptorhynchus in response to stimuli located at different positions along the rostro-caudal axis. Our results reveal that the spatial dependence of correlated activity along receptive fields can help mitigate the deleterious effects that these correlations would otherwise have if they were spatially independent. Moreover, using mathematical modeling, we show that experimentally observed heterogeneities in the receptive fields of neurons help optimize information transmission as to object location. Taken together, our results have important implications for understanding how sensory neurons whose receptive fields display antagonistic center-surround organization encode location. Important similarities between the electrosensory system and other sensory systems suggest that our results will be applicable elsewhere.
Affiliation(s)
- Myriah Haggard
- Quantitative Life Sciences, McGill University, Montreal, Canada
3
Sidhu RS, Johnson EC, Jones DL, Ratnam R. A dynamic spike threshold with correlated noise predicts observed patterns of negative interval correlations in neuronal spike trains. Biol Cybern 2022; 116:611-633. PMID: 36244004; PMCID: PMC9691502; DOI: 10.1007/s00422-022-00946-5.
Abstract
Negative correlations in the sequential evolution of interspike intervals (ISIs) are a signature of memory in neuronal spike-trains. They provide coding benefits including firing-rate stabilization, improved detectability of weak sensory signals, and enhanced transmission of information by improving signal-to-noise ratio. Primary electrosensory afferent spike-trains in weakly electric fish fall into two categories based on the pattern of ISI correlations: non-bursting units have negative correlations which remain negative but decay to zero with increasing lags (Type I ISI correlations), and bursting units have oscillatory (alternating sign) correlations which damp to zero with increasing lags (Type II ISI correlations). Here, we predict and match observed ISI correlations in these afferents using a stochastic dynamic threshold model. We determine the ISI correlation function as a function of an arbitrary discrete noise correlation function whose lag k is measured in multiples of the mean ISI. The function permits forward and inverse calculations of the correlation function. Both types of correlation functions can be generated by adding colored noise to the spike threshold, with Type I correlations generated by slow noise and Type II correlations generated by fast noise. A first-order autoregressive (AR) process with a single parameter is sufficient to predict and accurately match both types of afferent ISI correlation functions, with the type being determined by the sign of the AR parameter. The predicted and experimentally observed correlations are in geometric progression. The theory predicts that the limiting sum of ISI correlations is -1/2, yielding a perfect DC block in the power spectrum of the spike train. Observed ISI correlations from afferents have a limiting sum that is slightly larger than -1/2 (i.e., slightly closer to zero). We conclude that the underlying process for generating ISIs may be a simple combination of low-order AR and moving average processes, and we discuss the results from the perspective of optimal coding.
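A rough sketch of the ingredients described above (an integrate-to-threshold neuron whose spike threshold has a spike-triggered dynamic component plus per-interval AR(1) noise), together with a serial-correlation-coefficient estimate; parameters are illustrative and not fitted to afferent data, so the exact Type I/II patterns reported in the paper are not guaranteed to be reproduced:

```python
import numpy as np
from math import exp

rng = np.random.default_rng(1)

def simulate_isis(phi, n_spikes=3000, dt=1e-3):
    """Integrate-to-threshold neuron with a spike-triggered dynamic threshold.
    An AR(1) noise term (parameter phi) is added to the threshold once per
    interspike interval; its sign controls the noise time scale (slow vs fast)."""
    mu, theta0 = 1.0, 1.0          # constant drive and baseline threshold
    A, tau = 0.5, 0.5              # threshold jump per spike and its decay time
    sigma = 0.05                   # threshold noise strength
    isis = np.empty(n_spikes)
    h = 0.0                        # dynamic-threshold (adaptation) state
    eta = 0.0                      # AR(1) threshold noise, updated per ISI
    for k in range(n_spikes):
        eta = phi * eta + sigma * rng.normal()
        v, t = 0.0, 0.0
        while v < theta0 + h * exp(-t / tau) + eta:
            v += mu * dt
            t += dt
        h = h * exp(-t / tau) + A  # threshold state decays, then jumps at the spike
        isis[k] = t
    return isis

def scc(isis, max_lag=5):
    """Serial correlation coefficients of the ISI sequence."""
    x = isis - isis.mean()
    denom = np.dot(x, x)
    return [np.dot(x[:-j], x[j:]) / denom for j in range(1, max_lag + 1)]

for phi, label in [(0.8, "slow threshold noise (phi > 0):"),
                   (-0.8, "fast threshold noise (phi < 0):")]:
    print(label, np.round(scc(simulate_isis(phi)), 3))
```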
Affiliation(s)
- Robin S Sidhu
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Erik C Johnson
- The Johns Hopkins University Applied Physics Laboratory, Laurel, MD, USA
- Douglas L Jones
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Rama Ratnam
- Division of Biological and Life Sciences, School of Arts and Sciences, Ahmedabad University, Ahmedabad, Gujarat, India.
4
Zeltser G, Sukhanov IM, Nevorotin AJ. MMM - The molecular model of memory. J Theor Biol 2022; 549:111219. PMID: 35810778; DOI: 10.1016/j.jtbi.2022.111219.
Abstract
Identifying the mechanisms underlying neurons' ability to process information, including its acquisition, storage, and retrieval, plays an important role in understanding the different types of memory, the pathogenesis of many neurological diseases affecting memory, and the discovery of therapeutic targets. However, the traditional view of memory mechanisms, which associates each memory with electrical signals having a unique combination of frequency and amplitude, does not explain how memories can survive for a lifetime while exposed to synaptic noise. Recent evidence suggests that, apart from neuronal circuits, a diversity of molecular memory (MM) carriers is essential for memory performance. The molecular model of memory (MMM) is proposed, according to which each item of incoming information (the elementary memory item, eMI) is encoded both by circuitry, with electrical parameters unique to the given item, and by MM carriers, each unique in its molecular composition. While operating as carriers of incoming information, the MMs function within the neuronal plasma membrane. Initially inactive (latent), each eMI is activated during acquisition to become a virtual copy of some bygone fact or event. This activation is accompanied by considerable remodeling of the MM molecule associated with a resonance effect.
Affiliation(s)
- Ilya M Sukhanov
- Lab. Behavioral Pharmacology, Dept. Psychopharmacology, Valdman Institute of Pharmacology, I.P. Pavlov Medical University, Leo Tolstoi Street 6/8, St. Petersburg 197022, The Russian Federation
- Alexey J Nevorotin
- Laboratory of Electron Microscopy, I.P. Pavlov Medical University, Leo Tolstoi Street 6/8, St. Petersburg 197022, The Russian Federation
5
Holzhausen K, Ramlow L, Pu S, Thomas PJ, Lindner B. Mean-return-time phase of a stochastic oscillator provides an approximate renewal description for the associated point process. Biol Cybern 2022; 116:235-251. PMID: 35166932; PMCID: PMC9068687; DOI: 10.1007/s00422-022-00920-1.
Abstract
Stochastic oscillations can be characterized by a corresponding point process; this is a common practice in computational neuroscience, where oscillations of the membrane voltage under the influence of noise are often analyzed in terms of the interspike interval statistics, specifically the distribution and correlation of intervals between subsequent threshold-crossing times. More generally, crossing times and the corresponding interval sequences can be introduced for different kinds of stochastic oscillators that have been used to model variability of rhythmic activity in biological systems. In this paper we show that if we use the so-called mean-return-time (MRT) phase isochrons (introduced by Schwabedal and Pikovsky) to count the cycles of a stochastic oscillator with Markovian dynamics, the interphase interval sequence does not show any linear correlations, i.e., the corresponding sequence of passage times forms approximately a renewal point process. We first outline the general mathematical argument for this finding and illustrate it numerically for three models of increasing complexity: (i) the isotropic Guckenheimer-Schwabedal-Pikovsky oscillator that displays positive interspike interval (ISI) correlations if rotations are counted by passing the spoke of a wheel; (ii) the adaptive leaky integrate-and-fire model with white Gaussian noise that shows negative interspike interval correlations when spikes are counted in the usual way by the passage of a voltage threshold; (iii) a Hodgkin-Huxley model with channel noise (in the diffusion approximation represented by Gaussian noise) that exhibits weak but statistically significant interspike interval correlations, again for spikes counted when passing a voltage threshold. For all these models, linear correlations between intervals vanish when we count rotations by the passage of an MRT isochron. We finally discuss that the removal of interval correlations does not change the long-term variability and its effect on information transmission, especially in the neural context.
Affiliation(s)
- Konstantin Holzhausen
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Shusen Pu
- Department of Biomedical Engineering, 5814 Stevenson Center, Vanderbilt University, Nashville, TN 37215 USA
- Peter J. Thomas
- Department of Mathematics, Applied Mathematics, and Statistics, 212 Yost Hall, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, Ohio USA
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
6
Wason TD. A model integrating multiple processes of synchronization and coherence for information instantiation within a cortical area. Biosystems 2021; 205:104403. PMID: 33746019; DOI: 10.1016/j.biosystems.2021.104403.
Abstract
What is the form of dynamic, e.g., sensory, information in the mammalian cortex? Information in the cortex is modeled as a coherence map of a mixed chimera state of synchronous, phasic, and disordered minicolumns. The theoretical model is built on neurophysiological evidence. Complex spatiotemporal information is instantiated through a system of interacting biological processes that generate a synchronized cortical area, a coherent aperture. Minicolumn elements are grouped in macrocolumns in an array analogous to a phased-array radar, modeled as an aperture, a "hole through which radiant energy flows." Coherence maps in a cortical area transform inputs from multiple sources into outputs to multiple targets, while reducing complexity and entropy. Coherent apertures can assume extremely large numbers of different information states as coherence maps, which can be communicated among apertures with corresponding very large bandwidths. The coherent aperture model incorporates considerable reported research, integrating five conceptually and mathematically independent processes: 1) a damped Kuramoto network model, 2) a pumped area field potential, 3) the gating of nearly coincident spikes, 4) the coherence of activity across cortical lamina, and 5) complex information formed through functions in macrocolumns. Biological processes and their interactions are described in equations and a functional circuit such that the mathematical pieces can be assembled the same way the neurophysiological ones are. The model can be conceptually convolved over the specifics of local cortical areas within and across species. A coherent aperture becomes a node in a graph of cortical areas with a corresponding distribution of information.
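For the Kuramoto ingredient named above, a standard Kuramoto network and its coherence (order parameter) can be sketched as follows; the damped variant, the pumped field potential, and the cortical mapping used in the model are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Standard Kuramoto network: N phase oscillators with all-to-all coupling K.
# The modulus of the complex order parameter measures population coherence,
# the kind of quantity the model above tracks for minicolumn synchrony.
N, K, dt, steps = 200, 2.0, 0.01, 5000
omega = rng.normal(0.0, 1.0, N)          # heterogeneous natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)   # random initial phases

r_trace = np.empty(steps)
for i in range(steps):
    z = np.mean(np.exp(1j * theta))      # complex order parameter
    r_trace[i] = np.abs(z)
    # mean-field form of the Kuramoto update: dtheta_i = omega_i + K*r*sin(psi - theta_i)
    theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))

print(f"coherence r, early: {r_trace[:100].mean():.2f}   late: {r_trace[-1000:].mean():.2f}")
# For a unit-variance Gaussian frequency distribution the critical coupling is
# about 2*sqrt(2/pi) ~ 1.6, so K = 2.0 drives the network into partial synchrony;
# below that value r stays near the 1/sqrt(N) noise floor.
```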
Affiliation(s)
- Thomas D Wason
- North Carolina State University, Department of Biological Sciences, Meitzen Laboratory, Campus Box 7617, 128 David Clark Labs, Raleigh, NC 27695-7617, USA.
7
Nesse WH, Maler L, Longtin A. Enhanced Signal Detection by Adaptive Decorrelation of Interspike Intervals. Neural Comput 2020; 33:341-375. PMID: 33253034; DOI: 10.1162/neco_a_01347.
Abstract
Spike trains with negative interspike interval (ISI) correlations, in which long/short ISIs are more likely followed by short/long ISIs, are common in many neurons. They can be described by stochastic models with a spike-triggered adaptation variable. We analyze a phenomenon in these models where such statistically dependent ISI sequences arise in tandem with quasi-statistically independent and identically distributed (quasi-IID) adaptation variable sequences. The sequences of adaptation states and resulting ISIs are linked by a nonlinear decorrelating transformation. We establish general conditions on a family of stochastic spiking models that guarantee this quasi-IID property and establish bounds on the resulting baseline ISI correlations. Inputs that elicit weak firing rate changes in samples with many spikes are known to be more detectable when negative ISI correlations are present because they reduce spike count variance; this defines a variance-reduced firing rate coding benchmark. We performed a Fisher information analysis on these adapting models exhibiting ISI correlations to show that a spike pattern code based on the quasi-IID property achieves the upper bound of detection performance, surpassing rate codes with the same mean rate (including the variance-reduced rate code benchmark) by 20% to 30%. The information loss in rate codes arises because the benefits of reduced spike count variance cannot compensate for the lower firing rate gain due to adaptation. Since adaptation states have similar dynamics to synaptic responses, the quasi-IID decorrelating transformation of the spike train is plausibly implemented by downstream neurons through matched postsynaptic kinetics. This provides an explanation for observed coding performance in sensory systems that cannot be accounted for by rate coding, for example, at the detection threshold where rate changes can be insignificant.
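A toy version of the setting described above (a perfect integrate-and-fire neuron with spike-triggered adaptation), showing how one can compare the serial correlations of the ISIs with those of the adaptation states sampled at spike times; the parameters are illustrative and do not correspond to the paper's models or its quasi-IID conditions:

```python
import numpy as np
from math import sqrt

rng = np.random.default_rng(3)

# Perfect integrate-and-fire neuron with a spike-triggered adaptation variable a:
#   dv/dt = mu - a + sigma*xi(t),   da/dt = -a/tau_a,
#   spike when v >= 1: reset v -> 0 and increment a -> a + delta.
mu, sigma, tau_a, delta = 1.5, 0.3, 1.0, 1.0
dt, n_spikes = 1e-3, 3000

v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
isis, a_at_spike = [], []
while len(isis) < n_spikes:
    v += (mu - a) * dt + sigma * sqrt(dt) * rng.normal()
    a += -a / tau_a * dt
    t += dt
    if v >= 1.0:
        isis.append(t - t_last)
        t_last, v = t, 0.0
        a += delta
        a_at_spike.append(a)       # adaptation state sampled just after the spike

def lag1_corr(seq):
    x = np.asarray(seq) - np.mean(seq)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

print(f"lag-1 ISI correlation             : {lag1_corr(isis):+.3f}")   # negative
print(f"lag-1 adaptation-state correlation: {lag1_corr(a_at_spike):+.3f}")
# Per the study above, under suitable conditions the adaptation sequence is
# quasi-IID (near-zero serial correlation) even though the ISIs are strongly
# anti-correlated; the parameters used here are illustrative only.
```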
Affiliation(s)
- William H Nesse
- Department of Mathematics, University of Utah, Salt Lake City, UT 84112, U.S.A.
- Leonard Maler
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- André Longtin
- Department of Physics, University of Ottawa, Ottawa, ON K1N 6N5, Canada
8
Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E 2019; 99:032402. PMID: 30999498; DOI: 10.1103/physreve.99.032402.
Abstract
We study temporal correlations of interspike intervals, quantified by the network-averaged serial correlation coefficient (SCC), in networks of both current- and conductance-based purely inhibitory integrate-and-fire neurons. Numerical simulations reveal transitions to negative SCCs at intermediate values of bias current drive and network size. As bias drive and network size are increased past these values, the SCC returns to zero. The SCC is maximally negative at an intermediate value of the network oscillation strength. The dependence of the SCC on two canonical schemes for synaptic connectivity is studied, and it is shown that the results occur robustly in both schemes. For conductance-based synapses, the SCC becomes negative at the onset of both a fast and slow coherent network oscillation. We then show by means of offline simulations using prerecorded network activity that a neuron's SCC is highly sensitive to its number of presynaptic inputs. Finally, we devise a noise-reduced diffusion approximation for current-based networks that accounts for the observed temporal correlation transitions.
Affiliation(s)
- Wilhelm Braun
- Neural Network Dynamics and Computation, Institut für Genetik, Universität Bonn, Kirschallee 1, 53115 Bonn, Germany; Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
9
Doose J, Lindner B. Evoking prescribed spike times in stochastic neurons. Phys Rev E 2017; 96:032109. PMID: 29346970; DOI: 10.1103/physreve.96.032109.
Abstract
Single cell stimulation in vivo is a powerful tool to investigate the properties of single neurons and their functionality in neural networks. We present a method to determine a cell-specific stimulus that reliably evokes a prescribed spike train with high temporal precision of action potentials. We test the performance of this stimulus in simulations for two different stochastic neuron models. For a broad range of parameters and a neuron firing with intermediate firing rates (20-40 Hz) the reliability in evoking the prescribed spike train is close to its theoretical maximum that is mainly determined by the level of intrinsic noise.
Affiliation(s)
- Jens Doose
- Bernstein Center for Computational Neuroscience, Berlin 10115, Germany and Physics Department of the Humboldt University Berlin, Berlin 12489, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin 10115, Germany and Physics Department of the Humboldt University Berlin, Berlin 12489, Germany
10
Braun W, Thul R, Longtin A. Evolution of moments and correlations in nonrenewal escape-time processes. Phys Rev E 2017; 95:052127. PMID: 28618562; DOI: 10.1103/physreve.95.052127.
Abstract
The theoretical description of nonrenewal stochastic systems is a challenge. Analytical results are often not available or can be obtained only under strong conditions, limiting their applicability. Also, numerical results have mostly been obtained by ad hoc Monte Carlo simulations, which are usually computationally expensive when a high degree of accuracy is needed. To gain quantitative insight into these systems under general conditions, we here introduce a numerical iterated first-passage time approach based on solving the time-dependent Fokker-Planck equation (FPE) to describe the statistics of nonrenewal stochastic systems. We illustrate the approach using spike-triggered neuronal adaptation in the leaky and perfect integrate-and-fire model, respectively. The transition to stationarity of first-passage time moments and their sequential correlations occur on a nontrivial time scale that depends on all system parameters. Surprisingly this is so for both single exponential and scale-free power-law adaptation. The method works beyond the small noise and time-scale separation approximations. It shows excellent agreement with direct Monte Carlo simulations, which allow for the computation of transient and stationary distributions. We compare different methods to compute the evolution of the moments and serial correlation coefficients (SCCs) and discuss the challenge of reliably computing the SCCs, which we find to be very sensitive to numerical inaccuracies for both the leaky and perfect integrate-and-fire models. In conclusion, our methods provide a general picture of nonrenewal dynamics in a wide range of stochastic systems exhibiting short- and long-range correlations.
Affiliation(s)
- Wilhelm Braun
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- University of Ottawa Brain and Mind Research Institute, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
- Rüdiger Thul
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, United Kingdom
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- University of Ottawa Brain and Mind Research Institute, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
11
Huang CG, Zhang ZD, Chacron MJ. Temporal decorrelation by SK channels enables efficient neural coding and perception of natural stimuli. Nat Commun 2016; 7:11353. PMID: 27088670; PMCID: PMC4837484; DOI: 10.1038/ncomms11353.
Abstract
It is commonly assumed that neural systems efficiently process natural sensory input. However, the mechanisms by which such efficient processing is achieved, and the consequences for perception and behaviour remain poorly understood. Here we show that small conductance calcium-activated potassium (SK) channels enable efficient neural processing and perception of natural stimuli. Specifically, these channels allow for the high-pass filtering of sensory input, thereby removing temporal correlations or, equivalently, whitening frequency response power. Varying the degree of adaptation through pharmacological manipulation of SK channels reduced efficiency of coding of natural stimuli, which in turn gave rise to predictable changes in behavioural responses that were no longer matched to natural stimulus statistics. Our results thus demonstrate a novel mechanism by which the nervous system can implement efficient processing and perception of natural sensory input that is likely to be shared across systems and species.
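The whitening idea can be illustrated with a generic first-order high-pass filter standing in for SK-channel-mediated adaptation; the cutoff frequencies and stimulus statistics below are placeholders, not the measured electrosensory values:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)

# A temporally correlated "natural" stimulus: white noise low-pass filtered at
# 2 Hz, so that its power is concentrated at low frequencies.
fs, T = 1000.0, 200.0                                      # Hz, seconds
stim = signal.lfilter(*signal.butter(1, 2.0, btype="low", fs=fs),
                      rng.normal(size=int(fs * T)))

# Adaptation modeled as a first-order high-pass filter (a stand-in for the
# SK-channel mechanism described above; the cutoff is purely illustrative).
resp = signal.lfilter(*signal.butter(1, 2.0, btype="high", fs=fs), stim)

def low_over_mid_power(y):
    """Ratio of low- to mid-frequency power; values near 1 indicate whitening."""
    f, p = signal.welch(y, fs=fs, nperseg=4096)
    return p[(f > 0.1) & (f < 1.0)].mean() / p[(f > 4.0) & (f < 20.0)].mean()

print(f"low/mid power ratio, stimulus : {low_over_mid_power(stim):7.1f}")
print(f"low/mid power ratio, response : {low_over_mid_power(resp):7.1f}")
# High-pass filtering attenuates the dominant low frequencies, flattening
# (whitening) the response spectrum, i.e. removing temporal correlations.
```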
Affiliation(s)
- Chengjie G. Huang
- Department of Physiology, McGill University, 3655 Sir William Osler, Montreal, Quebec, Canada H3G 1Y6
- Zhubo D. Zhang
- Department of Physiology, McGill University, 3655 Sir William Osler, Montreal, Quebec, Canada H3G 1Y6
- Maurice J. Chacron
- Department of Physiology, McGill University, 3655 Sir William Osler, Montreal, Quebec, Canada H3G 1Y6
12
Johnson EC, Jones DL, Ratnam R. A minimum-error, energy-constrained neural code is an instantaneous-rate code. J Comput Neurosci 2016; 40:193-206. PMID: 26922680; DOI: 10.1007/s10827-016-0592-x.
Abstract
Sensory neurons code information about stimuli in their sequence of action potentials (spikes). Intuitively, the spikes should represent stimuli with high fidelity. However, generating and propagating spikes is a metabolically expensive process. It is therefore likely that neural codes have been selected to balance energy expenditure against encoding error. Our recently proposed optimal, energy-constrained neural coder (Jones et al. Frontiers in Computational Neuroscience, 9, 61 2015) postulates that neurons time spikes to minimize the trade-off between stimulus reconstruction error and expended energy by adjusting the spike threshold using a simple dynamic threshold. Here, we show that this proposed coding scheme is related to existing coding schemes, such as rate and temporal codes. We derive an instantaneous rate coder and show that the spike-rate depends on the signal and its derivative. In the limit of high spike rates the spike train maximizes fidelity given an energy constraint (average spike-rate), and the predicted interspike intervals are identical to those generated by our existing optimal coding neuron. The instantaneous rate coder is shown to closely match the spike-rates recorded from P-type primary afferents in weakly electric fish. In particular, the coder is a predictor of the peristimulus time histogram (PSTH). When tested against in vitro cortical pyramidal neuron recordings, the instantaneous spike-rate approximates DC step inputs, matching both the average spike-rate and the time-to-first-spike (a simple temporal code). Overall, the instantaneous rate coder relates optimal, energy-constrained encoding to the concepts of rate-coding and temporal-coding, suggesting a possible unifying principle of neural encoding of sensory signals.
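A hypothetical sketch of an instantaneous-rate coder whose rate is a rectified linear combination of the signal and its derivative, used to drive an inhomogeneous Poisson generator; the rate expression actually derived in the paper is not reproduced here, so the functional form, gains, and baseline below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical instantaneous-rate coder: rate = rectified linear combination of
# the stimulus s(t) and its derivative, driving an inhomogeneous Poisson process.
fs = 1000.0                            # Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
s = np.where(t >= 1.0, 1.0, 0.0)       # DC step at t = 1 s
ds = np.gradient(s, 1.0 / fs)

r0, a, b = 5.0, 30.0, 0.5              # baseline (Hz), gain on s, gain on ds/dt
rate = np.maximum(0.0, r0 + a * s + b * ds)

# Bernoulli approximation of the Poisson generator (rate*dt kept below 1).
n_trials = 500
spikes = rng.random((n_trials, t.size)) < np.minimum(rate / fs, 1.0)
psth = spikes.mean(axis=0) * fs        # trial-averaged rate estimate (Hz)

print(f"baseline rate  : {psth[t < 1.0].mean():6.1f} Hz")
print(f"onset-bin rate : {psth[np.searchsorted(t, 1.0)]:6.1f} Hz")
print(f"sustained rate : {psth[t > 1.1].mean():6.1f} Hz")
# The derivative term produces a sharp onset transient (earlier first spikes,
# a simple temporal code), while the signal term sets the sustained rate.
```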
Affiliation(s)
- Erik C Johnson
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA
- Douglas L Jones
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Advanced Digital Sciences Center, Illinois at Singapore Pte. Ltd, Singapore, Singapore; Neuroscience Program, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA
- Rama Ratnam
- Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Urbana, IL, 61801, USA; Advanced Digital Sciences Center, Illinois at Singapore Pte. Ltd, Singapore, Singapore
13
Jung SN, Longtin A, Maler L. Weak signal amplification and detection by higher-order sensory neurons. J Neurophysiol 2016; 115:2158-75. PMID: 26843601; DOI: 10.1152/jn.00811.2015.
Abstract
Sensory systems must extract behaviorally relevant information and therefore often exhibit a very high sensitivity. How the nervous system reaches such high sensitivity levels is an outstanding question in neuroscience. Weakly electric fish (Apteronotus leptorhynchus/albifrons) are an excellent model system to address this question because detailed background knowledge is available regarding their behavioral performance and its underlying neuronal substrate. Apteronotus use their electrosense to detect prey objects. Therefore, they must be able to detect electrical signals as low as 1 μV while using a sensory integration time of <200 ms. How these very weak signals are extracted and amplified by the nervous system is not yet understood. We studied the responses of cells in the early sensory processing areas, namely, the electroreceptor afferents (EAs) and pyramidal cells (PCs) of the electrosensory lobe (ELL), the first-order electrosensory processing area. In agreement with previous work we found that EAs cannot encode very weak signals with a spike count code. However, PCs can encode prey mimic signals by their firing rate, revealing a huge signal amplification between EAs and PCs and also suggesting differences in their stimulus encoding properties. Using a simple leaky integrate-and-fire (LIF) model we predict that the target neurons of PCs in the midbrain torus semicircularis (TS) are able to detect very weak signals. In particular, TS neurons could do so by assuming biologically plausible convergence rates as well as very simple decoding strategies such as temporal integration, threshold crossing, and combining the inputs of PCs.
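A back-of-the-envelope sketch of the convergence argument: a downstream unit that sums N converging spike trains over a roughly 200 ms window detects a weak rate change with a sensitivity that grows approximately as sqrt(N*T); the rates and convergence numbers are illustrative, not the values estimated in the study:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy downstream (TS-like) detector: it sums the spikes of N converging inputs
# over an integration window T and must detect a weak rate increase dr.
r0, dr = 20.0, 1.0          # baseline rate and stimulus-evoked increase (Hz)
T = 0.2                     # integration window (s), cf. the <200 ms constraint
n_trials = 5000

def dprime(n_inputs):
    """Empirical d' between stimulus-absent and stimulus-present summed counts."""
    absent = rng.poisson(n_inputs * r0 * T, n_trials)
    present = rng.poisson(n_inputs * (r0 + dr) * T, n_trials)
    pooled_sd = np.sqrt(0.5 * (absent.var() + present.var()))
    return (present.mean() - absent.mean()) / pooled_sd

for n_inputs in (1, 10, 100, 1000):
    print(f"N = {n_inputs:4d} converging inputs: d' = {dprime(n_inputs):.2f}")
# d' grows roughly as sqrt(N*T): a single afferent barely registers the change,
# but plausible convergence onto a midbrain neuron makes it readily detectable.
```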
Affiliation(s)
- Sarah N Jung
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
- Andre Longtin
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Department of Physics, University of Ottawa, Ottawa, Ontario, Canada; and Brain and Mind Institute and Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
- Leonard Maler
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Brain and Mind Institute and Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
14
Marcoux CM, Clarke SE, Nesse WH, Longtin A, Maler L. Balanced ionotropic receptor dynamics support signal estimation via voltage-dependent membrane noise. J Neurophysiol 2015; 115:530-45. PMID: 26561607; DOI: 10.1152/jn.00786.2015.
Abstract
Encoding behaviorally relevant stimuli in a noisy background is critical for animals to survive in their natural environment. We identify core biophysical and synaptic mechanisms that permit the encoding of low-frequency signals in pyramidal neurons of the weakly electric fish Apteronotus leptorhynchus, an animal that can accurately encode even miniscule amplitude modulations of its self-generated electric field. We demonstrate that slow NMDA receptor (NMDA-R)-mediated excitatory postsynaptic potentials (EPSPs) are able to summate over many interspike intervals (ISIs) of the primary electrosensory afferents (EAs), effectively eliminating the baseline EA ISI correlations from the pyramidal cell input. Together with a dynamic balance of NMDA-R and GABA-A-R currents, this permits stimulus-evoked changes in EA spiking to be transmitted efficiently to target electrosensory lobe (ELL) pyramidal cells, for encoding low-frequency signals. Interestingly, AMPA-R activity is depressed and appears to play a negligible role in the generation of action potentials. Instead, we hypothesize that cell-intrinsic voltage-dependent membrane noise supports the encoding of perithreshold sensory input; this noise drives a significant proportion of pyramidal cell spikes. Together, these mechanisms may be sufficient for the ELL to encode signals near the threshold of behavioral detection.
Affiliation(s)
- Curtis M Marcoux
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Stephen E Clarke
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- William H Nesse
- Department of Mathematics, University of Utah, Salt Lake City, Utah
- Andre Longtin
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Department of Physics, University of Ottawa, Ottawa, Ontario, Canada; and Brain and Mind Institute and Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
- Leonard Maler
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Brain and Mind Institute and Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
15
Schwalger T, Lindner B. Patterns of interval correlations in neural oscillators with adaptation. Front Comput Neurosci 2013; 7:164. PMID: 24348372; PMCID: PMC3843362; DOI: 10.3389/fncom.2013.00164.
Abstract
Neural firing is often subject to negative feedback by adaptation currents. These currents can induce strong correlations among the time intervals between spikes. Here we study analytically the interval correlations of a broad class of noisy neural oscillators with spike-triggered adaptation of arbitrary strength and time scale. Our weak-noise theory provides a general relation between the correlations and the phase-response curve (PRC) of the oscillator, proves anti-correlations between neighboring intervals for adapting neurons with type I PRC and identifies a single order parameter that determines the qualitative pattern of correlations. Monotonically decaying or oscillating correlation structures can be related to qualitatively different voltage traces after spiking, which can be explained by the phase plane geometry. At high firing rates, the long-term variability of the spike train associated with the cumulative interval correlations becomes small, independent of model details. Our results are verified by comparison with stochastic simulations of the exponential, leaky, and generalized integrate-and-fire models with adaptation.
Affiliation(s)
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
16
Farkhooi F, Froese A, Muller E, Menzel R, Nawrot MP. Cellular adaptation facilitates sparse and reliable coding in sensory pathways. PLoS Comput Biol 2013; 9:e1003251. PMID: 24098101; PMCID: PMC3789775; DOI: 10.1371/journal.pcbi.1003251.
Abstract
Most neurons in peripheral sensory pathways initially respond vigorously when a preferred stimulus is presented, but adapt as stimulation continues. It is unclear how this phenomenon affects stimulus coding in the later stages of sensory processing. Here, we show that a temporally sparse and reliable stimulus representation develops naturally in sequential stages of a sensory network with adapting neurons. As a modeling framework we employ a mean-field approach together with an adaptive population density treatment, accompanied by numerical simulations of spiking neural networks. We find that cellular adaptation plays a critical role in the dynamic reduction of the trial-by-trial variability of cortical spike responses by transiently suppressing self-generated fast fluctuations in the cortical balanced network. This provides an explanation for a widespread cortical phenomenon by a simple mechanism. We further show that in the insect olfactory system cellular adaptation is sufficient to explain the emergence of the temporally sparse and reliable stimulus representation in the mushroom body. Our results reveal a generic, biophysically plausible mechanism that can explain the emergence of a temporally sparse and reliable stimulus representation within a sequential processing architecture.
Affiliation(s)
- Farzad Farkhooi
- Neuroinformatics & Theoretical Neuroscience, Freie Universität Berlin, and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Anja Froese
- Institute für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Eilif Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Randolf Menzel
- Institute für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Martin P. Nawrot
- Neuroinformatics & Theoretical Neuroscience, Freie Universität Berlin, and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
17
Steimer A, Douglas R. Spike-based probabilistic inference in analog graphical models using interspike-interval coding. Neural Comput 2013; 25:2303-54. PMID: 23663144; DOI: 10.1162/neco_a_00477.
Abstract
Temporal spike codes play a crucial role in neural information processing. In particular, there is strong experimental evidence that interspike intervals (ISIs) are used for stimulus representation in neural systems. However, very few algorithmic principles exploit the benefits of such temporal codes for probabilistic inference of stimuli or decisions. Here, we describe and rigorously prove the functional properties of a spike-based processor that uses ISI distributions to perform probabilistic inference. The abstract processor architecture serves as a building block for more concrete, neural implementations of the belief-propagation (BP) algorithm in arbitrary graphical models (e.g., Bayesian networks and factor graphs). The distributed nature of graphical models matches well with the architectural and functional constraints imposed by biology. In our model, ISI distributions represent the BP messages exchanged between factor nodes, leading to the interpretation of a single spike as a random sample that follows such a distribution. We verify the abstract processor model by numerical simulation in full graphs, and demonstrate that it can be applied even in the presence of analog variables. As a particular example, we also show results of a concrete, neural implementation of the processor, although in principle our approach is more flexible and allows different neurobiological interpretations. Furthermore, electrophysiological data from area LIP during behavioral experiments are assessed in light of ISI coding, leading to concrete testable, quantitative predictions and a more accurate description of these data compared to hitherto existing models.
Affiliation(s)
- Andreas Steimer
- Institute of Neuroinformatics, University of Zürich and ETH Zürich, Zürich 8057, Switzerland.
18
Mejias JF, Longtin A. Optimal heterogeneity for coding in spiking neural networks. Phys Rev Lett 2012; 108:228102. PMID: 23003656; DOI: 10.1103/physrevlett.108.228102.
Abstract
The effect of cellular heterogeneity on the coding properties of neural populations is studied analytically and numerically. We find that heterogeneity decreases the threshold for synchronization, and its strength is nonlinearly related to the network mean firing rate. In addition, conditions are shown under which heterogeneity optimizes network information transmission for either temporal or rate coding, with high input frequencies leading to different effects for each coding strategy. The results are shown to be robust for more realistic conditions.
Affiliation(s)
- J F Mejias
- Department of Physics and Center for Neural Dynamics, University of Ottawa, 150 Louis Pasteur, K1N-6N5 Ottawa, Ontario, Canada.
19
Lyamzin DR, Garcia-Lazaro JA, Lesica NA. Analysis and modelling of variability and covariability of population spike trains across multiple time scales. Network 2012; 23:76-103. PMID: 22578115; DOI: 10.3109/0954898x.2012.679334.
Abstract
As multi-electrode and imaging technology begin to provide us with simultaneous recordings of large neuronal populations, new methods for modelling such data must also be developed. We present a model of responses to repeated trials of a sensory stimulus based on thresholded Gaussian processes that allows for analysis and modelling of variability and covariability of population spike trains across multiple time scales. The model framework can be used to specify the values of many different variability measures including spike timing precision across trials, coefficient of variation of the interspike interval distribution, and Fano factor of spike counts for individual neurons, as well as signal and noise correlations and correlations of spike counts across multiple neurons. Using both simulated data and data from different stages of the mammalian auditory pathway, we demonstrate the range of possible independent manipulations of different variability measures, and explore how this range depends on the sensory stimulus. The model provides a powerful framework for the study of experimental and surrogate data and for analyzing dependencies between different statistical properties of neuronal populations.
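A minimal thresholded-Gaussian (dichotomized Gaussian) spike-train generator in the spirit of the framework above, reduced to a single latent correlation between two neurons; the full model's separate control of timing precision, ISI statistics, Fano factors, and signal versus noise correlations is not implemented:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Minimal thresholded-Gaussian ("dichotomized Gaussian") spike generator for two
# neurons: a correlated latent Gaussian is thresholded so that each time bin
# spikes with probability p; the latent correlation shapes the count statistics.
p = 0.1                       # spike probability per bin
rho_latent = 0.4              # correlation of the latent Gaussians
n_bins, n_trials = 100, 2000
gamma = norm.ppf(1.0 - p)     # threshold giving the desired spike probability

L = np.linalg.cholesky(np.array([[1.0, rho_latent], [rho_latent, 1.0]]))
counts = np.zeros((n_trials, 2))
for k in range(n_trials):
    z = L @ rng.normal(size=(2, n_bins))   # latent Gaussians for both neurons
    counts[k] = (z > gamma).sum(axis=1)    # threshold -> binary spikes -> counts

print(f"mean spike probability per bin : {counts.mean() / n_bins:.3f} (target {p})")
print(f"mean Fano factor of counts     : {(counts.var(axis=0) / counts.mean(axis=0)).mean():.2f}")
print(f"spike-count correlation        : {np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]:.2f}")
# In the full framework, separate latent covariances over time, trials and
# neurons allow spike-timing precision, ISI variability, Fano factors and
# signal/noise correlations to be specified largely independently.
```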
20
Urdapilleta E. Onset of negative interspike interval correlations in adapting neurons. Phys Rev E 2011; 84:041904. PMID: 22181172; DOI: 10.1103/physreve.84.041904.
Abstract
Negative serial correlations in single spike trains are an effective method to reduce the variability of spike counts. One of the factors contributing to the development of negative correlations between successive interspike intervals is the presence of adaptation currents. In this work, based on a hidden Markov model and a proper statistical description of conditional responses, we analytically obtain these correlations in a simple dynamical neuron model that captures adaptation. We derive the serial correlation coefficients for arbitrary lags in the small-adaptation regime. In this regime, the behavior of the correlations is universal and depends on the first-order statistical description of an exponentially driven, time-inhomogeneous stochastic process.
Affiliation(s)
- Eugenio Urdapilleta
- División de Física Estadística e Interdisciplinaria & Instituto Balseiro, Centro Atómico Bariloche, Avenida E. Bustillo Km 9.500, S.C. de Bariloche (8400), Río Negro, Argentina.
21
Budini AA. Large deviations of ergodic counting processes: a statistical mechanics approach. Phys Rev E 2011; 84:011141. PMID: 21867147; DOI: 10.1103/physreve.84.011141.
Abstract
The large-deviation method makes it possible to characterize an ergodic counting process in terms of a thermodynamic frame where a free energy function determines the asymptotic nonstationary statistical properties of its fluctuations. Here we study this formalism through a statistical mechanics approach, that is, with an auxiliary counting process that maximizes an entropy function associated with the thermodynamic potential. We show that the realizations of this auxiliary process can be obtained by applying a conditional measurement scheme to the original ones, providing in this way an alternative measurement interpretation of the thermodynamic approach. General results are obtained for renewal counting processes, that is, those where the time intervals between consecutive events are independent and defined by a unique waiting time distribution. The underlying statistical mechanics is controlled by the same waiting time distribution, rescaled by an exponential decay measured by the free energy function. Scale invariance, shift closure, and intermittency phenomena are obtained and interpreted in this context. Similar conclusions apply for nonrenewal processes when the memory between successive events is induced by a stochastic waiting time distribution.
Affiliation(s)
- Adrián A Budini
- Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Avenida E Bustillo Km 9.5, 8400 Bariloche, Argentina
22
Efficient computation via sparse coding in electrosensory neural networks. Curr Opin Neurobiol 2011; 21:752-60. PMID: 21683574; DOI: 10.1016/j.conb.2011.05.016.
Abstract
The electric sense combines spatial aspects of vision and touch with temporal features of audition. Its accessible neural architecture shares similarities with mammalian sensory systems and allows for recordings from successive brain areas to test hypotheses about neural coding. Further, electrosensory stimuli encountered during prey capture, navigation, and communication, can be readily synthesized in the laboratory. These features enable analyses of the neural circuitry that reveal general principles of encoding and decoding, such as segregation of information into separate streams and neural response sparsification. A systems level understanding arises via linkage between cellular differentiation and network architecture, revealed by in vitro and in vivo analyses, while computational modeling reveals how single cell dynamics and connectivity shape the sparsification process.
23
Avila-Akerberg O, Chacron MJ. Nonrenewal spike train statistics: causes and functional consequences on neural coding. Exp Brain Res 2011; 210:353-71. PMID: 21267548; PMCID: PMC4529317; DOI: 10.1007/s00221-011-2553-y.
Abstract
Many neurons display significant patterning in their spike trains (e.g. oscillations, bursting), and there is accumulating evidence that information is contained in these patterns. In many cases, this patterning is caused by intrinsic mechanisms rather than external signals. In this review, we focus on spiking activity that displays nonrenewal statistics (i.e. memory that persists from one firing to the next). Such statistics are seen in both peripheral and central neurons and appear to be ubiquitous in the CNS. We review the principal mechanisms that can give rise to nonrenewal spike train statistics. These are separated into intrinsic mechanisms such as relative refractoriness and network mechanisms such as coupling with delayed inhibitory feedback. Next, we focus on the functional roles for nonrenewal spike train statistics. These can either increase or decrease information transmission. We also focus on how such statistics can give rise to an optimal integration timescale at which spike train variability is minimal and how this might be exploited by sensory systems to maximize the detection of weak signals. We finish by pointing out some interesting future directions for research in this area. In particular, we explore the interesting possibility that synaptic dynamics might be matched with the nonrenewal spiking statistics of presynaptic spike trains in order to further improve information transmission.