1
Yamane Y. Adaptation of the inferior temporal neurons and efficient visual processing. Front Behav Neurosci 2024; 18:1398874. PMID: 39132448; PMCID: PMC11310006; DOI: 10.3389/fnbeh.2024.1398874.
Abstract
Numerous studies examining the responses of individual neurons in the inferior temporal (IT) cortex have revealed characteristics such as two- or three-dimensional shape tuning and object or category selectivity. While these basic selectivities have been studied under the assumption that responses to stimuli are relatively stable, physiological experiments have revealed that the responsiveness of IT neurons also depends on visual experience. The activity changes of IT neurons occur over various time ranges; among these, repetition suppression (RS) in particular is robustly observed in IT neurons without any behavioral or task constraints. I observed a similar phenomenon in ventral visual neurons in macaque monkeys while they engaged in free viewing and actively fixated on the same object multiple times. This observation indicates that the phenomenon also occurs in natural situations in which the subject actively views stimuli without forced fixation, suggesting that it is an everyday occurrence, widespread across regions of the visual system, and a default process for visual neurons. Such short-term activity modulation may be a key to understanding the visual system; however, the circuit mechanism and the biological significance of RS remain unclear. Thus, in this review, I summarize the modulation types observed in IT neurons and the known properties of RS. Subsequently, I discuss adaptation in vision, including concepts such as efficient and predictive coding, as well as the relationship between adaptation and psychophysical aftereffects. Finally, I discuss some conceptual implications of this phenomenon as well as the circuit mechanisms and models that may explain adaptation as a fundamental aspect of visual processing.
Affiliation(s)
- Yukako Yamane
- Neural Computation Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
2
Nicola W, Newton TR, Clopath C. The impact of spike timing precision and spike emission reliability on decoding accuracy. Sci Rep 2024; 14:10536. PMID: 38719897; PMCID: PMC11078995; DOI: 10.1038/s41598-024-58524-7.
Abstract
Precisely timed and reliably emitted spikes are hypothesized to serve multiple functions, including improving the accuracy and reproducibility of encoding stimuli, memories, or behaviours across trials. When these spikes occur as a repeating sequence, they can be used to encode and decode a potential time series. Here, we show both analytically and in simulations that the error incurred in approximating a time series with precisely timed and reliably emitted spikes decreases linearly with the number of neurons or spikes used in the decoding. This was verified numerically with synthetically generated patterns of spikes. Further, we found that if spikes were imprecise in their timing, or unreliable in their emission, the decrease in decoding error would be sub-linear. However, if the spike precision or spike reliability increased with network size, the error incurred in decoding a time series with sequences of spikes would maintain a linear decrease with network size. The spike precision had to increase linearly with network size, while the probability of spike failure had to decrease with the square root of the network size. Finally, we identified a candidate circuit to test this scaling relationship: the repeating sequences of spikes with sub-millisecond precision in area HVC (proper name) of the zebra finch. This scaling relationship can be tested using both neural data and song-spectrogram-based recordings while taking advantage of the natural fluctuation in HVC network size due to neurogenesis.
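One reading of this scaling claim can be sketched numerically: decode a time series with a zero-order hold, reading out the signal at each spike time and holding that value until the next spike. The decoder, the sinusoidal target, and all parameter values below are illustrative choices for the sketch, not the paper's analytical framework. With precisely timed spikes the mean error halves when the spike count doubles; timing jitter instead imposes an error floor, so the decrease becomes sub-linear.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 10_000)
target = np.sin(2 * np.pi * t)  # the time series to be decoded

def decode_error(n_spikes, jitter=0.0):
    """Decode with a zero-order hold: read out the signal at each spike
    time (optionally jittered) and hold that value until the next spike.
    Returns the mean absolute decoding error."""
    times = np.arange(n_spikes) / n_spikes                 # precise spike times
    read = np.clip(times + rng.normal(0.0, jitter, n_spikes), 0.0, 1.0)
    idx = np.searchsorted(times, t, side="right") - 1      # most recent spike
    estimate = np.sin(2 * np.pi * read)[idx]
    return np.abs(target - estimate).mean()

for n in (25, 50, 100, 200):
    print(n, decode_error(n), decode_error(n, jitter=0.02))
```

Doubling the number of precise spikes roughly halves the error, while the jittered decoder stops improving once the jitter dominates the inter-spike spacing.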
Affiliation(s)
- Wilten Nicola
- University of Calgary, Calgary, Canada.
- Department of Cell Biology and Anatomy, Calgary, Canada.
- Hotchkiss Brain Institute, Calgary, Canada.
- Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK
3
Koren V, Malerba SB, Schwalger T, Panzeri S. Structure, dynamics, coding and optimal biophysical parameters of efficient excitatory-inhibitory spiking networks. bioRxiv [Preprint] 2024:2024.04.24.590955. PMID: 38712237; PMCID: PMC11071478; DOI: 10.1101/2024.04.24.590955.
Abstract
The principle of efficient coding posits that sensory cortical networks are designed to encode maximal sensory information with minimal metabolic cost. Despite the major influence of efficient coding in neuroscience, it has remained unclear whether fundamental empirical properties of neural network activity can be explained solely based on this normative principle. Here, we rigorously derive the structural, coding, biophysical and dynamical properties of excitatory-inhibitory recurrent networks of spiking neurons that emerge directly from imposing that the network minimizes an instantaneous loss function and a time-averaged performance measure enacting efficient coding. The optimal network has biologically plausible biophysical features, including realistic integrate-and-fire spiking dynamics, spike-triggered adaptation, and a non-stimulus-specific excitatory external input regulating metabolic cost. The efficient network has excitatory-inhibitory recurrent connectivity between neurons with similar stimulus tuning, implementing feature-specific competition similar to that recently found in visual cortex. Networks with unstructured connectivity cannot reach comparable levels of coding efficiency. The optimal biophysical parameters include a 4-to-1 ratio of excitatory to inhibitory neurons and a 3-to-1 ratio of mean inhibitory-to-inhibitory vs. excitatory-to-inhibitory connectivity, closely matching those of cortical sensory networks. The efficient network has biologically plausible spiking dynamics, with a tight instantaneous E-I balance that makes it capable of efficiently coding external stimuli varying over multiple timescales. Together, these results explain how efficient coding may be implemented in cortical networks and suggest that key properties of biological neural networks may be accounted for by efficient coding.
Affiliation(s)
- Veronika Koren
- Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), 20251 Hamburg, Germany
- Institute of Mathematics, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Simone Blanco Malerba
- Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), 20251 Hamburg, Germany
- Tilo Schwalger
- Institute of Mathematics, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Stefano Panzeri
- Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), 20251 Hamburg, Germany
4
Willmore BDB, King AJ. Adaptation in auditory processing. Physiol Rev 2023; 103:1025-1058. PMID: 36049112; PMCID: PMC9829473; DOI: 10.1152/physrev.00011.2022.
Abstract
Adaptation is an essential feature of auditory neurons, which reduces their responses to unchanging and recurring sounds and allows their response properties to be matched to the constantly changing statistics of sounds that reach the ears. As a consequence, processing in the auditory system highlights novel or unpredictable sounds and produces an efficient representation of the vast range of sounds that animals can perceive by continually adjusting the sensitivity and, to a lesser extent, the tuning properties of neurons to the most commonly encountered stimulus values. Together with attentional modulation, adaptation to sound statistics also helps to generate neural representations of sound that are tolerant to background noise and therefore plays a vital role in auditory scene analysis. In this review, we consider the diverse forms of adaptation that are found in the auditory system in terms of the processing levels at which they arise, the underlying neural mechanisms, and their impact on neural coding and perception. We also ask what the dynamics of adaptation, which can occur over multiple timescales, reveal about the statistical properties of the environment. Finally, we examine how adaptation to sound statistics is influenced by learning and experience and changes as a result of aging and hearing loss.
Affiliation(s)
- Ben D. B. Willmore
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
- Andrew J. King
- Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
5
Koren V, Bondanelli G, Panzeri S. Computational methods to study information processing in neural circuits. Comput Struct Biotechnol J 2023; 21:910-922. PMID: 36698970; PMCID: PMC9851868; DOI: 10.1016/j.csbj.2023.01.009.
Abstract
The brain is an information processing machine and thus naturally lends itself to be studied using computational tools based on the principles of information theory. For this reason, computational methods based on or inspired by information theory have been a cornerstone of practical and conceptual progress in neuroscience. In this Review, we address how concepts and computational tools related to information theory are spurring the development of principled theories of information processing in neural circuits and the development of influential mathematical methods for the analyses of neural population recordings. We review how these computational approaches reveal mechanisms of essential functions performed by neural circuits. These functions include efficiently encoding sensory information and facilitating the transmission of information to downstream brain areas to inform and guide behavior. Finally, we discuss how further progress and insights can be achieved, in particular by studying how competing requirements of neural encoding and readout may be optimally traded off to optimize neural information processing.
Affiliation(s)
- Veronika Koren
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, Hamburg 20251, Germany
- Stefano Panzeri
- Department of Excellence for Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Falkenried 94, Hamburg 20251, Germany
- Istituto Italiano di Tecnologia, Via Melen 83, Genova 16152, Italy
6
Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 2022; 50:445-469. PMID: 35834100; DOI: 10.1007/s10827-022-00825-9.
Abstract
Networks of spiking neurons with adaptation have been shown to reproduce a wide range of neural activities, including the emergent population bursting and spike synchrony that underpin brain disorders and normal function. Exact mean-field models derived from spiking neural networks are extremely valuable, as such models can be used to determine how individual neurons and the network they reside within interact to produce macroscopic network behaviours. In this paper, we derive and analyze a set of exact mean-field equations for a neural network with spike frequency adaptation. Specifically, our model is a network of Izhikevich neurons, where each neuron is modeled by a two-dimensional system consisting of a quadratic integrate-and-fire equation plus an equation which implements spike frequency adaptation. Previous work deriving a mean-field model for this type of network relied on the assumption of sufficiently slow dynamics of the adaptation variable. However, this approximation did not succeed in establishing an exact correspondence between the macroscopic description and the realistic neural network, especially when the adaptation time constant was not large. The challenge lies in how to achieve a closed set of mean-field equations with the inclusion of the mean-field dynamics of the adaptation variable. We address this problem by using a Lorentzian ansatz combined with the moment closure approach to arrive at a mean-field system in the thermodynamic limit. The resulting macroscopic description is capable of qualitatively and quantitatively describing the collective dynamics of the neural network, including transitions between states where the individual neurons exhibit asynchronous tonic firing and synchronous bursting. We extend the approach to a network of two populations of neurons and discuss the accuracy and efficacy of our mean-field approximations by examining all assumptions that are imposed during the derivation. Numerical bifurcation analysis of our mean-field models reveals bifurcations not previously observed in the models, including a novel mechanism for the emergence of bursting in the network. We anticipate our results will provide a tractable and reliable tool to investigate the underlying mechanisms of brain function and dysfunction from the perspective of computational neuroscience.
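The network's building block can be sketched at the single-neuron level: a dimensionless quadratic integrate-and-fire neuron with a spike-triggered adaptation current, in the spirit of the Izhikevich model described above. All parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

def qif_adapt(I=5.0, a=0.01, w_jump=1.0, v_reset=-2.0, v_peak=100.0,
              dt=1e-3, T=100.0):
    """Single dimensionless quadratic integrate-and-fire neuron with a
    spike-triggered adaptation current w:
        dv/dt = v**2 + I - w,    dw/dt = -a * w,
    with v -> v_reset and w -> w + w_jump at each spike (forward Euler)."""
    v, w, spikes = v_reset, 0.0, []
    for step in range(int(T / dt)):
        v += dt * (v * v + I - w)
        w += dt * (-a * w)
        if v >= v_peak:              # spike: reset v, increment adaptation
            v = v_reset
            w += w_jump
            spikes.append(step * dt)
    return np.array(spikes)

spikes = qif_adapt()
isis = np.diff(spikes)
print(isis)  # successive inter-spike intervals lengthen as w builds up
```

Because w grows by w_jump at each spike and decays slowly, successive inter-spike intervals lengthen under constant drive; this slow adaptation variable is exactly the quantity whose mean-field dynamics the abstract says must be included to close the macroscopic equations.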
7
Wilmerding LK, Yazdanbakhsh A, Hasselmo ME. Impact of optogenetic pulse design on CA3 learning and replay: A neural model. Cell Rep Methods 2022; 2:100208. PMID: 35637904; PMCID: PMC9142690; DOI: 10.1016/j.crmeth.2022.100208.
Abstract
Optogenetic manipulation of hippocampal circuitry is an important tool for investigating learning in vivo. Numerous approaches to pulse design have been employed to elicit desirable circuit and behavioral outcomes. Here, we systematically test the outcome of different single-pulse waveforms in a rate-based model of hippocampal memory function at the level of mnemonic replay extension and de novo synaptic weight formation in CA3 and CA1. Lower-power waveforms with long forward or forward-and-backward ramps yield more natural sequence replay dynamics and induce synaptic plasticity that allows for more natural memory replay timing, in contrast to square or backward ramps. These effects of waveform shape and amplitude are preserved with the addition of noise in membrane potential, light scattering, and protein expression, improving the potential validity of predictions for in vivo work. These results inform future optogenetic experimental design choices in the field of learning and memory.
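The pulse families compared can be sketched as simple amplitude envelopes. The names, shapes, and unit duration below are illustrative approximations, not the paper's exact stimulus parameters:

```python
import numpy as np

def pulse(shape, peak=1.0, n=100):
    """Amplitude envelope of a single light pulse of unit duration."""
    t = np.linspace(0.0, 1.0, n)
    if shape == "square":
        return np.full(n, peak)
    if shape == "forward_ramp":       # ramps up to peak, then cuts off
        return peak * t
    if shape == "backward_ramp":      # steps to peak, then ramps down
        return peak * (1.0 - t)
    if shape == "ramp_up_down":       # forward and backward ramp
        return peak * (1.0 - np.abs(2.0 * t - 1.0))
    raise ValueError(shape)

for s in ("square", "forward_ramp", "backward_ramp", "ramp_up_down"):
    w = pulse(s)
    print(f"{s:14s} mean power = {w.mean():.2f}")
```

At matched duration and peak amplitude, the ramped envelopes deliver roughly half the mean power of the square pulse, which illustrates why the abstract describes the ramped waveforms as lower-power.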
Affiliation(s)
- Lucius K. Wilmerding
- Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Graduate Program for Neuroscience, Boston University, Boston, MA, USA
- Center for Systems Neuroscience, Boston University, Boston, MA, USA
- Arash Yazdanbakhsh
- Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Graduate Program for Neuroscience, Boston University, Boston, MA, USA
- Center for Systems Neuroscience, Boston University, Boston, MA, USA
- Michael E. Hasselmo
- Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Graduate Program for Neuroscience, Boston University, Boston, MA, USA
- Center for Systems Neuroscience, Boston University, Boston, MA, USA
8
Salaj D, Subramoney A, Kraisnikovic C, Bellec G, Legenstein R, Maass W. Spike frequency adaptation supports network computations on temporally dispersed information. eLife 2021; 10:e65459. PMID: 34310281; PMCID: PMC8313230; DOI: 10.7554/elife.65459.
Abstract
For solving tasks such as recognizing a song, answering a question, or inverting a sequence of symbols, cortical microcircuits need to integrate and manipulate information that was dispersed over time during the preceding seconds. Creating biologically realistic models for the underlying computations, especially with spiking neurons and for behaviorally relevant integration time spans, is notoriously difficult. We examine the role of spike frequency adaptation in such computations and find that it has a surprisingly large impact. The inclusion of this well-known property of a substantial fraction of neurons in the neocortex - especially in higher areas of the human neocortex - moves the performance of spiking neural network models for computations on network inputs that are temporally dispersed from a fairly low level up to the performance level of the human brain.
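One common way to implement spike frequency adaptation in such spiking network models is a leaky integrate-and-fire neuron with an adaptive threshold. The sketch below uses that mechanism with illustrative parameter values (not the paper's trained network):

```python
import numpy as np

def alif_spikes(I=1.5, tau_m=0.02, tau_a=1.0, beta=0.2, v_th=1.0,
                dt=1e-3, T=2.0):
    """Leaky integrate-and-fire neuron with an adaptive threshold:
    each spike raises the threshold by beta, and the increment a
    decays back toward zero with time constant tau_a."""
    v, a, spikes = 0.0, 0.0, []
    for step in range(int(T / dt)):
        v += dt / tau_m * (I - v)    # leaky integration of constant input
        a += dt / tau_a * (-a)       # threshold increment decays
        if v >= v_th + a:            # adaptive threshold crossed
            v = 0.0
            a += beta
            spikes.append(step * dt)
    return np.array(spikes)

spikes = alif_spikes()
early = np.sum(spikes < 0.5) / 0.5   # firing rate (Hz) in the first 0.5 s
late = np.sum(spikes >= 1.5) / 0.5   # firing rate (Hz) in the last 0.5 s
print(early, late)                    # rate drops under constant drive
```

Under constant drive the firing rate falls from an initial transient to a much lower steady rate; the slowly decaying threshold acts as a seconds-long memory trace of recent activity, which is the property that supports computations on temporally dispersed inputs.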
Affiliation(s)
- Darjan Salaj
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Anand Subramoney
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Ceca Kraisnikovic
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Guillaume Bellec
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Laboratory of Computational Neuroscience, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Wolfgang Maass
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
9
Talyansky S, Brinkman BAW. Dysregulation of excitatory neural firing replicates physiological and functional changes in aging visual cortex. PLoS Comput Biol 2021; 17:e1008620. PMID: 33497380; PMCID: PMC7864437; DOI: 10.1371/journal.pcbi.1008620.
Abstract
The mammalian visual system has been the focus of countless experimental and theoretical studies designed to elucidate principles of neural computation and sensory coding. Most theoretical work has focused on networks intended to reflect developing or mature neural circuitry, in both health and disease. Few computational studies have attempted to model changes that occur in neural circuitry as an organism ages non-pathologically. In this work we contribute to closing this gap, studying how physiological changes correlated with advanced age impact the computational performance of a spiking network model of primary visual cortex (V1). Our results demonstrate that deterioration of homeostatic regulation of excitatory firing, coupled with long-term synaptic plasticity, is a sufficient mechanism to reproduce features of observed physiological and functional changes in neural activity data, specifically declines in inhibition and in selectivity to oriented stimuli. This suggests a potential causal link between dysregulation of neuron firing and age-induced changes in brain physiology and functional performance. While this does not rule out deeper underlying causes or other mechanisms that could give rise to these changes, our approach opens new avenues for exploring these underlying mechanisms in greater depth and making predictions for future experiments.
Affiliation(s)
- Seth Talyansky
- Catlin Gabel School, Portland, Oregon, United States of America
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America
- Braden A. W. Brinkman
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America
10
Rullán Buxó CE, Pillow JW. Poisson balanced spiking networks. PLoS Comput Biol 2020; 16:e1008261. PMID: 33216741; PMCID: PMC7717583; DOI: 10.1371/journal.pcbi.1008261.
Abstract
An important problem in computational neuroscience is to understand how networks of spiking neurons can carry out various computations underlying behavior. Balanced spiking networks (BSNs) provide a powerful framework for implementing arbitrary linear dynamical systems in networks of integrate-and-fire neurons. However, the classic BSN model requires near-instantaneous transmission of spikes between neurons, which is biologically implausible. Introducing realistic synaptic delays leads to a pathological regime known as "ping-ponging", in which different populations spike maximally in alternating time bins, causing network output to overshoot the target solution. Here we document this phenomenon and provide a novel solution: we show that a network can have realistic synaptic delays while maintaining accuracy and stability if neurons are endowed with conditionally Poisson firing. Formally, we propose two alternate formulations of Poisson balanced spiking networks: (1) a "local" framework, which replaces the hard integrate-and-fire spiking rule within each neuron by a "soft" threshold function, such that firing probability grows as a smooth nonlinear function of membrane potential; and (2) a "population" framework, which reformulates the BSN objective function in terms of expected spike counts over the entire population. We show that both approaches offer improved robustness, allowing for accurate implementation of network dynamics with realistic synaptic delays between neurons. Both Poisson frameworks preserve the coding accuracy and robustness to neuron loss of the original model and, moreover, produce positive correlations between similarly tuned neurons, a feature of real neural populations that is not found in the deterministic BSN. This work unifies balanced spiking networks with Poisson generalized linear models and suggests several promising avenues for future research.
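The "local" soft-threshold rule can be sketched as follows: the firing probability in each time bin grows as a smooth function of membrane potential rather than jumping at a hard threshold. The specific sigmoid, rate ceiling, and parameter values below are illustrative assumptions, not the paper's exact nonlinearity:

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_spikes(v, v_th=1.0, gain=10.0, rate_max=500.0, dt=1e-3):
    """Conditionally Poisson spiking: the hard integrate-and-fire
    threshold is replaced by a firing rate that grows as a smooth
    (sigmoidal) function of membrane potential v."""
    rate = rate_max / (1.0 + np.exp(-gain * (v - v_th)))  # soft threshold
    p_spike = 1.0 - np.exp(-rate * dt)   # P(at least one event per bin)
    return rng.random(np.shape(v)) < p_spike

v = np.array([0.5, 0.8, 1.0, 1.2, 1.5])       # membrane potentials
trials = np.stack([soft_spikes(v) for _ in range(20_000)])
print(trials.mean(axis=0))  # empirical spike probability rises smoothly with v
```

Estimating the empirical spike probability over many bins shows it rising smoothly with membrane potential, in contrast to the all-or-none step of the deterministic integrate-and-fire rule; this smoothing is what tames the ping-ponging induced by synaptic delays.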
Affiliation(s)
- Jonathan W. Pillow
- Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, USA