1. Zhao S, Zhou J, Zhang Y, Wang DH. γ and β Band Oscillation in Working Memory Given Sequential or Concurrent Multiple Items: A Spiking Network Model. eNeuro 2023; 10:ENEURO.0373-22.2023. PMID: 37903618; PMCID: PMC10630927; DOI: 10.1523/eneuro.0373-22.2023.
Abstract
Working memory (WM) can maintain sequential and concurrent information, and memory load enhances γ band oscillation during the delay period. To provide a unified account of these phenomena, we investigated a continuous network model consisting of pyramidal cells, high-threshold fast-spiking interneurons (FS), and low-threshold non-fast-spiking interneurons (nFS) for working memory of sequential and concurrent directional cues. Our model exhibits γ (30-100 Hz) and β (10-30 Hz) band oscillations during the retention of both concurrent and sequential cues. We found that the β oscillation results from the interaction between pyramidal cells and nFS, whereas the γ oscillation emerges from the interaction between pyramidal cells and FS because of the strong excitation elicited by cue presentation, shedding light on the mechanism underlying the enhancement of γ power in many cognitive tasks.
Affiliation(s)
- Shukuo Zhao
- School of Systems Science, Beijing Normal University, Beijing 100875, China
- Jinpu Zhou
- School of Systems Science, Beijing Normal University, Beijing 100875, China
- Yongwen Zhang
- School of Systems Science, Beijing Normal University, Beijing 100875, China
- Da-Hui Wang
- School of Systems Science, Beijing Normal University, Beijing 100875, China
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing 100875, China
2. Tian Y, Tan Z, Hou H, Li G, Cheng A, Qiu Y, Weng K, Chen C, Sun P. Theoretical foundations of studying criticality in the brain. Netw Neurosci 2022; 6:1148-1185. PMID: 38800464; PMCID: PMC11117095; DOI: 10.1162/netn_a_00269.
Abstract
Criticality is hypothesized as a physical mechanism underlying efficient transitions between cortical states and remarkable information-processing capacities in the brain. While considerable evidence generally supports this hypothesis, nonnegligible controversies persist regarding the ubiquity of criticality in neural dynamics and its role in information processing. Validity issues frequently arise when identifying potential brain criticality from empirical data. Moreover, the functional benefits implied by brain criticality are frequently misconceived or unduly generalized. These problems stem from the nontriviality and immaturity of the physical theories that analytically derive brain criticality and the statistical techniques that estimate brain criticality from empirical data. To help solve these problems, we present a systematic review and reformulate the foundations of studying brain criticality, that is, ordinary criticality (OC), quasi-criticality (qC), self-organized criticality (SOC), and self-organized quasi-criticality (SOqC), using the terminology of neuroscience. We offer accessible explanations of the physical theories and statistical techniques of brain criticality, providing step-by-step derivations to characterize neural dynamics as a physical system with avalanches. We summarize error-prone details and existing limitations in brain criticality analysis and suggest possible solutions. Moreover, we present a forward-looking perspective on how optimizing the foundations of studying brain criticality can deepen our understanding of various neuroscience questions.
Affiliation(s)
- Yang Tian
- Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
- Laboratory of Advanced Computing and Storage, Central Research Institute, 2012 Laboratories, Huawei Technologies Co. Ltd., Beijing, China
- Zeren Tan
- Institute for Interdisciplinary Information Science, Tsinghua University, Beijing, China
- Hedong Hou
- UFR de Mathématiques, Université de Paris, Paris, France
- Guoqi Li
- Institute of Automation, Chinese Academy of Science, Beijing, China
- University of Chinese Academy of Science, Beijing, China
- Aohua Cheng
- Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Yike Qiu
- Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Kangyu Weng
- Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Chun Chen
- Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
- Pei Sun
- Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
3. Conditional Bistability, a Generic Cellular Mnemonic Mechanism for Robust and Flexible Working Memory Computations. J Neurosci 2018; 38:5209-5219. PMID: 29712783; DOI: 10.1523/jneurosci.1992-17.2017.
Abstract
Persistent neural activity, the substrate of working memory, is thought to emerge from synaptic reverberation within recurrent networks. However, reverberation models do not robustly explain the fundamental dynamics of persistent activity, including high spiking irregularity, large intertrial variability, and state transitions. While cellular bistability may contribute to persistent activity, its rigidity appears incompatible with the labile characteristics of persistent activity. Here, we unravel in a cellular model a form of spike-mediated conditional bistability that is robust and generic and provides a rich repertoire of mnemonic computations. Under the asynchronous synaptic inputs of the awake state, conditional bistability generates spiking/bursting episodes, accounting for the irregularity, variability, and state transitions characterizing persistent activity. This mechanism has likely been overlooked because of the subthreshold input it requires, and we predict how to assess it experimentally. Our results suggest a reexamination of the role of intrinsic properties in the collective network dynamics responsible for flexible working memory.

SIGNIFICANCE STATEMENT This study unravels a novel form of intrinsic neuronal property: conditional bistability. We show that, thanks to its conditional character, conditional bistability favors the emergence of flexible and robust forms of persistent activity in PFC neural networks, in opposition to previously studied classical forms of absolute bistability. Specifically, we demonstrate for the first time that conditional bistability (1) is a generic biophysical spike-dependent mechanism of layer V pyramidal neurons in the PFC and (2) accounts for essential neurodynamical features of the organization and flexibility of PFC persistent activity (the large irregularity and intertrial variability of the discharge and its organization into discrete stable states), which remain unexplained in a robust fashion by current models.
4. van Steenbergen H, Warren CM, Kühn S, de Wit S, Wiers RW, Hommel B. Representational precision in visual cortex reveals outcome encoding and reward modulation during action preparation. Neuroimage 2017; 157:415-428. DOI: 10.1016/j.neuroimage.2017.06.012.
5. Spatial working memory alters the efficacy of input to visual cortex. Nat Commun 2017; 8:15041. PMID: 28447609; PMCID: PMC5414175; DOI: 10.1038/ncomms15041.
Abstract
Prefrontal cortex modulates sensory signals in extrastriate visual cortex, in part via its direct projections from the frontal eye field (FEF), an area involved in selective attention. We find that working memory-related activity is a dominant signal within FEF input to visual cortex. Although this signal alone does not evoke spiking responses in areas V4 and MT during memory, the gain of visual responses in these areas increases, and neuronal receptive fields expand and shift towards the remembered location, improving the stimulus representation by neuronal populations. These results provide a basis for enhancing the representation of working memory targets and implicate persistent FEF activity as a basis for the interdependence of working memory and selective attention.

Frontal eye field (FEF) is a visual prefrontal area involved in top-down attention. Here the authors report that FEF neurons projecting to V4/MT are persistently active during spatial working memory, and V4/MT neurons show changes in receptive field and gain at the location held in working memory.
6.
Abstract
Biology is the study of dynamical systems. Yet most of us working in biology have limited pedagogical training in the theory of dynamical systems, an unfortunate historical fact that can be remedied for future generations of life scientists. In my particular field of systems neuroscience, neural circuits are rife with nonlinearities at all levels of description, rendering simple methodologies and our own intuition unreliable. Therefore, our ideas are likely to be wrong unless informed by good models. These models should be based on the mathematical theories of dynamical systems since functioning neurons are dynamic—they change their membrane potential and firing rates with time. Thus, selecting the appropriate type of dynamical system upon which to base a model is an important first step in the modeling process. This step all too easily goes awry, in part because there are many frameworks to choose from, in part because the sparsely sampled data can be consistent with a variety of dynamical processes, and in part because each modeler has a preferred modeling approach that is difficult to move away from. This brief review summarizes some of the main dynamical paradigms that can arise in neural circuits, with comments on what they can achieve computationally and what signatures might reveal their presence within empirical data. I provide examples of different dynamical systems using simple circuits of two or three cells, emphasizing that any one connectivity pattern is compatible with multiple, diverse functions.
Affiliation(s)
- Paul Miller
- Volen National Center for Complex Systems, Brandeis University, Waltham, Massachusetts, 02454-9110, USA
7. Zhang X, Yi H, Bai W, Tian X. Dynamic trajectory of multiple single-unit activity during working memory task in rats. Front Comput Neurosci 2015; 9:117. PMID: 26441626; PMCID: PMC4585230; DOI: 10.3389/fncom.2015.00117.
Abstract
Working memory plays an important role in complex cognitive tasks. A popular theoretical view is that transient properties of neuronal dynamics underlie cognitive processing. The question raised here is how these transient dynamics evolve during working memory. To address this issue, we investigated the dynamics of multiple single-unit activity in rat medial prefrontal cortex (mPFC) during a Y-maze working memory task. The approach worked by reconstructing state space from delays of the original single-unit firing rate variables, which were further analyzed using kernel principal component analysis (KPCA). Neural trajectories were then obtained to visualize the multiple single-unit activity. Furthermore, the maximal Lyapunov exponent (MLE) was calculated to quantitatively evaluate the neural trajectories during the working memory task. The results showed that neuronal activity produced stable and reproducible neural trajectories in correct trials but irregular trajectories in incorrect trials, which may establish a link between the neurocognitive process and behavioral performance in working memory. The MLEs significantly increased during working memory in correctly performed trials, indicating an increased divergence of the neural trajectories. In incorrect trials, the MLEs were nearly zero and remained unchanged during the task. Taken together, the trial-specific neural trajectory provides an effective way to track the instantaneous state of the neuronal population during the working memory task and offers valuable insights into working memory function. The MLE describes the changes of neural dynamics in working memory and may reflect different neuronal population states in working memory.
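The state-space reconstruction step described in this abstract can be illustrated with a minimal time-delay embedding. The sketch below is not the authors' code: the function name `delay_embed` and the parameter values are illustrative, and a real analysis would choose the embedding dimension and delay from the data before applying KPCA.

```python
def delay_embed(series, dim, tau):
    """Return delay vectors [x[t], x[t+tau], ..., x[t+(dim-1)*tau]],
    the standard reconstruction of a state space from a single
    time series of firing rates."""
    n = len(series) - (dim - 1) * tau
    return [[series[t + k * tau] for k in range(dim)] for t in range(n)]

# One unit's firing rate over six time bins (illustrative values):
rates = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
vectors = delay_embed(rates, dim=3, tau=1)  # four 3-dimensional state vectors
```

Each delay vector is one point of the reconstructed trajectory; stacking the vectors from all units gives the population state that KPCA then projects for visualization.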
Affiliation(s)
- Xiaofan Zhang
- Department of Biomedical Engineering, School of Biomedical Engineering and Technology, Tianjin Medical University Tianjin, China
- Hu Yi
- Department of Biomedical Engineering, School of Biomedical Engineering and Technology, Tianjin Medical University Tianjin, China
- Wenwen Bai
- Department of Biomedical Engineering, School of Biomedical Engineering and Technology, Tianjin Medical University Tianjin, China
- Xin Tian
- Department of Biomedical Engineering, School of Biomedical Engineering and Technology, Tianjin Medical University Tianjin, China
8. Langlois D, Cousineau D, Thivierge JP. Maximum likelihood estimators for truncated and censored power-law distributions show how neuronal avalanches may be misevaluated. Phys Rev E Stat Nonlin Soft Matter Phys 2014; 89:012709. PMID: 24580259; DOI: 10.1103/physreve.89.012709.
Abstract
The coordination of activity amongst populations of neurons in the brain is critical to cognition and behavior. One form of coordinated activity that has been widely studied in recent years is the so-called neuronal avalanche, whereby ongoing bursts of activity follow a power-law distribution. Avalanches that follow a power law are not unique to neuroscience, but arise in a broad range of natural systems, including earthquakes, magnetic fields, biological extinctions, fluid dynamics, and superconductors. Here, we show that common techniques that estimate this distribution fail to take into account important characteristics of the data and may lead to a sizable misestimation of the slope of power laws. We develop an alternative series of maximum likelihood estimators for discrete, continuous, bounded, and censored data. Using numerical simulations, we show that these estimators lead to accurate evaluations of power-law distributions, improving on common approaches. Next, we apply these estimators to recordings of in vitro rat neocortical activity. We show that different estimators lead to marked discrepancies in the evaluation of power-law distributions. These results call into question a broad range of findings that may misestimate the slope of power laws by failing to take into account key aspects of the observed data.
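The baseline behind this line of work is the closed-form maximum likelihood estimator for a continuous power law with a lower cutoff; the paper's contribution is extending estimation to truncated and censored data, which requires numerical likelihood maximization rather than a closed form. A minimal sketch of the baseline estimator (function name and parameters are illustrative, not from the paper):

```python
import random
from math import log

def powerlaw_alpha_mle(samples, xmin):
    """Closed-form MLE for the exponent of a continuous power law
    p(x) ~ x**(-alpha) on [xmin, inf):
        alpha = 1 + n / sum(ln(x_i / xmin)).
    Truncated or censored data instead need a numerical fit."""
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(log(x / xmin) for x in tail)

# Sanity check on synthetic avalanche sizes drawn by inverse-CDF
# sampling from a power law with alpha = 2.5.
random.seed(1)
alpha, xmin = 2.5, 1.0
xs = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(20000)]
est = powerlaw_alpha_mle(xs, xmin)  # close to 2.5 for a sample this large
```

Applying this closed form to data that are actually bounded above (as avalanche sizes are, by the recording array) is exactly the kind of misestimation the paper warns about.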
Affiliation(s)
- Dominic Langlois
- School of Psychology, University of Ottawa, 136 Jean Jacques Lussier, Ottawa, Ontario, Canada K1N 6N5
- Denis Cousineau
- School of Psychology, University of Ottawa, 136 Jean Jacques Lussier, Ottawa, Ontario, Canada K1N 6N5
- J P Thivierge
- School of Psychology, University of Ottawa, 136 Jean Jacques Lussier, Ottawa, Ontario, Canada K1N 6N5
9. Cain N, Barreiro AK, Shadlen M, Shea-Brown E. Neural integrators for decision making: a favorable tradeoff between robustness and sensitivity. J Neurophysiol 2013; 109:2542-2559. PMID: 23446688; PMCID: PMC3653050; DOI: 10.1152/jn.00976.2012.
Abstract
A key step in many perceptual decision tasks is the integration of sensory inputs over time, but fundamental questions remain about how this is accomplished in neural circuits. One possibility is to balance decay modes of membranes and synapses with recurrent excitation. To allow integration over long timescales, however, this balance must be exceedingly precise. The need for fine tuning can be overcome via a “robust integrator” mechanism in which momentary inputs must be above a preset limit to be registered by the circuit. The degree of this limiting embodies a tradeoff between sensitivity to the input stream and robustness against parameter mistuning. Here, we analyze the consequences of this tradeoff for decision-making performance. For concreteness, we focus on the well-studied random dot motion discrimination task and constrain stimulus parameters by experimental data. We show that mistuning feedback in an integrator circuit decreases decision performance but that the robust integrator mechanism can limit this loss. Intriguingly, even for perfectly tuned circuits with no immediate need for a robustness mechanism, including one often does not impose a substantial penalty for decision-making performance. The implication is that robust integrators may be well suited to subserve the basic function of evidence integration in many cognitive tasks. We develop these ideas using simulations of coupled neural units and the mathematics of sequential analysis.
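The "robust integrator" idea, where momentary inputs below a preset limit are simply not registered, can be caricatured in a few lines. This is a deliberately simplified toy, not the authors' coupled-unit circuit model; the threshold `theta` stands in for the "preset limit" in the abstract.

```python
def robust_integrate(inputs, theta):
    """Accumulate evidence, registering only momentary inputs whose
    magnitude exceeds the preset limit `theta`. Setting theta = 0
    recovers a perfect integrator. Larger theta is more robust to
    small drifts from mistuned feedback but discards weak evidence,
    the sensitivity/robustness tradeoff described above."""
    total = 0.0
    for u in inputs:
        if abs(u) >= theta:
            total += u
    return total

# An evidence stream mixing strong samples and weak (sub-limit) noise:
stream = [0.4, -0.05, 0.3, 0.02, -0.6, 0.5]
```

With `theta = 0.1`, the two weak samples (`-0.05`, `0.02`) are ignored, so a small persistent drift of that size would not corrupt the accumulated total, at the cost of losing whatever evidence those samples carried.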
Affiliation(s)
- Nicholas Cain
- Department of Applied Mathematics, University of Washington, Seattle, Washington, USA
10. Neural dynamics and circuit mechanisms of decision-making. Curr Opin Neurobiol 2012; 22:1039-1046. PMID: 23026743; DOI: 10.1016/j.conb.2012.08.006.
Abstract
In this review, I briefly summarize current neurobiological studies of decision-making that bear on two general themes. The first focuses on the nature of neural representation and dynamics in a decision circuit. Experimental and computational results suggest that ramping-to-threshold in the temporal domain and the trajectory of population activity in state space represent a duality of perspectives on a decision process. Moreover, a decision circuit can display several different dynamical regimes, such as the ramping mode and the jumping mode, with distinct defining properties. The second is concerned with the relationship between biologically based mechanistic models and normative-type models. A fruitful interplay between experiments and these models at different levels of abstraction has enabled investigators to pose increasingly refined questions and gain new insights into the neural basis of decision-making. In particular, recent work on multi-alternative decisions suggests that deviations from rational models of choice behavior can be explained by established neural mechanisms.
11. Computational models of decision making: integration, stability, and noise. Curr Opin Neurobiol 2012; 22:1047-1053. PMID: 22591667; DOI: 10.1016/j.conb.2012.04.013.
Abstract
Decision making demands the accumulation of sensory evidence over time. Questions remain about how this occurs, but recent years have seen progress on several fronts. The first concerns when optimal accumulation of evidence coincides with the simplest method of accumulating neural activity: summation over time. The second involves what computations the brain might perform when summation is difficult due to imprecision in neural circuits or is suboptimal due to uncertainty or variability in how evidence arrives. Finally, the third concerns sources of noise in decision circuits. Empirical studies have better constrained the extent of this noise, and modeling work is helping to clarify its possible origins.
12. Dissociation of response variability from firing rate effects in frontal eye field neurons during visual stimulation, working memory, and attention. J Neurosci 2012; 32:2204-2216. PMID: 22323732; DOI: 10.1523/jneurosci.2967-11.2012.
Abstract
Recent studies suggest that trial-to-trial variability of neuronal spiking responses may provide important information about behavioral state. Observed changes in variability during sensory stimulation, attention, motor preparation, and visual discrimination suggest that variability may reflect the engagement of neurons in a behavioral task. We examined changes in spiking variability of frontal eye field (FEF) neurons in a change detection task requiring monkeys to remember a visually cued location and direct attention to that location while ignoring distracters elsewhere. In this task, the firing rates (FRs) of FEF neurons not only continuously reflect the location of the remembered cue and select targets, but also predict detection performance on a trial-by-trial basis. Changes in FEF response variability, as measured by the Fano factor (FF), showed clear dissociations from changes in FR. The FF declined in response to visual stimulation at all tested locations, even in the opposite hemifield, indicating much broader spatial tuning of the FF compared with the FR. Furthermore, despite robust spatial modulation of the FR throughout all epochs of the task, spatial tuning of the FF did not persist throughout the delay period, nor did it show attentional modulation. These results indicate that changes in variability, at least in the FEF, are most effectively driven by visual stimulation, while behavioral engagement is not sufficient. Instead, changes in variability may reflect shifts in the balance between feedforward and recurrent sources of excitatory drive.
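The Fano factor (FF) reported here is the trial-to-trial spike-count variance divided by the mean count. A minimal sketch with toy counts (illustrative values, not data from the study): Poisson-like spiking gives FF near 1, and the stimulus-driven decline described above appears as FF dropping relative to baseline.

```python
def fano_factor(counts):
    """Trial-to-trial Fano factor: variance of spike counts across
    repeated trials divided by their mean count."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # unbiased variance
    return var / mean

# Toy spike counts over repeated trials of the same condition:
baseline = [4, 6, 5, 3, 7, 5, 4, 6]     # more dispersed counts, higher FF
driven   = [10, 10, 11, 9, 10, 10, 11, 9]  # quenched variability, lower FF
```

Note the dissociation emphasized in the abstract: `driven` has a higher mean rate than `baseline` yet a lower Fano factor, so the two measures carry different information.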
13. Webb TJ, Rolls ET, Deco G, Feng J. Noise in attractor networks in the brain produced by graded firing rate representations. PLoS One 2011; 6:e23630. PMID: 21931607; PMCID: PMC3169549; DOI: 10.1371/journal.pone.0023630.
Abstract
Representations in the cortex are often distributed, with graded firing rates in the neuronal populations. The firing rate probability distribution of each neuron to a set of stimuli is often exponential or gamma. In processes in the brain, such as decision-making, that are influenced by the noise produced by the close-to-random spike timings of each neuron for a given mean rate, the noise with this graded type of representation may be larger than with the binary firing rate distribution that is usually investigated. In integrate-and-fire simulations of an attractor decision-making network, we show that the noise is indeed greater, for a given sparseness of the representation, for graded (exponential) than for binary firing rate distributions. The greater noise was measured by faster escape times from the spontaneous firing rate state when the decision cues are applied, which corresponds to faster decision or reaction times. The greater noise was also evident as less stability of the spontaneous firing state before the decision cues are applied. The implication is that spiking-related noise will continue to be a factor that influences processes such as decision-making, signal detection, short-term memory, and memory recall even with the quite large networks found in the cerebral cortex, in which there are several thousand recurrent collateral synapses onto each neuron. The greater noise with graded firing rate distributions has the advantage that it can increase the speed of operation of cortical circuitry.
Affiliation(s)
- Tristan J. Webb
- Department of Computer Science and Complexity Science Centre, University of Warwick, Coventry, United Kingdom
- Edmund T. Rolls
- Oxford Centre for Computational Neuroscience, Oxford, United Kingdom
- Department of Computer Science, University of Warwick, Coventry, United Kingdom
- Gustavo Deco
- Theoretical and Computational Neuroscience, Universitat Pompeu Fabra, Barcelona, Spain
- Jianfeng Feng
- Department of Computer Science and Complexity Science Centre, University of Warwick, Coventry, United Kingdom
14. Cortical attractor network dynamics with diluted connectivity. Brain Res 2011; 1434:212-225. PMID: 21875702; DOI: 10.1016/j.brainres.2011.08.002.
Abstract
The connectivity of the cerebral cortex is diluted, with the probability of excitatory connections between even nearby pyramidal cells rarely more than 0.1, and in the hippocampus 0.04. To investigate the extent to which this diluted connectivity affects the dynamics of attractor networks in the cerebral cortex, we simulated an integrate-and-fire attractor network taking decisions between competing inputs with diluted connectivity of 0.25 or 0.1, and with the same number of synaptic connections per neuron for the recurrent collateral synapses within an attractor population as for full connectivity. The results indicated that there was less spiking-related noise with the diluted connectivity in that the stability of the network when in the spontaneous state of firing increased, and the accuracy of the correct decisions increased. The decision times were a little slower with diluted than with complete connectivity. Given that the capacity of the network is set by the number of recurrent collateral synaptic connections per neuron, on which there is a biological limit, the findings indicate that the stability of cortical networks, and the accuracy of their correct decisions or memory recall operations, can be increased by utilizing diluted connectivity and correspondingly increasing the number of neurons in the network, with little impact on the speed of processing of the cortex. Thus diluted connectivity can decrease cortical spiking-related noise. In addition, we show that the Fano factor for the trial-to-trial variability of the neuronal firing decreases from the spontaneous firing state value when the attractor network makes a decision. This article is part of a Special Issue entitled "Neural Coding".
15. Churchland AK, Kiani R, Chaudhuri R, Wang XJ, Pouget A, Shadlen MN. Variance as a signature of neural computations during decision making. Neuron 2011; 69:818-831. PMID: 21338889; DOI: 10.1016/j.neuron.2010.12.037.
Abstract
Traditionally, insights into neural computation have been furnished by averaged firing rates from many stimulus repetitions or trials. We pursue an analysis of neural response variance to unveil neural computations that cannot be discerned from measures of average firing rate. We analyzed single-neuron recordings from the lateral intraparietal area (LIP), during a perceptual decision-making task. Spike count variance was divided into two components using the law of total variance for doubly stochastic processes: (1) variance of counts that would be produced by a stochastic point process with a given rate, and loosely (2) the variance of the rates that would produce those counts (i.e., "conditional expectation"). The variance and correlation of the conditional expectation exposed several neural mechanisms: mixtures of firing rate states preceding the decision, accumulation of stochastic "evidence" during decision formation, and a stereotyped response at decision end. These analyses help to differentiate among several alternative decision-making models.
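The variance decomposition described above can be sketched with the law of total variance for doubly stochastic processes: total count variance splits into a point-process component (proportional to the mean, with proportionality φ) and the variance of the conditional expectation (VarCE), the rate-driven part. The sketch below is a minimal illustration with made-up counts, not the authors' estimator; the function name `var_ce` and the example values are assumptions.

```python
def var_ce(counts, phi=1.0):
    """Variance of the conditional expectation (VarCE): total spike-count
    variance minus the point-process component phi * mean, following the
    law of total variance Var[N] = E[Var[N | rate]] + Var[E[N | rate]].
    phi = 1 corresponds to Poisson point-process variability."""
    n = len(counts)
    mean = sum(counts) / n
    total_var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return total_var - phi * mean

# Counts that alternate between a low-rate and a high-rate state: most of
# the variance comes from rate fluctuations, so VarCE is large.
counts = [5, 15, 5, 15, 6, 14, 4, 16]
excess = var_ce(counts)  # 204/7 - 10 = 134/7, about 19.1
```

A large VarCE relative to the mean count is the kind of signature the paper uses to expose mixtures of rate states and accumulating "evidence" that average firing rates alone cannot reveal.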
Affiliation(s)
- Anne K Churchland
- Department of Physiology and Biophysics, University of Washington Medical School, National Primate Research Center, Seattle, WA 98195-7290, USA.
16. Hasselmo ME, Brandon MP, Yoshida M, Giocomo LM, Heys JG, Fransen E, Newman EL, Zilli EA. A phase code for memory could arise from circuit mechanisms in entorhinal cortex. Neural Netw 2009; 22:1129-1138. PMID: 19656654; PMCID: PMC2825042; DOI: 10.1016/j.neunet.2009.07.012.
Abstract
Neurophysiological data reveal intrinsic cellular properties that suggest how entorhinal cortical neurons could code memory by the phase of their firing. Potential cellular mechanisms for this phase coding in models of entorhinal function are reviewed. This mechanism for phase coding provides a substrate for modeling the responses of entorhinal grid cells, as well as the replay of neural spiking activity during waking and sleep. Efforts to implement these abstract models in more detailed biophysical compartmental simulations raise specific issues that could be addressed in larger scale population models incorporating mechanisms of inhibition.
Affiliation(s)
- Michael E Hasselmo
- Center for Memory and Brain, Department of Psychology and Program in Neuroscience, Boston University, 2 Cummington Street, Boston, MA 02215, USA.
17.
Abstract
A contiguity effect (the finding that stimuli occurring close together in time become associated with each other) is observed between words separated by several seconds. The traditional account of contiguity effects is that item representations become associated with each other while active in a short-term memory buffer, a limited-capacity store that can hold a small, integral number of items. Participants studied and free recalled 48 lists of words. At the end of the session, participants were given a surprise final free recall test on all of the items from all of the lists. In addition to a standard contiguity effect between items presented at nearby serial positions, we simultaneously observed a contiguity effect between items presented in different lists. This latter contiguity effect extended over several lists, or several hundred seconds, well beyond the range that can be attributed to a buffer holding a small, integral number of items.
18
Wang XJ. Decision making in recurrent neuronal circuits. Neuron 2008; 60:215-34. [PMID: 18957215 PMCID: PMC2710297 DOI: 10.1016/j.neuron.2008.09.034] [Citation(s) in RCA: 396] [Impact Index Per Article: 24.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/19/2008] [Revised: 09/21/2008] [Accepted: 09/23/2008] [Indexed: 11/28/2022]
Abstract
Decision making has recently emerged as a central theme in neurophysiological studies of cognition, and experimental and computational work has led to the proposal of a cortical circuit mechanism of elemental decision computations. This mechanism depends on slow recurrent synaptic excitation balanced by fast feedback inhibition, which not only instantiates attractor states for forming categorical choices but also gives rise to long transients for gradually accumulating evidence in favor of or against alternative options. Such a circuit endowed with reward-dependent synaptic plasticity is able to produce adaptive choice behavior. While decision threshold is a core concept for reaction time tasks, it can be dissociated from a general decision rule. Moreover, perceptual decisions and value-based economic choices are described within a unified framework in which probabilistic choices result from irregular neuronal activity as well as iterative interactions of a decision maker with an uncertain environment or other unpredictable decision makers in a social group.
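The core mechanism, recurrent self-excitation balanced by cross-inhibition, can be caricatured in a two-population rate model. This is a toy sketch in the spirit of the circuit described above, not the paper's spiking model, and every parameter is an illustrative assumption:

```python
import numpy as np

# Two populations excite themselves, inhibit each other, and integrate
# noisy evidence; the network settles into one of two attractor states,
# i.e. a categorical choice.

def decide(coherence=0.25, seed=0, dt=0.001, tau=0.05, steps=3000):
    rng = np.random.default_rng(seed)
    r = np.array([0.1, 0.1])               # population firing rates
    w_self, w_cross = 2.2, -2.0            # recurrent excitation, feedback inhibition
    evidence = np.array([0.5 + coherence, 0.5 - coherence])
    for _ in range(steps):
        inp = w_self * r + w_cross * r[::-1] + evidence
        drive = 1.0 / (1.0 + np.exp(-4.0 * (inp - 1.0)))   # sigmoid f-I curve
        r = r + dt / tau * (-r + drive) + 0.02 * np.sqrt(dt) * rng.standard_normal(2)
        r = np.clip(r, 0.0, 1.0)
    return int(np.argmax(r))               # index of the winning population
```

With strong evidence the favored population reliably wins; near coherence zero, the noise term makes choices probabilistic, the qualitative behavior the abstract describes.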
Affiliation(s)
- Xiao-Jing Wang
- Department of Neurobiology and Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA.
19
Population dynamics: variance and the sigmoid activation function. Neuroimage 2008; 42:147-57. [PMID: 18547818 DOI: 10.1016/j.neuroimage.2008.04.239] [Citation(s) in RCA: 105] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2008] [Revised: 04/08/2008] [Accepted: 04/16/2008] [Indexed: 11/27/2022] Open
Abstract
This paper demonstrates how the sigmoid activation function of neural-mass models can be understood in terms of the variance or dispersion of neuronal states. We use this relationship to estimate the probability density on hidden neuronal states, using non-invasive electrophysiological (EEG) measures and dynamic causal modelling. The importance of implicit variance in neuronal states for neural-mass models of cortical dynamics is illustrated using both synthetic data and real EEG measurements of sensory evoked responses.
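The central relationship can be made concrete: if neuronal states are dispersed around a mean v with standard deviation sigma, the expected proportion of neurons above a firing threshold is the Gaussian CDF, a sigmoid whose slope is set by the dispersion. The numbers below are illustrative, not the paper's fitted values:

```python
import math, random

def population_activation(v_mean, sigma, threshold=0.0):
    """Expected fraction of neurons whose state exceeds threshold
    (Gaussian CDF, i.e. a sigmoid with slope ~ 1/sigma)."""
    return 0.5 * (1.0 + math.erf((v_mean - threshold) / (sigma * math.sqrt(2.0))))

def monte_carlo_activation(v_mean, sigma, threshold=0.0, n=200_000, seed=42):
    """Sampled check: draw neuronal states and count threshold crossings."""
    rng = random.Random(seed)
    return sum(rng.gauss(v_mean, sigma) > threshold for _ in range(n)) / n
```

Larger dispersion flattens the population sigmoid, which is exactly why the variance of hidden states matters for neural-mass dynamics.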
20
Ly C, Tranchina D. Critical analysis of dimension reduction by a moment closure method in a population density approach to neural network modeling. Neural Comput 2007; 19:2032-92. [PMID: 17571938 DOI: 10.1162/neco.2007.19.8.2032] [Citation(s) in RCA: 68] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Computational techniques within the population density function (PDF) framework have provided time-saving alternatives to classical Monte Carlo simulations of neural network activity. Efficiency of the PDF method is lost as the underlying neuron model is made more realistic and the number of state variables increases. In a detailed theoretical and computational study, we elucidate strengths and weaknesses of dimension reduction by a particular moment closure method (Cai, Tao, Shelley, & McLaughlin, 2004; Cai, Tao, Rangan, & McLaughlin, 2006) as applied to integrate-and-fire neurons that receive excitatory synaptic input only. When the unitary postsynaptic conductance event has a single-exponential time course, the evolution equation for the PDF is a partial differential-integral equation in two state variables, voltage and excitatory conductance. In the moment closure method, one approximates the conditional kth centered moment of excitatory conductance given voltage by the corresponding unconditioned moment. The result is a system of k coupled partial differential equations with one state variable, voltage, and k coupled ordinary differential equations. Moment closure at k = 2 works well, and at k = 3 works even better, in the regime of high, dynamically varying synaptic input rates. Both closures break down at lower synaptic input rates. Phase-plane analysis of the k = 2 problem with typical parameters proves that no steady-state solutions exist below a synaptic input rate that gives a firing rate of 59 s⁻¹ in the full 2D problem, and reveals why. Closure at k = 3 fails for similar reasons. Low firing-rate solutions can be obtained only with parameters for the amplitude or kinetics (or both) of the unitary postsynaptic conductance event that are on the edge of the physiological range. We conclude that this dimension-reduction method gives ill-posed problems for a wide range of physiological parameters, and we suggest future directions.
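The closure step can be illustrated in isolation: it replaces the conditional moment E[g² | v] with the unconditional moment E[g²]. The toy check below uses synthetic correlated Gaussian states, not the paper's integrate-and-fire system; it shows the approximation is exact when conductance g and voltage v are uncorrelated and degrades as they become correlated, one intuition for why the closure can break down away from the high-input regime:

```python
import numpy as np

# Compare E[g^2 | v in a high-voltage slice] against E[g^2] for synthetic
# (v, g) pairs with a controllable correlation. The +3.0 shift just keeps
# g positive, conductance-like.

def closure_error(correlation, n=200_000, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    g = correlation * v + np.sqrt(1.0 - correlation**2) * rng.standard_normal(n)
    g = g + 3.0
    high_v = v > 1.0               # condition on a high-voltage slice
    return abs(np.mean(g[high_v] ** 2) - np.mean(g ** 2))
```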
Affiliation(s)
- Cheng Ly
- Courant Institute of Mathematical Sciences, New York University, New York, NY 10012, USA.
21
Carter E, Wang XJ. Cannabinoid-Mediated Disinhibition and Working Memory: Dynamical Interplay of Multiple Feedback Mechanisms in a Continuous Attractor Model of Prefrontal Cortex. Cereb Cortex 2007; 17 Suppl 1:i16-26. [PMID: 17725998 DOI: 10.1093/cercor/bhm103] [Citation(s) in RCA: 42] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/08/2023] Open
Abstract
Recurrent excitation is believed to underlie persistent neural activity observed in the prefrontal cortex and elsewhere during working memory. However, other positive and negative feedback mechanisms, operating on disparate timescales, may also play significant roles in determining the behavior of a working memory circuit. In this study, we examined dynamical interactions of multiple feedback mechanisms in a biophysically based neural model of spatial working memory. In such continuous attractor networks, a self-sustained activity pattern tends to drift randomly, resulting in a decreased accuracy of memory over time. Moreover, attractor states become unstable when spike-frequency adaptation reduces the excitability of persistently firing pyramidal neurons. Here, we show that a slow activity-dependent local disinhibition, namely cannabinoid-dependent depolarization-induced suppression of inhibition (DSI), can counteract these destabilizing effects, rendering working memory function more robust. In addition, the slow DSI effect gives rise to trial-to-trial correlations of memory-guided behavioral responses. On the other hand, computer simulations revealed that a global cannabinoid agonist (mimicking the effect of drug intake) yields the opposite effect. Thus, this work suggests a circuit scenario according to which endogenous DSI is beneficial for, whereas an exogenous drug such as marijuana is detrimental to, working memory and possibly other prefrontal functions.
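The random drift along a continuous attractor has a simple statistical signature that can be reproduced without simulating the network: the remembered location diffuses, so memory error grows as the square root of the delay. The random walk below is a caricature of that mechanism (the diffusion coefficient is arbitrary); in the study above, activity-dependent disinhibition via DSI is argued to counteract exactly this kind of degradation:

```python
import numpy as np

# Memory error after a delay, modeled as a 1D random walk of the stored
# location along the attractor manifold. The cued location is 0.

def memory_error_std(delay_steps, n_trials=20_000, diffusion=0.01, seed=0):
    rng = np.random.default_rng(seed)
    steps = np.sqrt(diffusion) * rng.standard_normal((n_trials, delay_steps))
    final_error = steps.sum(axis=1)
    return final_error.std()
```

Quadrupling the delay doubles the error standard deviation, the diffusive scaling that makes long delays hard for continuous attractor memories.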
Affiliation(s)
- Eugene Carter
- Department of Biology, Brandeis University, Waltham, MA 02454-9110, USA
22
Shafi M, Zhou Y, Quintana J, Chow C, Fuster J, Bodner M. Variability in neuronal activity in primate cortex during working memory tasks. Neuroscience 2007; 146:1082-108. [PMID: 17418956 DOI: 10.1016/j.neuroscience.2006.12.072] [Citation(s) in RCA: 133] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2004] [Revised: 11/22/2006] [Accepted: 12/24/2006] [Indexed: 11/22/2022]
Abstract
Persistent elevated neuronal activity has been identified as the neuronal correlate of working memory. It is generally assumed in the literature and in computational and theoretical models of working memory that memory-cell activity is stable and replicable; however, this assumption may be an artifact of the averaging of data collected across trials, and needs experimental verification. In this study, we introduce a classification scheme to characterize the firing frequency trends of cells recorded from the cortex of monkeys during performance of working memory tasks. We examine the frequency statistics and variability of firing during baseline and memory periods. We also study the behavior of cells on individual trials and across trials, and explore the stability of cellular firing during the memory period. We find that cells from different firing-trend classes possess markedly different statistics. We also find that individual cells show substantial variability in their firing behavior across trials, and that firing frequency also varies markedly over the course of a single trial. Finally, the average frequency distribution is wider, the frequency increases from baseline to memory are smaller in magnitude, and the frequency decreases are larger, than is generally assumed. These results may serve as a guide in the evaluation of current theories of the cortical mechanisms of working memory.
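A standard summary of the across-trial variability this study characterises is the Fano factor (spike-count variance over mean): it equals 1 for a Poisson process and exceeds 1 when the underlying rate itself varies from trial to trial, as the abstract reports. The spike counts below are synthetic, purely to illustrate the measure:

```python
import numpy as np

def fano_factor(counts):
    """Spike-count variance divided by mean, computed across trials."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

rng = np.random.default_rng(0)
poisson_counts = rng.poisson(lam=20.0, size=5000)          # stable rate across trials
trial_rates = rng.gamma(shape=4.0, scale=5.0, size=5000)   # rate varies by trial
variable_counts = rng.poisson(trial_rates)                 # doubly stochastic counts
```

Trial-to-trial rate fluctuations inflate the Fano factor well above the Poisson value of 1, which is one way the kind of instability described above shows up in summary statistics.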
Affiliation(s)
- M Shafi
- Neuropsychiatric Institute, 760 Westwood Plaza, School of Medicine, University of California, Los Angeles, CA 90095-1759, USA
23
Miller P, Wang XJ. Stability of discrete memory states to stochastic fluctuations in neuronal systems. CHAOS (WOODBURY, N.Y.) 2006; 16:026109. [PMID: 16822041 PMCID: PMC3897304 DOI: 10.1063/1.2208923] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/10/2023]
Abstract
Noise can degrade memories by causing transitions from one memory state to another. For any biological memory system to be useful, the time scale of such noise-induced transitions must be much longer than the required duration for memory retention. Using biophysically realistic modeling, we consider two types of memory in the brain: short-term memories maintained by reverberating neuronal activity for a few seconds, and long-term memories maintained by a molecular switch for years. Both systems require persistence of (neuronal or molecular) activity self-sustained by an autocatalytic process and, we argue, both have limited memory lifetimes because of significant fluctuations. We will first discuss a strongly recurrent cortical network model endowed with feedback loops, for short-term memory. Fluctuations are due to highly irregular spike firing, a salient characteristic of cortical neurons. Then, we will analyze a model for long-term memory, based on an autophosphorylation mechanism of calcium/calmodulin-dependent protein kinase II (CaMKII) molecules. There, fluctuations arise from the fact that there are only a small number of CaMKII molecules at each postsynaptic density (putative synaptic memory unit). Our results are twofold. First, we demonstrate analytically and computationally the exponential dependence of stability on the number of neurons in a self-excitatory network, and on the number of CaMKII proteins in a molecular switch. Second, for each of the two systems, we implement graded memory consisting of a group of bistable switches. For the neuronal network we report interesting ramping temporal dynamics as a result of sequentially switching an increasing number of discrete, bistable, units.
The general observation of an exponential increase in memory stability with the system size leads to a trade-off between the robustness of memories (which increases with the size of each bistable unit) and the total amount of information storage (which decreases with increasing unit size), which may be optimized in the brain through biological evolution.
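A back-of-the-envelope version of this scaling: if a memory is stored redundantly in n bistable units, each flipping independently with probability p per time window, losing the memory requires more than half of the units to flip at once. That binomial tail shrinks roughly exponentially in n, so stability grows exponentially with system size. The value of p and the majority rule are illustrative simplifications, not the paper's biophysical model:

```python
from math import comb

# P(more than half of n independent units flip within one time window),
# a crude stand-in for a spontaneous memory-loss event.

def flip_probability(n, p=0.1):
    return sum(comb(n, k) * p**k * (1.0 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))
```

Doubling n drops the loss probability by orders of magnitude, while the number of independently storable memories per fixed resource budget falls, which is the robustness-versus-capacity trade-off the passage describes.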
Affiliation(s)
- Paul Miller
- Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts 02454, USA.