1.
Agetsuma M, Sato I, Tanaka YR, Carrillo-Reid L, Kasai A, Noritake A, Arai Y, Yoshitomo M, Inagaki T, Yukawa H, Hashimoto H, Nabekura J, Nagai T. Activity-dependent organization of prefrontal hub-networks for associative learning and signal transformation. Nat Commun 2023; 14:5996. PMID: 37803014; PMCID: PMC10558457; DOI: 10.1038/s41467-023-41547-5.
Abstract
Associative learning is crucial for adapting to environmental changes. Interactions among neuronal populations involving the dorso-medial prefrontal cortex (dmPFC) are proposed to regulate associative learning, but how these neuronal populations store and process information about the association remains unclear. Here we developed a pipeline for longitudinal two-photon imaging and computational dissection of neural population activities in male mouse dmPFC during fear-conditioning procedures, enabling us to detect learning-dependent changes in the dmPFC network topology. Using regularized regression methods and graphical modeling, we found that fear conditioning drove dmPFC reorganization to generate a neuronal ensemble encoding conditioned responses (CR) characterized by enhanced internal coactivity, functional connectivity, and association with conditioned stimuli (CS). Importantly, neurons strongly responding to unconditioned stimuli during conditioning subsequently became hubs of this novel associative network for the CS-to-CR transformation. Altogether, we demonstrate learning-dependent dynamic modulation of population coding structured on the activity-dependent formation of the hub network within the dmPFC.
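The paper's pipeline rests on regularized regression and graphical modeling; purely to illustrate the hub-network idea from the abstract (not the authors' method), here is a toy sketch, with invented sizes and thresholds, that builds a functional-connectivity graph from binarized activity and ranks neurons by degree:

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_frames = 30, 1000

# Toy binarized activity; neuron 0 is constructed to co-fire with
# neurons 1-5, mimicking a hub of an associative network.
act = (rng.random((n_neurons, n_frames)) < 0.05).astype(float)
act[0] = np.clip(act[1:6].sum(axis=0), 0.0, 1.0)

# Functional connectivity as pairwise Pearson correlation of the traces.
fc = np.corrcoef(act)
np.fill_diagonal(fc, 0.0)

# Threshold into a graph and rank neurons by degree: high-degree nodes
# are hub candidates.
adj = (fc > 0.2).astype(int)
degree = adj.sum(axis=1)
hub = int(np.argmax(degree))
print(hub, degree[hub])
```

Because neuron 0 is built to co-fire with five partners, it should emerge as the top-degree hub in this toy graph.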
Grants
- MEXT | Japan Society for the Promotion of Science (JSPS)
- This study was supported by the Japan Science and Technology Agency, PRESTO (to M.A.), JSPS KAKENHI Grant (grant number JP18K06536, JP18H05144, JP20H05076, JP21H02801, JP22H05081, JP22H05519 to M.A.; JP20H03357, JP20H05073, JP21K18563 to Y.R.T.; JP20H05065, JP22H05080 to A.K.; JP22H05081 to A.N.), JSPS Bilateral Program (JPJSBP1-20199901 to M.A.), AMED (grant number JP19dm0207086 to M.A.; JP21dm0207117 to H.H.), the grant of Joint Research by the National Institutes of Natural Sciences (NINS program No 01112008 and 01112106 to M.A.), and grants from Brain Science Foundation and Shimadzu Foundation to M.A. and the Takeda Science Foundation to A.K. and H.H. Authors declare that they have no competing interests.
Affiliation(s)
- Masakazu Agetsuma
- Division of Homeostatic Development, National Institute for Physiological Sciences, 38 Nishigohnaka Myodaiji-cho, Okazaki, Aichi, 444-8585, Japan.
- Japan Science and Technology Agency, PRESTO, 4-1-8 Honcho, Kawaguchi, Saitama, 332-0012, Japan.
- SANKEN (The Institute of Scientific and Industrial Research), Osaka University, Mihogaoka 8-1, Ibaraki, Osaka, 567-0047, Japan.
- Division of Molecular Design, Research Center for Systems Immunology, Medical Institute of Bioregulation, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka, 812-8582, Japan.
- Quantum Regenerative and Biomedical Engineering Team, Institute for Quantum Life Science, National Institutes for Quantum Science and Technology (QST), Anagawa 4-9-1, Chiba Inage-ku, Chiba, 263-8555, Japan.
- Issei Sato
- Department of Computer Science, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
- Yasuhiro R Tanaka
- Brain Science Institute, Tamagawa University, 6-1-1 Tamagawagakuen, Machida, Tokyo, 194-8610, Japan
- Luis Carrillo-Reid
- Instituto de Neurobiologia, National Autonomous University of Mexico, Boulevard Juriquilla 3001, Juriquilla, Queretaro, CP, 76230, Mexico
- Atsushi Kasai
- Graduate School of Pharmaceutical Sciences, Osaka University, Yamadaoka 1-6, Suita, Osaka, 565-0871, Japan
- Atsushi Noritake
- Division of Behavioral Development, National Institute for Physiological Sciences, 38 Nishigohnaka Myodaiji-cho, Okazaki, Aichi, 444-8585, Japan
- Yoshiyuki Arai
- SANKEN (The Institute of Scientific and Industrial Research), Osaka University, Mihogaoka 8-1, Ibaraki, Osaka, 567-0047, Japan
- Miki Yoshitomo
- Division of Homeostatic Development, National Institute for Physiological Sciences, 38 Nishigohnaka Myodaiji-cho, Okazaki, Aichi, 444-8585, Japan
- Takashi Inagaki
- Division of Homeostatic Development, National Institute for Physiological Sciences, 38 Nishigohnaka Myodaiji-cho, Okazaki, Aichi, 444-8585, Japan
- Hiroshi Yukawa
- Quantum Regenerative and Biomedical Engineering Team, Institute for Quantum Life Science, National Institutes for Quantum Science and Technology (QST), Anagawa 4-9-1, Chiba Inage-ku, Chiba, 263-8555, Japan
- Institute of Nano-Life-Systems, Institutes of Innovation for Future Society, Nagoya University, Furo-cho, Chikusa-ku, Nagoya, 464-8603, Japan
- Hitoshi Hashimoto
- Graduate School of Pharmaceutical Sciences, Osaka University, Yamadaoka 1-6, Suita, Osaka, 565-0871, Japan
- United Graduate School of Child Development, Osaka University, Kanazawa University, Hamamatsu University School of Medicine, Chiba University, and University of Fukui, 2-2 Yamadaoka, Suita, Osaka, 565-0871, Japan
- Division of Bioscience, Institute for Datability Science, Osaka University, 1-8 Yamadaoka, Suita, Osaka, 565-0871, Japan
- Open and Transdisciplinary Research Initiatives, Osaka University, 2-1 Yamadaoka, Suita, Osaka, 565-0871, Japan
- Graduate School of Medicine, Osaka University, 2-2 Yamadaoka, Suita, Osaka, 565-0871, Japan
- Junichi Nabekura
- Division of Homeostatic Development, National Institute for Physiological Sciences, 38 Nishigohnaka Myodaiji-cho, Okazaki, Aichi, 444-8585, Japan
- Takeharu Nagai
- SANKEN (The Institute of Scientific and Industrial Research), Osaka University, Mihogaoka 8-1, Ibaraki, Osaka, 567-0047, Japan
2.
Optimal Population Coding for Dynamic Input by Nonequilibrium Networks. Entropy 2022; 24:e24050598. PMID: 35626482; PMCID: PMC9140425; DOI: 10.3390/e24050598.
Abstract
The efficient coding hypothesis states that neural responses should maximize the information they carry about the external input. Theoretical studies have focused on optimal responses of single neurons and on population codes in networks with weak pairwise interactions. However, more biologically realistic settings with asymmetric connectivity, and the encoding of dynamical stimuli, have not been well characterized. Here, we study the collective response of a kinetic Ising model that encodes dynamic input. We apply a gradient-based method and mean-field approximation to reconstruct networks given the neural code that encodes dynamic input patterns. We measure network asymmetry, decoding performance, and entropy production in networks that generate optimal population codes. We analyze how stimulus correlation, time scale, and reliability of the network affect the optimal encoding networks. Specifically, we find network dynamics altered by the statistics of the dynamic input, identify stimulus-encoding strategies, and show an optimal effective temperature in the asymmetric networks. We further discuss how this approach connects to the Bayesian framework and to continuous recurrent neural networks. Together, these results bridge concepts from nonequilibrium physics with the analysis of dynamics and coding in networks.
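As a minimal sketch of the model class discussed above, a kinetic Ising network with asymmetric couplings can be simulated with parallel Glauber updates under a time-varying field standing in for the dynamic input (sizes, coupling scale, and the sinusoidal stimulus are our own assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 20, 500                      # neurons, time steps

# Asymmetric couplings (J[i, j] != J[j, i] in general) and a shared
# time-varying external field h(t) standing in for the dynamic stimulus.
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
np.fill_diagonal(J, 0.0)
h = 0.5 * np.sin(2 * np.pi * np.arange(T) / 50.0)

# Parallel Glauber dynamics: s_i(t+1) = +1 with prob sigmoid(2 H_i(t)),
# where H_i(t) = sum_j J[i, j] s_j(t) + h(t).
s = np.ones(N)
states = np.empty((T, N))
for t in range(T):
    H = J @ s + h[t]
    p_up = 1.0 / (1.0 + np.exp(-2.0 * H))
    s = np.where(rng.random(N) < p_up, 1.0, -1.0)
    states[t] = s

print(states.shape)
```

The asymmetry of J is what makes the dynamics nonequilibrium: detailed balance holds only when J is symmetric.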
3.
Yang C, Liu X. A Novel Neural Metric Based on Deep Boltzmann Machine. Neural Process Lett 2022. DOI: 10.1007/s11063-022-10810-z.
4.
Identification of Pattern Completion Neurons in Neuronal Ensembles Using Probabilistic Graphical Models. J Neurosci 2021; 41:8577-8588. PMID: 34413204; DOI: 10.1523/jneurosci.0051-21.2021.
Abstract
Neuronal ensembles are groups of neurons with coordinated activity that could represent sensory, motor, or cognitive states. The study of how neuronal ensembles are built, recalled, and involved in guiding complex behaviors has been limited by the lack of experimental and analytical tools to reliably identify and manipulate neurons that can activate entire ensembles. Such pattern completion neurons have also been proposed as key elements of artificial and biological neural networks. Indeed, the relevance of pattern completion neurons is highlighted by growing evidence that targeting them can activate neuronal ensembles and trigger behavior. As a method to reliably detect pattern completion neurons, we use conditional random fields (CRFs), a type of probabilistic graphical model. We apply CRFs to identify pattern completion neurons in ensembles in experiments using in vivo two-photon calcium imaging from primary visual cortex of male mice, and confirm the CRF predictions with two-photon optogenetics. To test the broader applicability of CRFs, we also analyze publicly available calcium imaging data (Allen Institute Brain Observatory dataset) and demonstrate that CRFs can reliably identify neurons that predict specific features of visual stimuli. Finally, to explore the scalability of CRFs, we apply them to in silico network simulations and show that CRF-identified pattern completion neurons have increased functional connectivity. These results demonstrate the potential of CRFs to characterize and selectively manipulate neural circuits.

SIGNIFICANCE STATEMENT We describe a graph theory method to identify and optically manipulate neurons with pattern completion capability in mouse cortical circuits. Using calcium imaging and two-photon optogenetics in vivo, we confirm that key neurons identified by this method can recall entire neuronal ensembles. This method could be broadly applied to manipulate neuronal ensemble activity to trigger behavior or for therapeutic applications in brain prostheses.
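The CRF machinery itself is involved, but the pattern-completion idea it targets can be illustrated with a toy pairwise graphical model (all couplings, biases, and sizes invented here): clamping on a single strongly coupled ensemble member tends to recall the rest of the ensemble under Gibbs sampling.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 8
# Pairwise model: neurons 0-3 form an ensemble with strong mutual
# couplings; neurons 4-7 are uncoupled background units.
J = np.zeros((N, N))
J[:4, :4] = 3.0
np.fill_diagonal(J, 0.0)
b = -1.0 * np.ones(N)                  # negative bias: silent by default

def gibbs_sample(clamp=None, n_sweeps=200):
    """Gibbs-sample binary states, optionally clamping one neuron to 1."""
    s = np.zeros(N)
    for _ in range(n_sweeps):
        for i in range(N):
            if i == clamp:
                s[i] = 1.0
                continue
            p_on = 1.0 / (1.0 + np.exp(-(J[i] @ s + b[i])))
            s[i] = float(rng.random() < p_on)
    return s

# Clamping a single ensemble member on tends to recall the whole
# ensemble -- the signature of a pattern completion neuron.
recalled = gibbs_sample(clamp=0)
print(recalled)
```

With these couplings the all-on ensemble state dominates the conditional distribution, so the background units stay mostly silent while the ensemble is recalled.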
5.
Kandeepan S, Rudas J, Gomez F, Stojanoski B, Valluri S, Owen AM, Naci L, Nichols ES, Soddu A. Modeling an auditory stimulated brain under altered states of consciousness using the generalized Ising model. Neuroimage 2020; 223:117367. PMID: 32931944; DOI: 10.1016/j.neuroimage.2020.117367.
Abstract
Propofol is a short-acting medication that produces decreased levels of consciousness and is used for general anesthesia. Although it is the most commonly used anesthetic in the world, much remains unknown about the mechanisms by which it induces loss of consciousness. Characterizing anesthesia-induced alterations to brain network activity might provide a powerful framework for understanding the neural mechanisms of unconsciousness. The aim of this work was to model brain activity in healthy brains during various stages of consciousness, as induced by propofol, under an auditory paradigm. We used the generalized Ising model (GIM) to fit the empirical fMRI data of healthy subjects while they listened to an audio clip from a movie. The external stimulus (audio clip) is believed to at least partially drive a synchronization of brain activity and to provide a similar conscious experience in different subjects. To observe this common synchronization among subjects, a technique called inter-subject correlation (ISC) was implemented. We showed that the GIM, modified to incorporate the naturalistic external field, was able to fit the empirical task fMRI data in the awake state, in mild sedation, in deep sedation, and in recovery, at a temperature T* well above the critical temperature. To our knowledge, this is the first study that captures human brain activity in response to real-life external stimuli at different levels of conscious awareness using mathematical modeling. This study might help in the future to assess the level of consciousness of patients with disorders of consciousness and to aid them in regaining consciousness.
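Leave-one-out inter-subject correlation reduces to a few lines; the synthetic data below, a shared stimulus-driven signal plus private per-subject noise, is our own stand-in for the fMRI time courses:

```python
import numpy as np

rng = np.random.default_rng(3)
n_subj, n_time = 6, 200
shared = np.sin(np.linspace(0, 8 * np.pi, n_time))   # stimulus-driven part
# Each subject = shared signal + private noise.
data = shared + 0.8 * rng.normal(size=(n_subj, n_time))

# Leave-one-out ISC: correlate each subject with the mean of the others.
isc = np.empty(n_subj)
for s in range(n_subj):
    others = np.delete(data, s, axis=0).mean(axis=0)
    isc[s] = np.corrcoef(data[s], others)[0, 1]

print(isc.mean())
```

Averaging the left-out subjects suppresses their private noise, so the ISC mostly reflects the shared, stimulus-locked component.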
Affiliation(s)
- Sivayini Kandeepan
- Department of Physics and Astronomy and the Brain and Mind Institute, University of Western Ontario, 1151 Richmond St, London, ON, N6A 3K7, Canada; Department of Physics, Faculty of Applied Sciences, University of Sri Jayewardenepura, Nugegoda, Sri Lanka.
- Jorge Rudas
- Department of Mathematics, Universidad Nacional de Colombia, Cra 45, Bogotá, Colombia
- Francisco Gomez
- Department of Mathematics, Universidad Nacional de Colombia, Cra 45, Bogotá, Colombia
- Bobby Stojanoski
- Brain and Mind Institute, University of Western Ontario, 1151 Richmond St, London, Ontario, N6A 3K7, Canada
- Sreeram Valluri
- Department of Physics and Astronomy and the Brain and Mind Institute, University of Western Ontario, 1151 Richmond St, London, ON, N6A 3K7, Canada
- Adrian Mark Owen
- Brain and Mind Institute, University of Western Ontario, 1151 Richmond St, London, Ontario, N6A 3K7, Canada
- Lorina Naci
- Trinity College Institute of Neuroscience, Trinity College Dublin, College Green, Dublin 2, Ireland
- Emily Sophia Nichols
- Brain and Mind Institute, University of Western Ontario, 1151 Richmond St, London, Ontario, N6A 3K7, Canada
- Andrea Soddu
- Department of Physics and Astronomy and the Brain and Mind Institute, University of Western Ontario, 1151 Richmond St, London, ON, N6A 3K7, Canada
6.
Garolini D, Vitalis A, Caflisch A. Unsupervised identification of states from voltage recordings of neural networks. J Neurosci Methods 2019; 318:104-117. PMID: 30807781; DOI: 10.1016/j.jneumeth.2019.01.019.
Abstract
BACKGROUND: Modern techniques for multi-neuronal recording produce large amounts of data. There is no automatic procedure for the identification of states in recurrent voltage patterns.
NEW METHOD: We propose NetSAP (Network States And Pathways), a data-driven analysis method able to recognize multi-neuron voltage patterns (states). To capture the subtle differences between snapshots in voltage recordings, NetSAP infers the underlying functional neural network in a time-resolved manner with a sliding-window approach. NetSAP then identifies states by reordering the time series of inferred networks according to a user-defined metric. The procedure for unsupervised identification of states was originally developed for the analysis of molecular dynamics simulations of proteins.
RESULTS: We tested NetSAP on neural network simulations of GABAergic inhibitory interneurons. Most simulation parameters are chosen to reproduce literature observations, and we keep noise terms as control parameters to regulate the coherence of the simulated signals. NetSAP is able to identify multiple states even in the case of high internal noise and low signal coherence. We provide evidence that NetSAP is robust for networks with up to about 50% of the neurons spiking randomly. NetSAP is scalable and its code is open source.
COMPARISON WITH EXISTING METHODS: NetSAP outperforms common analysis techniques, such as PCA and k-means clustering, on a simulated recording of voltage traces of 50 neurons.
CONCLUSIONS: NetSAP analysis is an efficient tool to identify voltage patterns from neuronal recordings.
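NetSAP's own reordering metric is not reproduced here, but the sliding-window idea, one inferred functional network per window followed by grouping windows into states, can be sketched with plain correlation networks and a crude 2-means step (the kind of baseline the paper compares against; all sizes invented):

```python
import numpy as np

rng = np.random.default_rng(4)
n, T, w = 10, 600, 50                  # neurons, samples, window length

# Toy recording alternating between two "states": in state A the first
# five neurons share a common drive, in state B the last five do.
x = rng.normal(size=(n, T))
drive = rng.normal(size=T)
state = (np.arange(T) // 150) % 2      # blocks of 150 samples
x[:5, state == 0] += drive[state == 0]
x[5:, state == 1] += drive[state == 1]

# Sliding-window functional networks: one vectorized correlation matrix
# per window.
starts = range(0, T - w + 1, w)
iu = np.triu_indices(n, k=1)
nets = np.array([np.corrcoef(x[:, t:t + w])[iu] for t in starts])

# Crude 2-means over the network time series recovers the states
# (one seed centroid taken from each half of the first cycle).
c = nets[[0, 3]].copy()
for _ in range(20):
    d = np.array([((nets - ci) ** 2).sum(axis=1) for ci in c])
    lab = d.argmin(axis=0)
    c = np.array([nets[lab == k].mean(axis=0) for k in (0, 1)])
print(lab)
```

Windows drawn from the same state yield similar correlation matrices, so the labels should track the alternating block structure of the simulation.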
Affiliation(s)
- Davide Garolini
- Department of Biochemistry, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland
- Andreas Vitalis
- Department of Biochemistry, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland
- Amedeo Caflisch
- Department of Biochemistry, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland.
7.
Gardella C, Marre O, Mora T. Modeling the Correlated Activity of Neural Populations: A Review. Neural Comput 2018; 31:233-269. PMID: 30576613; DOI: 10.1162/neco_a_01154.
Abstract
The principles of neural encoding and computation are inherently collective and usually involve large populations of interacting neurons with highly correlated activities. While theories of neural function have long recognized the importance of collective effects in populations of neurons, only in the past two decades has it become possible to record from many cells simultaneously, using advanced experimental techniques with single-spike resolution, and to relate these correlations to function and behavior. This review focuses on the modeling and inference approaches that have recently been developed to describe the correlated spiking activity of populations of neurons. We cover a variety of models describing correlations between pairs of neurons, as well as between larger groups, synchronous or delayed in time, with or without the explicit influence of the stimulus, and including or not latent variables. We discuss the advantages and drawbacks of each method, as well as the computational challenges related to their application to recordings of ever-larger populations.
Affiliation(s)
- Christophe Gardella
- Laboratoire de physique statistique, CNRS, Sorbonne Université, Université Paris-Diderot, and École normale supérieure, 75005 Paris, France, and Institut de la Vision, INSERM, CNRS, and Sorbonne Université, 75012 Paris, France
- Olivier Marre
- Institut de la Vision, INSERM, CNRS, and Sorbonne Université, 75012 Paris, France
- Thierry Mora
- Laboratoire de physique statistique, CNRS, Sorbonne Université, Université Paris-Diderot, and École normale supérieure, 75005 Paris, France
8.
Donner C, Obermayer K, Shimazaki H. Approximate Inference for Time-Varying Interactions and Macroscopic Dynamics of Neural Populations. PLoS Comput Biol 2017; 13:e1005309. PMID: 28095421; PMCID: PMC5283755; DOI: 10.1371/journal.pcbi.1005309.
Abstract
Models from statistical physics, such as the Ising model, offer a convenient way to characterize the stationary activity of neural populations. Such stationary activity may be expected in recordings from in vitro slices or anesthetized animals. However, modeling the activity of cortical circuits in awake animals has been more challenging, because both spike rates and interactions can change with sensory stimulation, behavior, or the internal state of the brain. Previous approaches to modeling the dynamics of neural interactions suffered from high computational cost; therefore, their application was limited to only about a dozen neurons. Here, by introducing multiple analytic approximation methods to a state-space model of neural population activity, we make it possible to estimate dynamic pairwise interactions of up to 60 neurons. More specifically, we applied the pseudolikelihood approximation to the state-space model, and combined it with the Bethe or TAP mean-field approximation to make sequential Bayesian estimation of the model parameters possible. The large-scale analysis allows us to investigate the dynamics of macroscopic properties of neural circuits underlying stimulus processing and behavior. We show that the model accurately estimates the dynamics of network properties such as sparseness, entropy, and heat capacity on simulated data, and demonstrate the utility of these measures by analyzing the activity of monkey V4 neurons as well as a simulated balanced network of spiking neurons.

Simultaneous analysis of large-scale neural populations is necessary to understand the coding principles of neurons, because they process information concertedly. Methods from thermodynamics and statistical mechanics are useful for understanding collective phenomena of interacting elements, and they have been successfully used to understand diverse activity of neurons. However, most analysis methods assume stationary data, in which the activity rates of neurons and their correlations are constant over time. This assumption is easily violated in data recorded from awake animals. Neural correlations likely organize dynamically during behavior and cognition, possibly independently of the modulated activity rates of individual neurons. Recently, several methods were proposed to simultaneously estimate the dynamics of neural interactions, but these methods are applicable to only about 10 neurons. Here, by combining multiple analytic approximation methods, we made it possible to estimate the time-varying interactions of much larger neural populations. The method allows us to trace dynamic macroscopic properties of neural circuits such as sparseness, entropy, and sensitivity. Using these statistics, researchers can now quantify to what extent neurons are correlated or de-correlated, and test whether neural systems are susceptible within a specific behavioral period.
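The pseudolikelihood building block can be sketched for a static Ising model; the paper's state-space version adds sequential Bayesian filtering and mean-field approximations on top. In the sketch below (sizes, coupling scale, and learning rate are our own choices), each row of the coupling matrix is fit by logistic-regression-style gradient ascent on its conditional likelihood:

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 5, 4000

# Ground-truth symmetric couplings, zero fields, Gibbs-sampled data.
J_true = rng.normal(0.0, 0.4, (N, N))
J_true = (J_true + J_true.T) / 2.0
np.fill_diagonal(J_true, 0.0)
s = np.where(rng.random(N) < 0.5, 1.0, -1.0)
samples = np.empty((T, N))
for t in range(T):
    for i in range(N):
        p = 1.0 / (1.0 + np.exp(-2.0 * (J_true[i] @ s)))
        s[i] = 1.0 if rng.random() < p else -1.0
    samples[t] = s

# Pseudolikelihood: for neuron i, P(s_i | rest) = exp(s_i H_i)/2cosh(H_i)
# with H_i = sum_j J[i, j] s_j; the gradient of the log-PL w.r.t. J[i, j]
# is <s_i s_j> - <tanh(H_i) s_j>.
J_hat = np.zeros((N, N))
lr = 0.1
for _ in range(300):
    H = samples @ J_hat.T
    grad = (samples - np.tanh(H)).T @ samples / T
    np.fill_diagonal(grad, 0.0)
    J_hat += lr * grad

err = np.abs(J_hat - J_true).max()
print(err)
```

Because the pseudolikelihood is concave in each row of J, plain gradient ascent converges; with a few thousand samples the recovered couplings should sit close to the ground truth.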
Affiliation(s)
- Christian Donner
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Neural Information Processing Group, Department of Electrical Engineering and Computer Science, Technische Universität Berlin, Berlin, Germany
- Group for Methods of Artificial Intelligence, Department of Electrical Engineering and Computer Science, Technische Universität Berlin, Berlin, Germany
- Klaus Obermayer
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Neural Information Processing Group, Department of Electrical Engineering and Computer Science, Technische Universität Berlin, Berlin, Germany
9.
Schultz SR, Copeland CS, Foust AJ, Quicke P, Schuck R. Advances in two photon scanning and scanless microscopy technologies for functional neural circuit imaging. Proc IEEE 2017; 105:139-157. PMID: 28757657; PMCID: PMC5526632; DOI: 10.1109/jproc.2016.2577380.
Abstract
Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size.
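As a toy version of the calcium-transient detection step mentioned above (the cited algorithms are considerably more sophisticated), the sketch below computes dF/F against a running-percentile baseline and takes merged threshold crossings as event onsets; every constant in it is an assumption:

```python
import numpy as np

rng = np.random.default_rng(6)
T = 2000
t = np.arange(T)

# Synthetic fluorescence trace: slow baseline drift, Gaussian noise, and
# three calcium transients with instantaneous rise and exponential decay.
trace = 100.0 + 5.0 * np.sin(t / 400.0) + rng.normal(0.0, 1.0, T)
for onset in (300, 900, 1500):
    trace[onset:] += 25.0 * np.exp(-(t[onset:] - onset) / 80.0)

# dF/F against a sliding 20th-percentile baseline.
win = 201
padded = np.pad(trace, win // 2, mode="edge")
f0 = np.array([np.percentile(padded[i:i + win], 20) for i in range(T)])
dff = (trace - f0) / f0

# Event onsets: upward crossings of a 10% dF/F threshold, merging
# crossings closer than 100 samples (threshold chatter on the decay).
above = dff > 0.10
cross = np.flatnonzero(above[1:] & ~above[:-1]) + 1
events = [c for k, c in enumerate(cross) if k == 0 or c - cross[k - 1] > 100]
print(events)
```

The running percentile tracks the slow drift without following the transients, so the thresholded dF/F recovers the three planted onsets.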
Affiliation(s)
- Simon R Schultz
- Center for Neurotechnology and Department of Bioengineering, Imperial College London, South Kensington, London SW7 2AZ, UK
- Caroline S Copeland
- Center for Neurotechnology and Department of Bioengineering, Imperial College London, South Kensington, London SW7 2AZ, UK
- Amanda J Foust
- Center for Neurotechnology and Department of Bioengineering, Imperial College London, South Kensington, London SW7 2AZ, UK
- Peter Quicke
- Center for Neurotechnology and Department of Bioengineering, Imperial College London, South Kensington, London SW7 2AZ, UK
- Renaud Schuck
- Center for Neurotechnology and Department of Bioengineering, Imperial College London, South Kensington, London SW7 2AZ, UK
10.
O'Donnell C, Gonçalves JT, Whiteley N, Portera-Cailliau C, Sejnowski TJ. The Population Tracking Model: A Simple, Scalable Statistical Model for Neural Population Data. Neural Comput 2016; 29:50-93. PMID: 27870612; DOI: 10.1162/neco_a_00910.
Abstract
Our understanding of neural population coding has been limited by a lack of analysis methods to characterize spiking data from large populations. The biggest challenge comes from the fact that the number of possible network activity patterns scales exponentially with the number of neurons recorded (2^N for N binary neurons). Here we introduce a new statistical method for characterizing neural population activity that requires semi-independent fitting of only as many parameters as the square of the number of neurons, requiring drastically smaller data sets and minimal computation time. The model works by matching the population rate (the number of neurons synchronously active) and the probability that each individual neuron fires given the population rate. We found that this model can accurately fit synthetic data from up to 1000 neurons. We also found that the model could rapidly decode visual stimuli from neural population data from macaque primary visual cortex about 65 ms after stimulus onset. Finally, we used the model to estimate the entropy of neural population activity in developing mouse somatosensory cortex and, surprisingly, found that it first increases, then decreases during development. This statistical model opens new options for interrogating neural population data and can bolster the use of modern large-scale in vivo Ca2+ and voltage imaging tools.
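The fitting procedure described, matching the population-rate distribution and each neuron's conditional firing probability given that rate, amounts to simple counting. A toy sketch, with binning and sizes of our own choosing:

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 30, 5000
# Toy binary population data with shared excitability fluctuations.
drive = rng.gamma(2.0, 0.05, size=T)           # common time-varying rate
spikes = (rng.random((T, N)) < drive[:, None]).astype(int)

# Fit: (1) distribution of the population count K, (2) probability that
# each neuron is active conditioned on K (rare high counts pooled).
K = spikes.sum(axis=1)
pK = np.bincount(K, minlength=N + 1) / T
bins = np.minimum(K, 10)
p_cond = np.zeros((11, N))
for b in range(11):
    sel = bins == b
    if sel.any():
        p_cond[b] = spikes[sel].mean(axis=0)

print(pK[:5], p_cond.shape)
```

Both ingredients are empirical frequencies, which is why the parameter count stays quadratic in the number of neurons rather than exponential.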
Affiliation(s)
- Cian O'Donnell
- Department of Computer Science, University of Bristol, Bristol BS8 1UB, U.K., and Howard Hughes Medical Institute, Salk Institute for Biological Studies, La Jolla, CA 92037, U.S.A.
- J Tiago Gonçalves
- Salk Institute for Biological Studies, La Jolla, CA 92037, U.S.A., and Departments of Neurology and Neurobiology, David Geffen School of Medicine at UCLA, Los Angeles, CA 90095, U.S.A.
- Nick Whiteley
- School of Mathematics, University of Bristol, Bristol BS8 1UB, U.K.
- Carlos Portera-Cailliau
- Departments of Neurology and Neurobiology, David Geffen School of Medicine at UCLA, Los Angeles, CA 90095, U.S.A.
- Terrence J Sejnowski
- Howard Hughes Medical Institute, Salk Institute for Biological Studies, La Jolla, CA 92037, U.S.A., and Division of Biological Sciences, University of California at San Diego, La Jolla, CA 92093, U.S.A.
11.
Wohrer A, Machens CK. On the number of neurons and time scale of integration underlying the formation of percepts in the brain. PLoS Comput Biol 2015; 11:e1004082. PMID: 25793393; PMCID: PMC4368836; DOI: 10.1371/journal.pcbi.1004082.
Abstract
All of our perceptual experiences arise from the activity of neural populations. Here we study the formation of such percepts under the assumption that they emerge from a linear readout, i.e., a weighted sum of the neurons’ firing rates. We show that this assumption constrains the trial-to-trial covariance structure of neural activities and animal behavior. The predicted covariance structure depends on the readout parameters, and in particular on the temporal integration window w and the typical number of neurons K used in the formation of the percept. Using these predictions, we show how to infer the readout parameters from joint measurements of a subject’s behavior and neural activities. We consider three such scenarios: (1) recordings from the complete neural population, (2) recordings of neuronal sub-ensembles whose size exceeds K, and (3) recordings of neuronal sub-ensembles smaller than K. Using theoretical arguments and artificially generated data, we show that the first two scenarios allow us to recover the typical spatial and temporal scales of the readout. In the third scenario, we show that the readout parameters can only be recovered by making additional assumptions about the structure of the full population activity. Our work provides the first thorough interpretation of (feed-forward) percept formation from a population of sensory neurons. We discuss applications to experimental recordings in classic sensory decision-making tasks, which will hopefully provide new insights into the nature of perceptual integration.

This article deals with the interpretation of neural activities during perceptual decision-making tasks, in which animals must assess the value of a sensory stimulus and make a decision on the basis of their percept. A “standard model” for these tasks has progressively emerged, in which the animal’s percept and subsequent choice on each trial are obtained from a linear integration of the activity of sensory neurons. However, to date, there has been no principled method to estimate the parameters of this model: chiefly, the typical number of neurons K in the population involved in conveying the percept, and the typical time scale w over which these neurons’ activities are integrated. In this article, we propose a novel method to estimate these quantities from experimental data, and thus to assess the validity of the standard model of percept formation. In the process, we clarify the predictions of the standard model for two classic experimental measures in these tasks: sensitivity, the animal’s ability to distinguish nearby stimulus values, and choice signals, which assess the correlation between the activity of single neurons and the animal’s ultimate choice on each trial.
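The linear-readout assumption is easy to sketch: form the percept as a uniform weighted sum of K neurons over a window of w time bins, and note that trial-to-trial covariance then yields choice-signal-like correlations (toy sizes and noise levels are ours):

```python
import numpy as np

rng = np.random.default_rng(8)
n_neurons, n_trials, T = 50, 400, 20
K, w = 10, 5                      # readout pool size, integration window

# Trial-to-trial shared fluctuations plus private noise per neuron/bin.
common = 0.5 * rng.normal(size=(n_trials, 1, 1))
rates = 5.0 + common + rng.normal(size=(n_trials, n_neurons, T))

# Percept: uniform linear readout of K neurons over the last w time bins.
percept = rates[:, :K, -w:].mean(axis=(1, 2))

# Choice-signal analogue: correlation of each neuron's trial-averaged
# rate with the percept; readout neurons correlate slightly more.
cc = np.array([np.corrcoef(rates[:, i].mean(axis=1), percept)[0, 1]
               for i in range(n_neurons)])
print(cc[:K].mean(), cc[K:].mean())
```

All neurons show a positive correlation through the shared fluctuation, but only the K pooled neurons pick up an extra contribution from their own private noise, which is the kind of covariance signature the inference in the paper exploits.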
Affiliation(s)
- Adrien Wohrer
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure, Paris, France
- Christian K. Machens
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure, Paris, France
- Champalimaud Neuroscience Programme, Champalimaud Centre for the Unknown, Lisbon, Portugal
12
Köster U, Sohl-Dickstein J, Gray CM, Olshausen BA. Modeling higher-order correlations within cortical microcolumns. PLoS Comput Biol 2014; 10:e1003684. [PMID: 24991969 PMCID: PMC4081002 DOI: 10.1371/journal.pcbi.1003684] [Citation(s) in RCA: 41] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/29/2013] [Accepted: 05/08/2014] [Indexed: 11/19/2022] Open
Abstract
We statistically characterize the population spiking activity obtained from simultaneous recordings of neurons across all layers of a cortical microcolumn. Three types of models are compared: an Ising model which captures pairwise correlations between units, a Restricted Boltzmann Machine (RBM) which allows for modeling of higher-order correlations, and a semi-Restricted Boltzmann Machine which is a combination of Ising and RBM models. Model parameters were estimated in a fast and efficient manner using minimum probability flow, and log likelihoods were compared using annealed importance sampling. The higher-order models reveal localized activity patterns which reflect the laminar organization of neurons within a cortical column. The higher-order models also outperformed the Ising model in log-likelihood: On populations of 20 cells, the RBM had 10% higher log-likelihood (relative to an independent model) than a pairwise model, increasing to 45% gain in a larger network with 100 spatiotemporal elements, consisting of 10 neurons over 10 time steps. We further removed the need to model stimulus-induced correlations by incorporating a peri-stimulus time histogram term, in which case the higher order models continued to perform best. These results demonstrate the importance of higher-order interactions to describe the structure of correlated activity in cortical networks. Boltzmann Machines with hidden units provide a succinct and effective way to capture these dependencies without increasing the difficulty of model estimation and evaluation.
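For small populations, the pairwise (Ising) model this abstract compares against can be written down and normalized exactly by enumerating all 2^N spike patterns. The sketch below uses invented parameters and is not the fitted models from the paper, which rely on minimum probability flow precisely because this enumeration is infeasible at scale:

```python
import itertools
import numpy as np

# Ising (pairwise maximum-entropy) model over N binary neurons:
#   P(s) ∝ exp(h·s + 0.5 s^T J s),  with s_i in {0, 1}
N = 5
rng = np.random.default_rng(1)
h = rng.normal(scale=0.5, size=N)          # per-neuron biases
J = rng.normal(scale=0.3, size=(N, N))
J = (J + J.T) / 2                          # symmetric pairwise couplings
np.fill_diagonal(J, 0.0)                   # no self-coupling

def log_weight(s):
    s = np.asarray(s, dtype=float)
    return h @ s + 0.5 * s @ J @ s

# Exact partition function: feasible only for small N (2^N terms).
patterns = list(itertools.product([0, 1], repeat=N))
Z = sum(np.exp(log_weight(s)) for s in patterns)

def prob(s):
    return np.exp(log_weight(s)) / Z

total = sum(prob(s) for s in patterns)     # sanity check: sums to 1
```

An RBM or semi-restricted Boltzmann machine adds hidden units to this energy function, which is what lets it capture the higher-order (beyond pairwise) dependencies the abstract reports.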
Affiliation(s)
- Urs Köster
- Redwood Center for Theoretical Neuroscience, University of California, Berkeley, Berkeley, California, United States of America
- Jascha Sohl-Dickstein
- Department of Applied Physics, Stanford University and Khan Academy, Palo Alto, California, United States of America
- Charles M. Gray
- Department of Cell Biology and Neuroscience, Montana State University, Bozeman, Montana, United States of America
- Bruno A. Olshausen
- Redwood Center for Theoretical Neuroscience, University of California, Berkeley, Berkeley, California, United States of America
13
Hamilton LS, Sohl-Dickstein J, Huth AG, Carels VM, Deisseroth K, Bao S. Optogenetic activation of an inhibitory network enhances feedforward functional connectivity in auditory cortex. Neuron 2014; 80:1066-76. [PMID: 24267655 DOI: 10.1016/j.neuron.2013.08.017] [Citation(s) in RCA: 72] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/19/2013] [Indexed: 11/17/2022]
Abstract
The mammalian neocortex is a highly interconnected network of different types of neurons organized into both layers and columns. Overlaid on this structural organization is a pattern of functional connectivity that can be rapidly and flexibly altered during behavior. Parvalbumin-positive (PV+) inhibitory neurons, which are implicated in cortical oscillations and can change neuronal selectivity, may play a pivotal role in these dynamic changes. We found that optogenetic activation of PV+ neurons in the auditory cortex enhanced feedforward functional connectivity in the putative thalamorecipient circuit and in cortical columnar circuits. In contrast, stimulation of PV+ neurons induced no change in connectivity between sites in the same layers. The activity of PV+ neurons may thus serve as a gating mechanism to enhance feedforward, but not lateral or feedback, information flow in cortical circuits. Functionally, it may preferentially enhance the contribution of bottom-up sensory inputs to perception.
Affiliation(s)
- Liberty S Hamilton
- Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA 94720, USA
14
Snyder AC, Morais MJ, Smith MA. Variance in population firing rate as a measure of slow time-scale correlation. Front Comput Neurosci 2013; 7:176. [PMID: 24367326 PMCID: PMC3853880 DOI: 10.3389/fncom.2013.00176] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/12/2013] [Accepted: 11/20/2013] [Indexed: 11/13/2022] Open
Abstract
Correlated variability in the spiking responses of pairs of neurons, also known as spike count correlation, is a key indicator of functional connectivity and a critical factor in population coding. Underscoring the importance of correlation as a measure for cognitive neuroscience research is the observation that spike count correlations are not fixed, but are rather modulated by perceptual and cognitive context. Yet while this context fluctuates from moment to moment, correlation must be calculated over multiple trials. This property undermines its utility as a dependent measure for investigations of cognitive processes which fluctuate on a trial-to-trial basis, such as selective attention. A measure of functional connectivity that can be assayed on a moment-to-moment basis is needed to investigate the single-trial dynamics of populations of spiking neurons. Here, we introduce the measure of population variance in normalized firing rate for this goal. We show using mathematical analysis, computer simulations and in vivo data how population variance in normalized firing rate is inversely related to the latent correlation in the population, and how this measure can be used to reliably classify trials from different typical correlation conditions, even when firing rate is held constant. We discuss the potential advantages for using population variance in normalized firing rate as a dependent measure for both basic and applied neuroscience research.
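The inverse relation between latent correlation and across-neuron variance of normalized rates can be reproduced in a toy simulation. The assumptions here are illustrative, not the paper's: Gaussian unit-variance rates driven by a single shared latent factor, with invented sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_trials = 100, 500

def pop_variance(latent_corr):
    """Across-neuron variance of normalized rates, averaged over trials,
    for a toy population sharing one latent factor with the given
    pairwise correlation."""
    shared = rng.normal(size=(n_trials, 1))               # common factor
    private = rng.normal(size=(n_trials, n_neurons))      # per-neuron noise
    rates = (np.sqrt(latent_corr) * shared
             + np.sqrt(1.0 - latent_corr) * private)      # unit variance
    return rates.var(axis=1).mean()

low_corr, high_corr = pop_variance(0.05), pop_variance(0.5)
```

When neurons covary strongly they move up and down together on each trial, so the spread across neurons shrinks (here from roughly 0.95 to roughly 0.5), which is what makes this quantity usable as a single-trial proxy for the latent correlation.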
Affiliation(s)
- Adam C Snyder
- Department of Ophthalmology, University of Pittsburgh, Pittsburgh, PA, USA; Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, PA, USA
- Michael J Morais
- Department of Ophthalmology, University of Pittsburgh, Pittsburgh, PA, USA; Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA
- Matthew A Smith
- Department of Ophthalmology, University of Pittsburgh, Pittsburgh, PA, USA; Center for the Neural Basis of Cognition, University of Pittsburgh, Pittsburgh, PA, USA; Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA; Fox Center for Vision Restoration, University of Pittsburgh, Pittsburgh, PA, USA
15
Haslinger R, Ba D, Galuske R, Williams Z, Pipa G. Missing mass approximations for the partition function of stimulus driven Ising models. Front Comput Neurosci 2013; 7:96. [PMID: 23898262 PMCID: PMC3721091 DOI: 10.3389/fncom.2013.00096] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/07/2013] [Accepted: 06/24/2013] [Indexed: 11/13/2022] Open
Abstract
Ising models are routinely used to quantify the second order, functional structure of neural populations. With some recent exceptions, they generally do not include the influence of time varying stimulus drive. Yet if the dynamics of network function are to be understood, time varying stimuli must be taken into account. Inclusion of stimulus drive carries a heavy computational burden because the partition function becomes stimulus dependent and must be separately calculated for all unique stimuli observed. This potentially increases computation time by the length of the data set. Here we present an extremely fast, yet simply implemented, method for approximating the stimulus dependent partition function in minutes or seconds. Noting that the most probable spike patterns (which are few) occur in the training data, we sum partition function terms corresponding to those patterns explicitly. We then approximate the sum over the remaining patterns (which are improbable, but many) by casting it in terms of the stimulus modulated missing mass (total stimulus dependent probability of all patterns not observed in the training data). We use a product of conditioned logistic regression models to approximate the stimulus modulated missing mass. This method has a complexity of roughly O(L N N_pat), where L is the data length, N the number of neurons, and N_pat the number of unique patterns in the data, contrasting with the O(L 2^N) complexity of alternate methods. Using multiple unit recordings from rat hippocampus, macaque DLPFC and cat Area 18, we demonstrate that our method requires orders of magnitude less computation time than Monte Carlo methods and can approximate the stimulus driven partition function more accurately than either Monte Carlo methods or deterministic approximations. This advance allows stimuli to be easily included in Ising models, making them suitable for studying population based stimulus encoding.
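The core idea — sum the few high-probability patterns exactly and estimate the "missing mass" of everything unobserved — can be illustrated on a toy independent-neuron model. This is a sketch only: the paper estimates the missing mass with conditioned logistic regression, whereas this example substitutes a simple Good-Turing-style estimate, and all parameters are invented:

```python
import itertools
import numpy as np

N, n_samples = 10, 200
rng = np.random.default_rng(3)
h = rng.normal(size=N)                     # invented per-neuron biases

def weight(s):
    """Unnormalized probability of binary pattern s."""
    return np.exp(h @ np.asarray(s, dtype=float))

# Ground truth for comparison (feasible only because N is tiny).
patterns = list(itertools.product([0, 1], repeat=N))
weights_all = np.array([weight(s) for s in patterns])
Z_exact = weights_all.sum()

# "Training data": samples from the model; only these patterns are observed.
idx = rng.choice(len(patterns), size=n_samples, p=weights_all / Z_exact)
observed = {patterns[i] for i in idx}
Z_observed = sum(weight(s) for s in observed)

# Good-Turing-style missing mass: the fraction of samples seen exactly once
# estimates the total probability of all patterns never observed.
counts = np.bincount(idx, minlength=len(patterns))
missing_mass = (counts == 1).sum() / n_samples
Z_approx = Z_observed / max(1.0 - missing_mass, 1e-6)
```

Summing only the observed patterns underestimates Z; dividing by the estimated observed coverage (1 − missing mass) corrects for the many improbable patterns that never appeared in the training data.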
Affiliation(s)
- Robert Haslinger
- Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, MA, USA; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
16
Banerjee A, Dean HL, Pesaran B. Parametric models to relate spike train and LFP dynamics with neural information processing. Front Comput Neurosci 2012; 6:51. [PMID: 22837745 PMCID: PMC3403111 DOI: 10.3389/fncom.2012.00051] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2012] [Accepted: 07/03/2012] [Indexed: 11/28/2022] Open
Abstract
Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set in which significant correlations had previously been obtained only through trial averaging.
We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
Affiliation(s)
- Arpan Banerjee
- *Correspondence: Arpan Banerjee, Center for Neural Science, New York University, 4 Washington Place, Room 809, New York, NY 10003, USA.
17
Vasquez JC, Marre O, Palacios A, Berry M, Cessac B. Gibbs distribution analysis of temporal correlations structure in retina ganglion cells. J Physiol Paris 2012; 106:120-7. [PMID: 22115900 PMCID: PMC3424736 DOI: 10.1016/j.jphysparis.2011.11.001] [Citation(s) in RCA: 34] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/15/2011] [Revised: 09/24/2011] [Accepted: 11/03/2011] [Indexed: 11/18/2022]
Abstract
We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike train statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina, in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or 1-time-step Markov models (Marre et al., 2009) in describing the statistics of spatio-temporal spike patterns, and emphasizes the role of higher-order spatio-temporal interactions.
Affiliation(s)
- J. C. Vasquez
- NeuroMathComp team (INRIA, ENS Paris, UNSA LJAD), Sophia Antipolis, France. INRIA, 2004 Route des Lucioles, 06902 Sophia-Antipolis, France.
- O. Marre
- Department of Molecular Biology and Princeton Neuroscience Institute, Princeton University, USA
- A.G. Palacios
- Centro Interdisciplinario de Neurociencia de Valparaiso, Universidad de Valparaiso, Chile
- M.J. Berry
- Centro Interdisciplinario de Neurociencia de Valparaiso, Universidad de Valparaiso, Chile
- B. Cessac
- NeuroMathComp team (INRIA, ENS Paris, UNSA LJAD), Sophia Antipolis, France. INRIA, 2004 Route des Lucioles, 06902 Sophia-Antipolis, France.
18
Schwartz G, Macke J, Amodei D, Tang H, Berry MJ. Low error discrimination using a correlated population code. J Neurophysiol 2012; 108:1069-88. [PMID: 22539825 DOI: 10.1152/jn.00564.2011] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
We explored the manner in which spatial information is encoded by retinal ganglion cell populations. We flashed a set of 36 shape stimuli onto the tiger salamander retina and used different decoding algorithms to read out information from a population of 162 ganglion cells. We compared the discrimination performance of linear decoders, which ignore correlation induced by common stimulation, with nonlinear decoders, which can accurately model these correlations. Similar to previous studies, decoders that ignored correlation suffered only a modest drop in discrimination performance for groups of up to ∼30 cells. However, for more realistic groups of 100+ cells, we found order-of-magnitude differences in the error rate. We also compared decoders that used only the presence of a single spike from each cell with more complex decoders that included information from multiple spike counts and multiple time bins. More complex decoders substantially outperformed simpler decoders, showing the importance of spike timing information. Particularly effective was the first spike latency representation, which allowed zero discrimination errors for the majority of shape stimuli. Furthermore, the performance of nonlinear decoders showed even greater enhancement compared with linear decoders for these complex representations. Finally, decoders that approximated the correlation structure in the population by matching all pairwise correlations with a maximum entropy model fit to all 162 neurons were quite successful, especially for the spike latency representation. Together, these results suggest a picture in which linear decoders allow a coarse categorization of shape stimuli, whereas nonlinear decoders, which take advantage of both correlation and spike timing, are needed to achieve high-fidelity discrimination.
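A minimal version of the first-spike-latency representation the abstract highlights: each stimulus evokes a characteristic latency pattern across cells, and a nearest-template decoder reads the stimulus back out. This is a hypothetical toy, not the paper's decoders; the cell counts, latencies, and jitter are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells, n_stim, n_trials = 20, 3, 50

# Each stimulus has a characteristic mean first-spike latency per cell (ms).
templates = rng.uniform(20.0, 120.0, size=(n_stim, n_cells))

def simulate_trial(stim):
    """Single-trial latencies: the stimulus template plus timing jitter."""
    return templates[stim] + rng.normal(scale=5.0, size=n_cells)

def decode(latencies):
    """Nearest-template decoder in Euclidean distance."""
    return int(np.argmin(((templates - latencies) ** 2).sum(axis=1)))

correct = sum(decode(simulate_trial(s)) == s
              for s in range(n_stim) for _ in range(n_trials))
accuracy = correct / (n_stim * n_trials)
```

With latency differences much larger than the jitter, discrimination is nearly error-free — the regime in which the abstract reports zero errors for most shapes. The paper's nonlinear decoders go further by also modeling the correlations this toy ignores.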
Affiliation(s)
- Greg Schwartz
- Department of Molecular Biology and the Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
19
State-space analysis of time-varying higher-order spike correlation for multiple neural spike train data. PLoS Comput Biol 2012; 8:e1002385. [PMID: 22412358 PMCID: PMC3297562 DOI: 10.1371/journal.pcbi.1002385] [Citation(s) in RCA: 62] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2011] [Accepted: 12/28/2011] [Indexed: 11/23/2022] Open
Abstract
Precise spike coordination between the spiking activities of multiple neurons is suggested as an indication of coordinated network activity in active cell assemblies. Spike correlation analysis aims to identify such cooperative network activity by detecting excess spike synchrony in simultaneously recorded multiple neural spike sequences. Cooperative activity is expected to organize dynamically during behavior and cognition; therefore currently available analysis techniques must be extended to enable the estimation of multiple time-varying spike interactions between neurons simultaneously. In particular, new methods must take advantage of the simultaneous observations of multiple neurons by addressing their higher-order dependencies, which cannot be revealed by pairwise analyses alone. In this paper, we develop a method for estimating time-varying spike interactions by means of a state-space analysis. Discretized parallel spike sequences are modeled as multi-variate binary processes using a log-linear model that provides a well-defined measure of higher-order spike correlation in an information geometry framework. We construct a recursive Bayesian filter/smoother for the extraction of spike interaction parameters. This method can simultaneously estimate the dynamic pairwise spike interactions of multiple single neurons, thereby extending the Ising/spin-glass model analysis of multiple neural spike train data to a nonstationary analysis. Furthermore, the method can estimate dynamic higher-order spike interactions. To validate the inclusion of the higher-order terms in the model, we construct an approximation method to assess the goodness-of-fit to spike data. In addition, we formulate a test method for the presence of higher-order spike correlation even in nonstationary spike data, e.g., data from awake behaving animals. The utility of the proposed methods is tested using simulated spike data with known underlying correlation dynamics. 
Finally, we apply the methods to neural spike data simultaneously recorded from the motor cortex of an awake monkey and demonstrate that the higher-order spike correlation organizes dynamically in relation to a behavioral demand.
Nearly half a century ago, the Canadian psychologist D. O. Hebb postulated the formation of assemblies of tightly connected cells in cortical recurrent networks through changes in synaptic weight (Hebb's learning rule) driven by repetitive sensory stimulation of the network. Consequently, the activation of such an assembly for processing sensory or behavioral information is likely to be expressed by precisely coordinated spiking activities of the participating neurons. However, the available analysis techniques for multiple parallel neural spike data do not allow us to reveal the detailed structure of transiently active assemblies as indicated by their dynamical pairwise and higher-order spike correlations. Here, we construct a state-space model of dynamic spike interactions, and present a recursive Bayesian method that makes it possible to trace multiple neurons exhibiting such precisely coordinated spiking activities in a time-varying manner. We also formulate a hypothesis test of the underlying dynamic spike correlation, which enables us to detect the assemblies activated in association with behavioral events. Therefore, the proposed method can serve as a useful tool to test Hebb's cell assembly hypothesis.
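The log-linear measure of higher-order correlation used here can be written out explicitly for three neurons. The sketch below is a static toy with invented θ values; the paper's contribution is making these θ's time-varying via a state-space filter/smoother:

```python
import itertools
import math

# Log-linear model for three binary neurons (information-geometry form):
#   log P(s) = Σ θ_i s_i + Σ θ_ij s_i s_j + θ_123 s_1 s_2 s_3 − ψ
theta_1 = [-0.5, -0.5, -0.5]                       # first-order terms
theta_2 = {(0, 1): 0.3, (0, 2): 0.3, (1, 2): 0.3}  # pairwise interactions
theta_123 = 1.0                                    # triple-wise interaction

def log_unnorm(s):
    val = sum(t * s[i] for i, t in enumerate(theta_1))
    val += sum(t * s[i] * s[j] for (i, j), t in theta_2.items())
    val += theta_123 * s[0] * s[1] * s[2]
    return val

patterns = list(itertools.product([0, 1], repeat=3))
psi = math.log(sum(math.exp(log_unnorm(s)) for s in patterns))  # log-partition

def P(s):
    return math.exp(log_unnorm(s) - psi)

# theta_123 > 0 boosts triple synchrony beyond what the pairwise terms
# predict: exactly the higher-order effect a pairwise (Ising) model misses.
p_sync = P((1, 1, 1))
total = sum(P(s) for s in patterns)
```

Setting θ_123 to zero reduces this to a pairwise (Ising) model, which is why testing θ_123 against zero, as the abstract's hypothesis test does for the time-varying case, probes genuinely higher-order correlation.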