1. Zdeblick DN, Shea-Brown ET, Witten DM, Buice MA. Modeling functional cell types in spike train data. PLoS Comput Biol 2023; 19:e1011509. PMID: 37824442; PMCID: PMC10569560; DOI: 10.1371/journal.pcbi.1011509.
Abstract
A major goal of computational neuroscience is to build accurate models of the activity of neurons that can be used to interpret their function in circuits. Here, we explore using functional cell types to refine single-cell models by grouping them into functionally relevant classes. Formally, we define a hierarchical generative model for cell types, single-cell parameters, and neural responses, and then derive an expectation-maximization algorithm with variational inference that maximizes the likelihood of the neural recordings. We apply this "simultaneous" method to estimate cell types and fit single-cell models from simulated data, and find that it accurately recovers the ground truth parameters. We then apply our approach to in vitro neural recordings from neurons in mouse primary visual cortex, and find that it yields improved prediction of single-cell activity. We demonstrate that the discovered cell-type clusters are well separated and generalizable, and thus amenable to interpretation. We then compare discovered cluster memberships with locational, morphological, and transcriptomic data. Our findings reveal the potential to improve models of neural responses by explicitly allowing for shared functional properties across neurons.
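The alternating structure of the expectation-maximization step can be illustrated with a toy stand-in, not the paper's variational algorithm: assume each neuron's single-cell parameter has already been reduced to a scalar, and model cell types as a one-dimensional Gaussian mixture. The E-step computes each neuron's responsibility for each type; the M-step re-estimates type means, variances, and proportions. All values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "single-cell parameters": two functional cell types with different means.
theta = np.concatenate([rng.normal(-2.0, 0.5, 100), rng.normal(2.0, 0.5, 100)])

K = 2
mu = np.array([-1.0, 1.0])   # cluster means (initial guess)
sigma = np.ones(K)           # cluster standard deviations
pi = np.ones(K) / K          # mixing weights

for _ in range(50):
    # E-step: responsibilities r[i, k] proportional to pi_k * N(theta_i; mu_k, sigma_k)
    log_p = (-0.5 * ((theta[:, None] - mu) / sigma) ** 2
             - np.log(sigma) + np.log(pi))
    log_p -= log_p.max(axis=1, keepdims=True)   # stabilize before exponentiating
    r = np.exp(log_p)
    r /= r.sum(axis=1, keepdims=True)

    # M-step: update weights, means, and variances from responsibilities.
    Nk = r.sum(axis=0)
    pi = Nk / len(theta)
    mu = (r * theta[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((r * (theta[:, None] - mu) ** 2).sum(axis=0) / Nk)

print(np.sort(mu))  # means should recover roughly [-2, 2]
```

The hierarchical model in the paper couples this clustering step to the single-cell fits themselves, so that cell-type assignments and model parameters are estimated simultaneously rather than sequentially.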
Affiliation(s)
- Daniel N. Zdeblick
- Department of Electrical and Computer Engineering, University of Washington, Seattle, Washington, United States of America
- Eric T. Shea-Brown
- Department of Applied Math, University of Washington, Seattle, Washington, United States of America
- MindScope Program, Allen Institute, Seattle, Washington, United States of America
- Daniela M. Witten
- Department of Statistics and Biostatistics, University of Washington, Seattle, Washington, United States of America
- Michael A. Buice
- Department of Applied Math, University of Washington, Seattle, Washington, United States of America
- MindScope Program, Allen Institute, Seattle, Washington, United States of America
2. King CW, Ledochowitsch P, Buice MA, de Vries SEJ. Saccade-Responsive Visual Cortical Neurons Do Not Exhibit Distinct Visual Response Properties. eNeuro 2023; 10:ENEURO.0051-23.2023. PMID: 37591733; PMCID: PMC10506534; DOI: 10.1523/eneuro.0051-23.2023.
Abstract
Rapid saccadic eye movements are used by animals to sample different parts of the visual scene. Previous work has investigated neural correlates of these saccades in visual cortical areas such as V1; however, how saccade-responsive neurons are distributed across visual areas, cell types, and cortical layers has remained unknown. Through analyzing 818 one-hour experimental sessions from the Allen Brain Observatory, we present a large-scale analysis of saccadic behaviors in head-fixed mice and their neural correlates. We find that saccade-responsive neurons are present across visual cortex, but their distribution varies considerably by transgenically defined cell type, cortical area, and cortical layer. We also find that saccade-responsive neurons do not exhibit distinct visual response properties from the broader neural population, suggesting that the saccadic responses of these neurons are likely not predominantly visually driven. These results provide insight into the roles played by different cell types within a broader, distributed network of sensory and motor interactions.
Affiliation(s)
- Chase W King
- MindScope Program, Allen Institute, Seattle, Washington 98109
- Department of Computer Science, University of Washington, Seattle, Washington 98195-2350
- Michael A Buice
- MindScope Program, Allen Institute, Seattle, Washington 98109
- Department of Applied Mathematics, University of Washington, Seattle, Washington 98195-3925
- Saskia E J de Vries
- MindScope Program, Allen Institute, Seattle, Washington 98109
- Department of Physiology & Biophysics, University of Washington, Seattle, Washington 98195-7290
3. Zdeblick DN, Shea-Brown ET, Witten DM, Buice MA. Modeling functional cell types in spike train data. bioRxiv 2023:2023.02.28.530327. PMID: 36909648; PMCID: PMC10002678; DOI: 10.1101/2023.02.28.530327.
Affiliation(s)
- Michael A. Buice
- Applied Math, University of Washington
- Allen Institute MindScope Program
4. Shi J, Shea-Brown E, Buice MA. Learning dynamics of deep linear networks with multiple pathways. Adv Neural Inf Process Syst 2022; 35:34064-34076. PMID: 38288081; PMCID: PMC10824491.
Abstract
Not only have deep networks become standard in machine learning, but they are also increasingly of interest in neuroscience as models of cortical computation that capture relationships between structural and functional properties. In addition, they are a useful target of theoretical research into the properties of network computation. Deep networks typically have a serial or approximately serial organization across layers, and this is often mirrored in models that purport to represent computation in mammalian brains. There are, however, multiple examples of parallel pathways in mammalian brains. In some cases, such as the mouse, the entire visual system appears arranged in a largely parallel, rather than serial, fashion. While these pathways may be formed by differing cost functions that drive different computations, here we present a new mathematical analysis of learning dynamics in networks that have parallel computational pathways driven by the same cost function. We use the approximation of deep linear networks with large hidden layer sizes to show that, as the depth of the parallel pathways increases, different features of the training set (defined by the singular values of the input-output correlation) will typically concentrate in one of the pathways. This result is derived analytically and demonstrated with numerical simulations of both linear and non-linear networks. Thus, rather than sharing stimulus and task features across multiple pathways, parallel network architectures learn to produce sharply diversified representations with specialized and specific pathways, a mechanism that may hold important consequences for codes in both biological and artificial systems.
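The concentration effect can be seen in a drastically reduced sketch (a toy illustration, not the paper's analysis): two parallel depth-2 scalar pathways are trained by gradient descent on the same cost, starting from slightly different initial weights. Because gradient growth is multiplicative in the pathway weights, the initially stronger pathway captures most of the target mode. The initial values and learning rate below are hypothetical.

```python
# Two parallel depth-2 linear pathways learn a single scalar mapping s:
# y = a2*a1*x + b2*b1*x, trained by gradient descent on 0.5*(a1*a2 + b1*b2 - s)^2.
s = 1.0
a1, a2 = 0.10, 0.10   # pathway A starts slightly stronger
b1, b2 = 0.05, 0.05   # pathway B starts weaker
lr = 0.05

for _ in range(2000):
    err = a1 * a2 + b1 * b2 - s
    # simultaneous gradient updates for each weight
    a1, a2 = a1 - lr * err * a2, a2 - lr * err * a1
    b1, b2 = b1 - lr * err * b2, b2 - lr * err * b1

print(a1 * a2, b1 * b2)  # pathway A ends up carrying most of the mode
```

With balanced initial weights the ratio of the two pathway products is preserved throughout training, so the 4:1 initial advantage of pathway A persists at convergence; deeper pathways sharpen this split further, which is the regime the paper analyzes.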
Affiliation(s)
- Jianghong Shi
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195
- Eric Shea-Brown
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195
5. Siegle JH, Ledochowitsch P, Jia X, Millman DJ, Ocker GK, Caldejon S, Casal L, Cho A, Denman DJ, Durand S, Groblewski PA, Heller G, Kato I, Kivikas S, Lecoq J, Nayan C, Ngo K, Nicovich PR, North K, Ramirez TK, Swapp J, Waughman X, Williford A, Olsen SR, Koch C, Buice MA, de Vries SEJ. Reconciling functional differences in populations of neurons recorded with two-photon imaging and electrophysiology. eLife 2021; 10:e69068. PMID: 34270411; PMCID: PMC8285106; DOI: 10.7554/elife.69068.
Abstract
Extracellular electrophysiology and two-photon calcium imaging are widely used methods for measuring physiological activity with single-cell resolution across large populations of cortical neurons. While each of these two modalities has distinct advantages and disadvantages, neither provides complete, unbiased information about the underlying neural population. Here, we compare evoked responses in visual cortex recorded in awake mice under highly standardized conditions using either imaging of genetically expressed GCaMP6f or electrophysiology with silicon probes. Across all stimulus conditions tested, we observe a larger fraction of responsive neurons in electrophysiology and higher stimulus selectivity in calcium imaging, which was partially reconciled by applying a spikes-to-calcium forward model to the electrophysiology data. However, the forward model could only reconcile differences in responsiveness when restricted to neurons with low contamination and an event rate above a minimum threshold. This work established how the biases of these two modalities impact functional metrics that are fundamental for characterizing sensory-evoked responses.
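The core of a spikes-to-calcium forward model can be sketched in a few lines (a generic illustration, not the specific model used in the paper): each spike adds a fluorescence transient that decays exponentially, and imaging noise is added on top. The time constant, amplitude, and noise level below are assumed values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy spikes-to-calcium forward model: each spike adds a transient that
# decays exponentially, plus additive imaging noise.
dt = 1 / 30.0                      # 30 Hz imaging frame rate
spikes = rng.poisson(0.05, 900)    # ~1.5 Hz Poisson spiking over 30 s of frames
tau = 0.4                          # decay time constant in seconds (assumed)
amp = 1.0                          # dF/F increment per spike (assumed)

f = np.zeros(len(spikes))
for t in range(1, len(spikes)):
    f[t] = f[t - 1] * np.exp(-dt / tau) + amp * spikes[t]
trace = f + rng.normal(0, 0.3, len(f))  # noisy fluorescence trace

print(trace[:5])
```

Applying a model of this form to electrophysiology data yields synthetic fluorescence traces that can be analyzed with the same pipeline as real imaging data, which is how the paper partially reconciles responsiveness differences between the two modalities.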
Affiliation(s)
- Xiaoxuan Jia
- MindScope Program, Allen Institute, Seattle, United States
- Linzy Casal
- MindScope Program, Allen Institute, Seattle, United States
- Andy Cho
- MindScope Program, Allen Institute, Seattle, United States
- Daniel J Denman
- Allen Institute for Brain Science, Allen Institute, Seattle, United States
- Gregg Heller
- MindScope Program, Allen Institute, Seattle, United States
- India Kato
- MindScope Program, Allen Institute, Seattle, United States
- Sara Kivikas
- MindScope Program, Allen Institute, Seattle, United States
- Jérôme Lecoq
- MindScope Program, Allen Institute, Seattle, United States
- Chelsea Nayan
- MindScope Program, Allen Institute, Seattle, United States
- Kiet Ngo
- Allen Institute for Brain Science, Allen Institute, Seattle, United States
- Philip R Nicovich
- Allen Institute for Brain Science, Allen Institute, Seattle, United States
- Kat North
- MindScope Program, Allen Institute, Seattle, United States
- Jackie Swapp
- MindScope Program, Allen Institute, Seattle, United States
- Xana Waughman
- MindScope Program, Allen Institute, Seattle, United States
- Ali Williford
- MindScope Program, Allen Institute, Seattle, United States
- Shawn R Olsen
- MindScope Program, Allen Institute, Seattle, United States
- Christof Koch
- MindScope Program, Allen Institute, Seattle, United States
6. Huang L, Ledochowitsch P, Knoblich U, Lecoq J, Murphy GJ, Reid RC, de Vries SE, Koch C, Zeng H, Buice MA, Waters J, Li L. Relationship between simultaneously recorded spiking activity and fluorescence signal in GCaMP6 transgenic mice. eLife 2021; 10:e51675. PMID: 33683198; PMCID: PMC8060029; DOI: 10.7554/elife.51675.
Abstract
Fluorescent calcium indicators are often used to investigate neural dynamics, but the relationship between fluorescence and action potentials (APs) remains unclear. Most APs can be detected when the soma almost fills the microscope’s field of view, but calcium indicators are used to image populations of neurons, necessitating a large field of view, generating fewer photons per neuron, and compromising AP detection. Here, we characterized the AP-fluorescence transfer function in vivo for 48 layer 2/3 pyramidal neurons in primary visual cortex, with simultaneous calcium imaging and cell-attached recordings from transgenic mice expressing GCaMP6s or GCaMP6f. While most APs were detected under optimal conditions, under conditions typical of population imaging studies, only a minority of 1 AP and 2 AP events were detected (often <10% and ~20–30%, respectively), emphasizing the limits of AP detection under more realistic imaging conditions.

Neurons, the cells that make up the nervous system, transmit information using electrical signals known as action potentials or spikes. Studying the spiking patterns of neurons in the brain is essential to understand perception, memory, thought, and behaviour. One way to do that is by recording electrical activity with microelectrodes. Another way to study neuronal activity is by using molecules that change how they interact with light when calcium binds to them, since changes in calcium concentration can be indicative of neuronal spiking. That change can be observed with specialized microscopes known as two-photon fluorescence microscopes. Using calcium indicators, it is possible to simultaneously record hundreds or even thousands of neurons. However, calcium fluorescence and spikes do not translate one-to-one. In order to interpret fluorescence data, it is important to understand the relationship between the fluorescence signals and the spikes associated with individual neurons. The only way to directly measure this relationship is by using calcium imaging and electrical recording simultaneously to record activity from the same neuron. However, this is extremely challenging experimentally, so this type of data is rare. To shed some light on this, Huang, Ledochowitsch et al. used mice that had been genetically modified to produce a calcium indicator in neurons of the visual cortex and simultaneously obtained both fluorescence measurements and electrical recordings from these neurons. These experiments revealed that, while the majority of time periods containing multi-spike neural activity could be identified using calcium imaging microscopy, on average, less than 10% of isolated single spikes were detectable. This is an important caveat that researchers need to take into consideration when interpreting calcium imaging results. These findings are intended to serve as a guide for interpreting calcium imaging studies that look at neurons in the mammalian brain at the population level. In addition, the data provided will be useful as a reference for the development of activity sensors, and to benchmark and improve computational approaches for detecting and predicting spikes.
Affiliation(s)
- Lawrence Huang
- Allen Institute for Brain Science, Seattle, United States
- Ulf Knoblich
- Allen Institute for Brain Science, Seattle, United States
- Jérôme Lecoq
- Allen Institute for Brain Science, Seattle, United States
- Gabe J Murphy
- Allen Institute for Brain Science, Seattle, United States
- R Clay Reid
- Allen Institute for Brain Science, Seattle, United States
- Christof Koch
- Allen Institute for Brain Science, Seattle, United States
- Hongkui Zeng
- Allen Institute for Brain Science, Seattle, United States
- Jack Waters
- Allen Institute for Brain Science, Seattle, United States
- Lu Li
- Allen Institute for Brain Science, Seattle, United States
- Guangdong Provincial Key Laboratory of Malignant Tumor Epigenetics and Gene Regulation, Guangdong-Hong Kong Joint Laboratory for RNA Medicine, Medical Research Center, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
7. Siegle JH, Jia X, Durand S, Gale S, Bennett C, Graddis N, Heller G, Ramirez TK, Choi H, Luviano JA, Groblewski PA, Ahmed R, Arkhipov A, Bernard A, Billeh YN, Brown D, Buice MA, Cain N, Caldejon S, Casal L, Cho A, Chvilicek M, Cox TC, Dai K, Denman DJ, de Vries SEJ, Dietzman R, Esposito L, Farrell C, Feng D, Galbraith J, Garrett M, Gelfand EC, Hancock N, Harris JA, Howard R, Hu B, Hytnen R, Iyer R, Jessett E, Johnson K, Kato I, Kiggins J, Lambert S, Lecoq J, Ledochowitsch P, Lee JH, Leon A, Li Y, Liang E, Long F, Mace K, Melchior J, Millman D, Mollenkopf T, Nayan C, Ng L, Ngo K, Nguyen T, Nicovich PR, North K, Ocker GK, Ollerenshaw D, Oliver M, Pachitariu M, Perkins J, Reding M, Reid D, Robertson M, Ronellenfitch K, Seid S, Slaughterbeck C, Stoecklin M, Sullivan D, Sutton B, Swapp J, Thompson C, Turner K, Wakeman W, Whitesell JD, Williams D, Williford A, Young R, Zeng H, Naylor S, Phillips JW, Reid RC, Mihalas S, Olsen SR, Koch C. Survey of spiking in the mouse visual system reveals functional hierarchy. Nature 2021; 592:86-92. PMID: 33473216; PMCID: PMC10399640; DOI: 10.1038/s41586-020-03171-x.
Abstract
The anatomy of the mammalian visual system, from the retina to the neocortex, is organized hierarchically1. However, direct observation of cellular-level functional interactions across this hierarchy is lacking due to the challenge of simultaneously recording activity across numerous regions. Here we describe a large, open dataset-part of the Allen Brain Observatory2-that surveys spiking from tens of thousands of units in six cortical and two thalamic regions in the brains of mice responding to a battery of visual stimuli. Using cross-correlation analysis, we reveal that the organization of inter-area functional connectivity during visual stimulation mirrors the anatomical hierarchy from the Allen Mouse Brain Connectivity Atlas3. We find that four classical hierarchical measures-response latency, receptive-field size, phase-locking to drifting gratings and response decay timescale-are all correlated with the hierarchy. Moreover, recordings obtained during a visual task reveal that the correlation between neural activity and behavioural choice also increases along the hierarchy. Our study provides a foundation for understanding coding and signal propagation across hierarchically organized cortical and thalamic visual areas.
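The idea behind using cross-correlation to order areas can be shown with a toy example (an illustration of the general technique, not the paper's analysis pipeline): if area B's spikes tend to follow area A's by a fixed delay, the cross-correlogram between their spike trains peaks at a positive lag, placing B downstream of A. The bin size, delay, and event counts below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy cross-correlogram: area B fires 5 ms after area A, so the CCG
# between A (reference) and B peaks at a positive lag.
n = 20000                                     # 20 s of 1 ms bins
a = np.zeros(n)
b = np.zeros(n)
events = rng.choice(n - 10, 500, replace=False)
a[events] = 1
b[events + 5] = 1                              # fixed 5 ms propagation delay

lags = np.arange(-20, 21)
ccg = np.array([np.sum(a * np.roll(b, -lag)) for lag in lags])
print(lags[np.argmax(ccg)])  # peak lag of +5 ms
```

Real spike trains are noisy and shared input can also produce correlations, so analyses of this kind typically use jitter corrections and aggregate over many pairs before inferring a hierarchy.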
Affiliation(s)
- Xiaoxuan Jia
- Allen Institute for Brain Science, Seattle, WA, USA
- Sam Gale
- Allen Institute for Brain Science, Seattle, WA, USA
- Nile Graddis
- Allen Institute for Brain Science, Seattle, WA, USA
- Hannah Choi
- Allen Institute for Brain Science, Seattle, WA, USA
- Department of Applied Mathematics, University of Washington, Seattle, WA, USA
- Amy Bernard
- Allen Institute for Brain Science, Seattle, WA, USA
- Dillan Brown
- Allen Institute for Brain Science, Seattle, WA, USA
- Nicolas Cain
- Allen Institute for Brain Science, Seattle, WA, USA
- Linzy Casal
- Allen Institute for Brain Science, Seattle, WA, USA
- Andrew Cho
- Allen Institute for Brain Science, Seattle, WA, USA
- Timothy C Cox
- University of Missouri-Kansas City School of Dentistry, Kansas City, MO, USA
- Kael Dai
- Allen Institute for Brain Science, Seattle, WA, USA
- Daniel J Denman
- Allen Institute for Brain Science, Seattle, WA, USA
- The University of Colorado Denver, Anschutz Medical Campus, Aurora, CO, USA
- David Feng
- Allen Institute for Brain Science, Seattle, WA, USA
- Brian Hu
- Allen Institute for Brain Science, Seattle, WA, USA
- Ross Hytnen
- Allen Institute for Brain Science, Seattle, WA, USA
- India Kato
- Allen Institute for Brain Science, Seattle, WA, USA
- Jerome Lecoq
- Allen Institute for Brain Science, Seattle, WA, USA
- Arielle Leon
- Allen Institute for Brain Science, Seattle, WA, USA
- Yang Li
- Allen Institute for Brain Science, Seattle, WA, USA
- Fuhui Long
- Allen Institute for Brain Science, Seattle, WA, USA
- Kyla Mace
- Allen Institute for Brain Science, Seattle, WA, USA
- Lydia Ng
- Allen Institute for Brain Science, Seattle, WA, USA
- Kiet Ngo
- Allen Institute for Brain Science, Seattle, WA, USA
- Kat North
- Allen Institute for Brain Science, Seattle, WA, USA
- Jed Perkins
- Allen Institute for Brain Science, Seattle, WA, USA
- David Reid
- Allen Institute for Brain Science, Seattle, WA, USA
- Sam Seid
- Allen Institute for Brain Science, Seattle, WA, USA
- Ben Sutton
- Allen Institute for Brain Science, Seattle, WA, USA
- Jackie Swapp
- Allen Institute for Brain Science, Seattle, WA, USA
- Rob Young
- Allen Institute for Brain Science, Seattle, WA, USA
- Hongkui Zeng
- Allen Institute for Brain Science, Seattle, WA, USA
- Sarah Naylor
- Allen Institute for Brain Science, Seattle, WA, USA
- R Clay Reid
- Allen Institute for Brain Science, Seattle, WA, USA
- Shawn R Olsen
- Allen Institute for Brain Science, Seattle, WA, USA
8. Millman DJ, Ocker GK, Caldejon S, Kato I, Larkin JD, Lee EK, Luviano J, Nayan C, Nguyen TV, North K, Seid S, White C, Lecoq J, Reid C, Buice MA, de Vries SEJ. VIP interneurons in mouse primary visual cortex selectively enhance responses to weak but specific stimuli. eLife 2020; 9:e55130. PMID: 33108272; PMCID: PMC7591255; DOI: 10.7554/elife.55130.
Abstract
Vasoactive intestinal peptide-expressing (VIP) interneurons in the cortex regulate feedback inhibition of pyramidal neurons through suppression of somatostatin-expressing (SST) interneurons and, reciprocally, SST neurons inhibit VIP neurons. Although VIP neuron activity in the primary visual cortex (V1) of mouse is highly correlated with locomotion, the relevance of locomotion-related VIP neuron activity to visual coding is not known. Here we show that VIP neurons in mouse V1 respond strongly to low contrast front-to-back motion that is congruent with self-motion during locomotion but are suppressed by other directions and contrasts. VIP and SST neurons have complementary contrast tuning. Layer 2/3 contains a substantially larger population of low contrast preferring pyramidal neurons than deeper layers, and layer 2/3 (but not deeper layer) pyramidal neurons show bias for front-to-back motion specifically at low contrast. Network modeling indicates that VIP-SST mutual antagonism regulates the gain of the cortex to achieve sensitivity to specific weak stimuli without compromising network stability.
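The gain-switching role of mutual antagonism can be illustrated with a hypothetical two-unit rate model (a sketch of the general motif, not the network model fit in the paper): VIP and SST units inhibit each other, so a modest change in VIP-preferred drive flips the circuit between an SST-dominated and a VIP-dominated state. All weights and drives below are assumed values.

```python
import numpy as np

# Toy VIP-SST mutual-antagonism rate model with threshold-linear units.
def relu(x):
    return np.maximum(x, 0.0)

def steady_state(vip_drive, sst_drive, w=2.0, steps=500, dt=0.1):
    """Euler-integrate dv/dt = -v + [vip_drive - w*s]+, ds/dt = -s + [sst_drive - w*v]+."""
    v = s = 0.1
    for _ in range(steps):
        v += dt * (-v + relu(vip_drive - w * s))
        s += dt * (-s + relu(sst_drive - w * v))
    return v, s

# Weak VIP drive: SST wins and VIP is silenced.
v_lo, s_lo = steady_state(vip_drive=0.2, sst_drive=1.0)
# Slightly stronger VIP drive: the network flips to the VIP-dominated state.
v_hi, s_hi = steady_state(vip_drive=1.2, sst_drive=1.0)
print(v_lo, s_lo)
print(v_hi, s_hi)
```

In the VIP-dominated state, SST inhibition onto pyramidal cells is withdrawn, raising the gain of the cortex for the weak stimulus that triggered the switch, which is the qualitative mechanism the abstract describes.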
Affiliation(s)
- India Kato
- Allen Institute for Brain Science, Seattle, United States
- Josh D Larkin
- Allen Institute for Brain Science, Seattle, United States
- Chelsea Nayan
- Allen Institute for Brain Science, Seattle, United States
- Kat North
- Allen Institute for Brain Science, Seattle, United States
- Sam Seid
- Allen Institute for Brain Science, Seattle, United States
- Jerome Lecoq
- Allen Institute for Brain Science, Seattle, United States
- Clay Reid
- Allen Institute for Brain Science, Seattle, United States
9. Ocker GK, Buice MA. Flexible neural connectivity under constraints on total connection strength. PLoS Comput Biol 2020; 16:e1008080. PMID: 32745134; PMCID: PMC7425997; DOI: 10.1371/journal.pcbi.1008080.
Abstract
Neural computation is determined by neurons’ dynamics and circuit connectivity. Uncertain and dynamic environments may require neural hardware to adapt to different computational tasks, each requiring different connectivity configurations. At the same time, connectivity is subject to a variety of constraints, placing limits on the possible computations a given neural circuit can perform. Here we examine the hypothesis that the organization of neural circuitry favors computational flexibility: that it makes many computational solutions available, given physiological constraints. From this hypothesis, we develop models of connectivity degree distributions based on constraints on a neuron’s total synaptic weight. To test these models, we examine reconstructions of the mushroom bodies from the first instar larva and adult Drosophila melanogaster. We perform a Bayesian model comparison for two constraint models and a random wiring null model. Overall, we find that flexibility under a homeostatically fixed total synaptic weight describes Kenyon cell connectivity better than other models, suggesting a principle shaping the apparently random structure of Kenyon cell wiring. Furthermore, we find evidence that larval Kenyon cells are more flexible earlier in development, suggesting a mechanism whereby neural circuits begin as flexible systems that develop into specialized computational circuits.

High-throughput electron microscopic anatomical experiments have begun to yield detailed maps of neural circuit connectivity. Uncovering the principles that govern these circuit structures is a major challenge for systems neuroscience. Healthy neural circuits must be able to perform computational tasks while satisfying physiological constraints. Those constraints can restrict a neuron’s possible connectivity, and thus potentially restrict its computation. Here we examine simple models of constraints on total synaptic weights, and calculate the number of circuit configurations they allow: a simple measure of their computational flexibility. We propose probabilistic models of connectivity that weight the number of synaptic partners according to computational flexibility under a constraint and test them using recent wiring diagrams from a learning center, the mushroom body, in the fly brain. We compare constraints that fix or bound a neuron’s total connection strength to a simple random wiring null model. Of these models, the fixed total connection strength matched the overall connectivity best in mushroom bodies from both larval and adult flies. We also provide evidence suggesting that neural circuits are more flexible in early stages of development and lose this flexibility as they grow towards specialized function.
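The counting step has a simple combinatorial core that can be sketched directly (an illustrative simplification, assuming integer-valued synaptic weights): the number of ways to split a fixed total weight W across K synaptic partners is the stars-and-bars count C(W + K - 1, K - 1), so under a fixed total weight the number of allowed configurations grows rapidly with the number of partners. The values of W and K below are arbitrary.

```python
from math import comb

# Number of ways to distribute an integer total synaptic weight W across
# K synaptic partners (stars and bars): C(W + K - 1, K - 1).
def n_configs(W, K):
    return comb(W + K - 1, K - 1)

print(n_configs(10, 1), n_configs(10, 3), n_configs(10, 5))  # 1 66 1001
```

A degree distribution that weights each K by a count of this form is the kind of "flexibility under constraint" model the paper compares, via Bayesian model comparison, against bounded-weight and random-wiring alternatives.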
Affiliation(s)
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Michael A. Buice
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
10. Garrett M, Manavi S, Roll K, Ollerenshaw DR, Groblewski PA, Ponvert ND, Kiggins JT, Casal L, Mace K, Williford A, Leon A, Jia X, Ledochowitsch P, Buice MA, Wakeman W, Mihalas S, Olsen SR. Experience shapes activity dynamics and stimulus coding of VIP inhibitory cells. eLife 2020; 9:e50340. PMID: 32101169; PMCID: PMC7043888; DOI: 10.7554/elife.50340.
Abstract
Cortical circuits can flexibly change with experience and learning, but the effects on specific cell types, including distinct inhibitory types, are not well understood. Here we investigated how excitatory and VIP inhibitory cells in layer 2/3 of mouse visual cortex were impacted by visual experience in the context of a behavioral task. Mice learned a visual change detection task with a set of eight natural scene images. Subsequently, during 2-photon imaging experiments, mice performed the task with these familiar images and three sets of novel images. Strikingly, the temporal dynamics of VIP activity differed markedly between novel and familiar images: VIP cells were stimulus-driven by novel images but were suppressed by familiar stimuli and showed ramping activity when expected stimuli were omitted from a temporally predictable sequence. This prominent change in VIP activity suggests that these cells may adopt different modes of processing under novel versus familiar conditions.
Affiliation(s)
- Sahar Manavi
- Allen Institute for Brain Science, Seattle, United States
- Kate Roll
- Allen Institute for Brain Science, Seattle, United States
- Linzy Casal
- Allen Institute for Brain Science, Seattle, United States
- Kyla Mace
- Allen Institute for Brain Science, Seattle, United States
- Ali Williford
- Allen Institute for Brain Science, Seattle, United States
- Arielle Leon
- Allen Institute for Brain Science, Seattle, United States
- Xiaoxuan Jia
- Allen Institute for Brain Science, Seattle, United States
- Wayne Wakeman
- Allen Institute for Brain Science, Seattle, United States
- Shawn R Olsen
- Allen Institute for Brain Science, Seattle, United States
11. de Vries SEJ, Lecoq JA, Buice MA, Groblewski PA, Ocker GK, Oliver M, Feng D, Cain N, Ledochowitsch P, Millman D, Roll K, Garrett M, Keenan T, Kuan L, Mihalas S, Olsen S, Thompson C, Wakeman W, Waters J, Williams D, Barber C, Berbesque N, Blanchard B, Bowles N, Caldejon SD, Casal L, Cho A, Cross S, Dang C, Dolbeare T, Edwards M, Galbraith J, Gaudreault N, Gilbert TL, Griffin F, Hargrave P, Howard R, Huang L, Jewell S, Keller N, Knoblich U, Larkin JD, Larsen R, Lau C, Lee E, Lee F, Leon A, Li L, Long F, Luviano J, Mace K, Nguyen T, Perkins J, Robertson M, Seid S, Shea-Brown E, Shi J, Sjoquist N, Slaughterbeck C, Sullivan D, Valenza R, White C, Williford A, Witten DM, Zhuang J, Zeng H, Farrell C, Ng L, Bernard A, Phillips JW, Reid RC, Koch C. A large-scale standardized physiological survey reveals functional organization of the mouse visual cortex. Nat Neurosci 2020; 23:138-151. PMID: 31844315; PMCID: PMC6948932; DOI: 10.1038/s41593-019-0550-9.
Abstract
To understand how the brain processes sensory information to guide behavior, we must know how stimulus representations are transformed throughout the visual cortex. Here we report an open, large-scale physiological survey of activity in the awake mouse visual cortex: the Allen Brain Observatory Visual Coding dataset. This publicly available dataset includes the cortical activity of nearly 60,000 neurons from six visual areas, four layers, and 12 transgenic mouse lines in a total of 243 adult mice, in response to a systematic set of visual stimuli. We classify neurons on the basis of joint reliabilities to multiple stimuli and validate this functional classification with models of visual responses. While most classes are characterized by responses to specific subsets of the stimuli, the largest class is not reliably responsive to any of the stimuli and becomes progressively larger in higher visual areas. These classes reveal a functional organization wherein putative dorsal areas show specialization for visual motion signals.
Affiliation(s)
- David Feng
- Allen Institute for Brain Science, Seattle, WA, USA
- Kate Roll
- Allen Institute for Brain Science, Seattle, WA, USA
- Tom Keenan
- Allen Institute for Brain Science, Seattle, WA, USA
- Leonard Kuan
- Allen Institute for Brain Science, Seattle, WA, USA
- Shawn Olsen
- Allen Institute for Brain Science, Seattle, WA, USA
- Jack Waters
- Allen Institute for Brain Science, Seattle, WA, USA
- Chris Barber
- Allen Institute for Brain Science, Seattle, WA, USA
- Linzy Casal
- Allen Institute for Brain Science, Seattle, WA, USA
- Andrew Cho
- Allen Institute for Brain Science, Seattle, WA, USA
- Sissy Cross
- Allen Institute for Brain Science, Seattle, WA, USA
- Chinh Dang
- Allen Institute for Brain Science, Seattle, WA, USA
- Tim Dolbeare
- Allen Institute for Brain Science, Seattle, WA, USA
- Sean Jewell
- Department of Statistics, University of Washington, Seattle, WA, USA
- Nika Keller
- Allen Institute for Brain Science, Seattle, WA, USA
- Ulf Knoblich
- Allen Institute for Brain Science, Seattle, WA, USA
- Chris Lau
- Allen Institute for Brain Science, Seattle, WA, USA
- Eric Lee
- Allen Institute for Brain Science, Seattle, WA, USA
- Felix Lee
- Allen Institute for Brain Science, Seattle, WA, USA
- Arielle Leon
- Allen Institute for Brain Science, Seattle, WA, USA
- Lu Li
- Allen Institute for Brain Science, Seattle, WA, USA
- Fuhui Long
- Allen Institute for Brain Science, Seattle, WA, USA
- Kyla Mace
- Allen Institute for Brain Science, Seattle, WA, USA
- Jed Perkins
- Allen Institute for Brain Science, Seattle, WA, USA
- Sam Seid
- Allen Institute for Brain Science, Seattle, WA, USA
- Eric Shea-Brown
- Allen Institute for Brain Science, Seattle, WA, USA
- Department of Applied Mathematics, University of Washington, Seattle, WA, USA
- Jianghong Shi
- Department of Applied Mathematics, University of Washington, Seattle, WA, USA
- Ryan Valenza
- Allen Institute for Brain Science, Seattle, WA, USA
- Casey White
- Allen Institute for Brain Science, Seattle, WA, USA
- Daniela M Witten
- Department of Statistics, University of Washington, Seattle, WA, USA
- Department of Biostatistics, University of Washington, Seattle, WA, USA
- Jun Zhuang
- Allen Institute for Brain Science, Seattle, WA, USA
- Hongkui Zeng
- Allen Institute for Brain Science, Seattle, WA, USA
- Lydia Ng
- Allen Institute for Brain Science, Seattle, WA, USA
- Amy Bernard
- Allen Institute for Brain Science, Seattle, WA, USA
- R Clay Reid
- Allen Institute for Brain Science, Seattle, WA, USA
12
Recanatesi S, Ocker GK, Buice MA, Shea-Brown E. Dimensionality in recurrent spiking networks: Global trends in activity and local origins in connectivity. PLoS Comput Biol 2019; 15:e1006446. [PMID: 31299044 PMCID: PMC6655892 DOI: 10.1371/journal.pcbi.1006446] [Received: 08/13/2018] [Revised: 07/24/2019] [Accepted: 04/03/2019] [Indexed: 11/25/2022]
Abstract
The dimensionality of a network's collective activity is of increasing interest in neuroscience. This is because dimensionality provides a compact measure of how coordinated network-wide activity is, in terms of the number of modes (or degrees of freedom) that it can independently explore. A low number of modes suggests a compressed low dimensional neural code and reveals interpretable dynamics [1], while findings of high dimension may suggest flexible computations [2, 3]. Here, we address the fundamental question of how dimensionality is related to connectivity, in both autonomous and stimulus-driven networks. Working with a simple spiking network model, we derive three main findings. First, the dimensionality of global activity patterns can be strongly, and systematically, regulated by local connectivity structures. Second, the dimensionality is a better indicator than average correlations in determining how constrained neural activity is. Third, stimulus evoked neural activity interacts systematically with neural connectivity patterns, leading to network responses of either greater or lesser dimensionality than the stimulus.
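As context for "dimensionality" as used in this abstract: a common compact measure, one reasonable reading of the idea rather than the paper's exact estimator, is the participation ratio of the activity covariance spectrum. A minimal Python sketch (variable names and parameters are illustrative):

```python
import numpy as np

def participation_ratio(X):
    """Participation ratio of an activity matrix X (time x neurons):
    PR = (sum of covariance eigenvalues)^2 / (sum of squared eigenvalues).
    Ranges from ~1 (one dominant mode) to N (isotropic activity)."""
    C = np.cov(X, rowvar=False)
    ev = np.linalg.eigvalsh(C)
    return ev.sum() ** 2 / (ev ** 2).sum()

rng = np.random.default_rng(0)
N, T = 50, 5000
# Shared drive produces coordinated, low-dimensional activity
shared = rng.normal(size=(T, 1))
low_dim = shared @ np.ones((1, N)) + 0.1 * rng.normal(size=(T, N))
# Independent noise explores all N modes
high_dim = rng.normal(size=(T, N))
print(participation_ratio(low_dim))   # near 1: one dominant mode
print(participation_ratio(high_dim))  # near N: many independent modes
```

The same quantity is what makes "dimensionality is a better indicator than average correlations" a testable statement: the spectrum, not just the mean pairwise correlation, determines how constrained the activity is.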
Affiliation(s)
- Stefano Recanatesi
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Michael A. Buice
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Eric Shea-Brown
- Center for Computational Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
13
Brinkman BAW, Rieke F, Shea-Brown E, Buice MA. Predicting how and when hidden neurons skew measured synaptic interactions. PLoS Comput Biol 2018; 14:e1006490. [PMID: 30346943 PMCID: PMC6219819 DOI: 10.1371/journal.pcbi.1006490] [Received: 12/07/2017] [Revised: 11/06/2018] [Accepted: 09/05/2018] [Indexed: 11/18/2022]
Abstract
A major obstacle to understanding neural coding and computation is the fact that experimental recordings typically sample only a small fraction of the neurons in a circuit. Measured neural properties are skewed by interactions between recorded neurons and the “hidden” portion of the network. To properly interpret neural data and determine how biological structure gives rise to neural circuit function, we thus need a better understanding of the relationships between measured effective neural properties and the true underlying physiological properties. Here, we focus on how the effective spatiotemporal dynamics of the synaptic interactions between neurons are reshaped by coupling to unobserved neurons. We find that the effective interactions from a pre-synaptic neuron r′ to a post-synaptic neuron r can be decomposed into a sum of the true interaction from r′ to r plus corrections from every directed path from r′ to r through unobserved neurons. Importantly, the resulting formula reveals when the hidden units have—or do not have—major effects on reshaping the interactions among observed neurons. As a particular example of interest, we derive a formula for the impact of hidden units in random networks with “strong” coupling—connection weights that scale with 1/N, where N is the network size, precisely the scaling observed in recent experiments. With this quantitative relationship between measured and true interactions, we can study how network properties shape effective interactions, which properties are relevant for neural computations, and how to manipulate effective interactions. No experiment in neuroscience can record from more than a tiny fraction of the total number of neurons present in a circuit. This severely complicates measurement of a network’s true properties, as unobserved neurons skew measurements away from what would be measured if all neurons were observed. 
For example, the measured post-synaptic response of a neuron to a spike from a particular pre-synaptic neuron incorporates direct connections between the two neurons as well as the effect of any number of indirect connections, including through unobserved neurons. To understand how measured quantities are distorted by unobserved neurons, we calculate a general relationship between measured “effective” synaptic interactions and the ground-truth interactions in the network. This allows us to identify conditions under which hidden neurons substantially alter measured interactions. Moreover, it provides a foundation for future work on manipulating effective interactions between neurons to better understand and potentially alter circuit function—or dysfunction.
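For linearized dynamics, the path decomposition described above has a compact matrix form: the effective coupling among observed units is the direct block plus a geometric series of all paths through hidden units. The sketch below is an illustrative static linear version only (the paper derives full spatiotemporal interaction filters); the network and all variable names are ours:

```python
import numpy as np

# Toy linear-rate network: 3 observed and 2 hidden units with
# ground-truth coupling matrix J, partitioned into blocks.
rng = np.random.default_rng(1)
n_obs, n_hid = 3, 2
J = 0.3 * rng.normal(size=(n_obs + n_hid, n_obs + n_hid))
Joo = J[:n_obs, :n_obs]   # observed -> observed (direct)
Joh = J[:n_obs, n_obs:]   # hidden -> observed
Jho = J[n_obs:, :n_obs]   # observed -> hidden
Jhh = J[n_obs:, n_obs:]   # hidden -> hidden

# Marginalizing the hidden units sums every directed path through them:
# (I - Jhh)^{-1} = I + Jhh + Jhh^2 + ...  (paths of all lengths)
J_eff = Joo + Joh @ np.linalg.inv(np.eye(n_hid) - Jhh) @ Jho

# The measured (effective) couplings differ from the true direct ones:
print(np.round(J_eff - Joo, 3))
```

The correction term makes the abstract's point concrete: hidden units matter exactly when `Joh @ (I - Jhh)^{-1} @ Jho` is comparable in size to the direct block.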
Affiliation(s)
- Braden A W Brinkman
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Department of Physiology and Biophysics, University of Washington, Seattle, Washington, United States of America
- Fred Rieke
- Department of Physiology and Biophysics, University of Washington, Seattle, Washington, United States of America
- Graduate Program in Neuroscience, University of Washington, Seattle, Washington, United States of America
- Eric Shea-Brown
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Department of Physiology and Biophysics, University of Washington, Seattle, Washington, United States of America
- Graduate Program in Neuroscience, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Michael A Buice
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Allen Institute for Brain Science, Seattle, Washington, United States of America
14
Denman DJ, Luviano JA, Ollerenshaw DR, Cross S, Williams D, Buice MA, Olsen SR, Reid RC. Mouse color and wavelength-specific luminance contrast sensitivity are non-uniform across visual space. eLife 2018; 7:e31209. [PMID: 29319502 PMCID: PMC5762155 DOI: 10.7554/elife.31209] [Received: 08/12/2017] [Accepted: 12/13/2017] [Indexed: 01/10/2023]
Abstract
Mammalian visual behaviors, as well as responses in the neural systems underlying these behaviors, are driven by luminance and color contrast. With constantly improving tools for measuring activity in cell-type-specific populations in the mouse during visual behavior, it is important to define the extent of luminance and color information that is behaviorally accessible to the mouse. A non-uniform distribution of cone opsins in the mouse retina potentially complicates both luminance and color sensitivity; opposing gradients of short (UV-shifted) and middle (blue/green) cone opsins suggest that color discrimination and wavelength-specific luminance contrast sensitivity may differ with retinotopic location. Here we ask how well mice can discriminate color and wavelength-specific luminance changes across visuotopic space. We found that mice were able to discriminate color and were able to do so more broadly across visuotopic space than expected from the cone-opsin distribution. We also found wavelength-band-specific differences in luminance sensitivity.
Affiliation(s)
- Sissy Cross
- Allen Institute for Brain Science, Seattle, United States
- Shawn R Olsen
- Allen Institute for Brain Science, Seattle, United States
- R Clay Reid
- Allen Institute for Brain Science, Seattle, United States
15
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States
16
Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks’ spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities—including those of different cell types—combine with connectivity to shape population activity and function. Neuronal networks, like many biological systems, exhibit variable activity. This activity is shaped by both the underlying biology of the component neurons and the structure of their interactions. How can we combine knowledge of these two things—that is, models of individual neurons and of their interactions—to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state. In the face of neural nonlinearities, however, these linear methods can fail to predict spiking statistics and even fail to correctly predict whether activity is stable or pathological. 
Here, we show how to calculate any spike train cumulant in a broad class of models, while systematically accounting for nonlinear effects. We then study a fundamental effect of nonlinear input-rate transfer, namely the coupling between different orders of spiking statistics, and how this depends on single-neuron and network properties.
Affiliation(s)
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Krešimir Josić
- Department of Mathematics and Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Department of BioSciences, Rice University, Houston, Texas, United States of America
- Eric Shea-Brown
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Department of Physiology and Biophysics, and UW Institute of Neuroengineering, University of Washington, Seattle, Washington, United States of America
- Michael A. Buice
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
17

18
Abstract
Stochastic differential equations (SDEs) have multiple applications in mathematical neuroscience and are notoriously difficult to analyze. Here, we give a self-contained pedagogical review of perturbative field theoretic and path integral methods to calculate moments of the probability density function of SDEs. The methods can be extended to high dimensional systems such as networks of coupled neurons and even deterministic systems with quenched disorder.
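The methods this review covers center on the response-function (Martin-Siggia-Rose-Janssen-De Dominicis) construction. As a sketch, with sign and normalization conventions that vary across the literature: for an SDE $dx = f(x)\,dt + g(x)\,dW_t$, moments are generated by a path integral over the field $x$ and a response field $\tilde{x}$,

```latex
Z[J] = \int \mathcal{D}x\,\mathcal{D}\tilde{x}\;
  \exp\!\Big(-S[x,\tilde{x}] + \textstyle\int J(t)\,x(t)\,dt\Big),
\qquad
S[x,\tilde{x}] = \int dt\,\Big[\tilde{x}\big(\dot{x} - f(x)\big)
  - \tfrac{1}{2}\,\tilde{x}^{2}\,g(x)^{2}\Big].
```

Functional derivatives of $Z[J]$ at $J=0$ give the moments, and expanding the non-quadratic parts of $S$ around a solvable Gaussian (e.g. linear) theory yields the perturbative, diagrammatic series; correlators involving $\tilde{x}$ give responses to perturbations.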
Affiliation(s)
- Carson C. Chow
- Mathematical Biology Section, Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, MD 20892 USA
- Michael A. Buice
- Mathematical Biology Section, Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, MD 20892 USA
19
Abstract
Much progress has been made in uncovering the computational capabilities of spiking neural networks. However, spiking neurons will always be more expensive to simulate compared to rate neurons because of the inherent disparity in time scales: the spike duration time is much shorter than the inter-spike time, which is much shorter than any learning time scale. In numerical analysis, this is a classic stiff problem. Spiking neurons are also much more difficult to study analytically. One possible approach to making spiking networks more tractable is to augment mean field activity models with some information about spiking correlations. For example, such a generalized activity model could carry information about spiking rates and correlations between spikes self-consistently. Here, we will show how this can be accomplished by constructing a complete formal probabilistic description of the network and then expanding around a small parameter such as the inverse of the number of neurons in the network. The mean field theory of the system gives a rate-like description. The first order terms in the perturbation expansion keep track of covariances.
Affiliation(s)
- Michael A. Buice
- Modeling, Analysis and Theory Team, Allen Institute for Brain Science, Seattle, WA, USA
- Carson C. Chow
- Laboratory of Biological Modeling, NIDDK, National Institutes of Health, Bethesda, MD, USA
20
Abstract
Mean field theories have been a stalwart for studying the dynamics of networks of coupled neurons. They are convenient because they are relatively simple and possible to analyze. However, classical mean field theory neglects the effects of fluctuations and correlations due to single neuron effects. Here, we consider various possible approaches for going beyond mean field theory and incorporating correlation effects. Statistical field theory methods, in particular the Doi-Peliti-Janssen formalism, are particularly useful in this regard.
Affiliation(s)
- Michael A Buice
- Center for Learning and Memory, University of Texas at Austin, Austin, TX, USA
- Carson C Chow
- Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, MD, USA
21
Abstract
We investigate the dynamics of a deterministic finite-sized network of synaptically coupled spiking neurons and present a formalism for computing the network statistics in a perturbative expansion. The small parameter for the expansion is the inverse number of neurons in the network. The network dynamics are fully characterized by a neuron population density that obeys a conservation law analogous to the Klimontovich equation in the kinetic theory of plasmas. The Klimontovich equation does not possess well-behaved solutions but can be recast in terms of a coupled system of well-behaved moment equations, known as a moment hierarchy. The moment hierarchy is impossible to solve but in the mean field limit of an infinite number of neurons, it reduces to a single well-behaved conservation law for the mean neuron density. For a large but finite system, the moment hierarchy can be truncated perturbatively with the inverse system size as a small parameter, but the resulting set of reduced moment equations is still very difficult to solve. However, the entire moment hierarchy can also be re-expressed in terms of a functional probability distribution of the neuron density. The moments can then be computed perturbatively using methods from statistical field theory. Here we derive the complete mean field theory and the lowest order second moment corrections for physiologically relevant quantities. Although we focus on finite-size corrections, our method can be used to compute perturbative expansions in any parameter. One avenue towards understanding how the brain functions is to create computational and mathematical models. However, a human brain has on the order of a hundred billion neurons with a quadrillion synaptic connections. Each neuron is a complex cell composed of multiple compartments hosting a myriad of ions, proteins and other molecules.
Even if computing power continues to increase exponentially, directly simulating all the processes in the brain on a computer is not feasible in the foreseeable future and even if this could be achieved, the resulting simulation may be no simpler to understand than the brain itself. Hence, the need for more tractable models. Historically, systems with many interacting bodies are easier to understand in the two opposite limits of a small number or an infinite number of elements and most of the theoretical efforts in understanding neural networks have been devoted to these two limits. There has been relatively little effort directed to the very relevant but difficult regime of large but finite networks. In this paper, we introduce a new formalism that borrows from the methods of many-body statistical physics to analyze finite size effects in spiking neural networks.
Affiliation(s)
- Michael A. Buice
- Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, Maryland, United States of America
- Carson C. Chow
- Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, Maryland, United States of America
22
Buice MA, Chow CC. Effective stochastic behavior in dynamical systems with incomplete information. Phys Rev E Stat Nonlin Soft Matter Phys 2011; 84:051120. [PMID: 22181382 PMCID: PMC3457716 DOI: 10.1103/physreve.84.051120] [Received: 07/25/2011] [Revised: 09/12/2011] [Indexed: 05/26/2023]
Abstract
Complex systems are generally analytically intractable and difficult to simulate. We introduce a method for deriving an effective stochastic equation for a high-dimensional deterministic dynamical system for which some portion of the configuration is not precisely specified. We use a response function path integral to construct an equivalent distribution for the stochastic dynamics from the distribution of the incomplete information. We apply this method to the Kuramoto model of coupled oscillators to derive an effective stochastic equation for a single oscillator interacting with a bath of oscillators and also outline the procedure for other systems.
Affiliation(s)
- Michael A Buice
- Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, Maryland 20892, USA
23
Abstract
Population rate or activity equations are the foundation of a common approach to modeling for neural networks. These equations provide mean field dynamics for the firing rate or activity of neurons within a network given some connectivity. The shortcoming of these equations is that they take into account only the average firing rate, while leaving out higher-order statistics like correlations between firing. A stochastic theory of neural networks that includes statistics at all orders was recently formulated. We describe how this theory yields a systematic extension to population rate equations by introducing equations for correlations and appropriate coupling terms. Each level of the approximation yields closed equations; they depend only on the mean and specific correlations of interest, without an ad hoc criterion for doing so. We show in an example of an all-to-all connected network how our system of generalized activity equations captures phenomena missed by the mean field rate equations alone.
24

25
Buice MA, Chow CC. Correlations, fluctuations, and stability of a finite-size network of coupled oscillators. Phys Rev E Stat Nonlin Soft Matter Phys 2007; 76:031118. [PMID: 17930210 DOI: 10.1103/physreve.76.031118] [Received: 04/17/2007] [Indexed: 05/25/2023]
Abstract
The incoherent state of the Kuramoto model of coupled oscillators exhibits marginal modes in mean field theory. We demonstrate that corrections due to finite size effects render these modes stable in the subcritical case, i.e., when the population is not synchronous. This demonstration is facilitated by the construction of a nonequilibrium statistical field theoretic formulation of a generic model of coupled oscillators. This theory is consistent with previous results. In the all-to-all case, the fluctuations in this theory are due completely to finite size corrections, which can be calculated in an expansion in 1/N, where N is the number of oscillators. The N → ∞ limit of this theory is what is traditionally called mean field theory for the Kuramoto model.
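The model referenced here is the Kuramoto system, θ̇_i = ω_i + (K/N) Σ_j sin(θ_j − θ_i). A minimal simulation (our own sketch, not the paper's code; the coupling K, time step, and Gaussian frequency spread are illustrative choices) shows the finite-size behavior the abstract describes: in the subcritical, incoherent regime the order parameter does not vanish but fluctuates at a level that shrinks roughly like 1/√N:

```python
import numpy as np

def kuramoto_r(N, K=0.5, steps=2000, dt=0.05, seed=0):
    """Euler-simulate N Kuramoto oscillators and return the time-averaged
    order parameter r = |mean_j exp(i theta_j)| over the second half."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(size=N)               # Gaussian frequency spread
    theta = rng.uniform(0, 2 * np.pi, N)
    rs = []
    for t in range(steps):
        z = np.exp(1j * theta).mean()        # complex order parameter
        # Identity: (K/N) sum_j sin(theta_j - theta_i)
        #         = K |z| sin(arg(z) - theta_i)
        theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
        if t > steps // 2:
            rs.append(np.abs(z))
    return np.mean(rs)

# Subcritical K: mean field predicts r = 0, but finite N leaves
# residual coherence that shrinks as the population grows.
for N in (100, 1000):
    print(N, kuramoto_r(N))
```

Mean field theory alone would report r = 0 for any subcritical K; the residual, N-dependent value is exactly the finite-size correction the field-theoretic 1/N expansion computes.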
Affiliation(s)
- Michael A Buice
- Laboratory of Biological Modeling, NIDDK, NIH, Bethesda, Maryland 20892, USA
26
Buice MA, Cowan JD. Field-theoretic approach to fluctuation effects in neural networks. Phys Rev E Stat Nonlin Soft Matter Phys 2007; 75:051919. [PMID: 17677110 DOI: 10.1103/physreve.75.051919] [Received: 10/31/2006] [Revised: 01/26/2007] [Indexed: 05/08/2023]
Abstract
A well-defined stochastic theory for neural activity, which permits the calculation of arbitrary statistical moments and equations governing them, is a potentially valuable tool for theoretical neuroscience. We produce such a theory by analyzing the dynamics of neural activity using field theoretic methods for nonequilibrium statistical processes. Assuming that neural network activity is Markovian, we construct the effective spike model, which describes both neural fluctuations and response. This analysis leads to a systematic expansion of corrections to mean field theory, which for the effective spike model is a simple version of the Wilson-Cowan equation. We argue that neural activity governed by this model exhibits a dynamical phase transition which is in the universality class of directed percolation. More general models (which may incorporate refractoriness) can exhibit other universality classes, such as dynamic isotropic percolation. Because of the extremely high connectivity in typical networks, it is expected that higher-order terms in the systematic expansion are small for experimentally accessible measurements, and thus, consistent with measurements in neocortical slice preparations, we expect mean field exponents for the transition. We provide a quantitative criterion for the relative magnitude of each term in the systematic expansion, analogous to the Ginzburg criterion. Experimental identification of dynamic universality classes in vivo is an outstanding and important question for neuroscience.
Affiliation(s)
- Michael A Buice
- NIH/NIDDK/LBM, Building 12A Room 4007, MSC 5621, Bethesda, MD 20892, USA.
27
Abstract
We present an approach for the description of fluctuations that are due to finite system size induced correlations in the Kuramoto model of coupled oscillators. We construct a hierarchy for the moments of the density of oscillators that is analogous to the Bogoliubov-Born-Green-Kirkwood-Yvon hierarchy in the kinetic theory of plasmas and gases. To calculate the lowest order system size effect, we truncate this hierarchy at second order and solve the resulting closed equations for the two-oscillator correlation function around the incoherent state. We use this correlation function to compute the fluctuations of the order parameter, including the effect of transients, and compare this computation with numerical simulations.
Affiliation(s)
- Eric J Hildebrand
- Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, Pennsylvania, USA