1. Johnsen KA, Cruzado NA, Menard ZC, Willats AA, Charles AS, Markowitz JE, Rozell CJ. Bridging model and experiment in systems neuroscience with Cleo: the Closed-Loop, Electrophysiology, and Optophysiology simulation testbed. bioRxiv 2024:2023.01.27.525963. [PMID: 39026717] [PMCID: PMC11257437] [DOI: 10.1101/2023.01.27.525963]
Abstract
Systems neuroscience has experienced an explosion of new tools for reading and writing neural activity, enabling exciting new experiments such as all-optical or closed-loop control that effect powerful causal interventions. At the same time, improved computational models are capable of reproducing behavior and neural activity with increasing fidelity. Unfortunately, these advances have drastically increased the complexity of integrating different lines of research, resulting in the missed opportunities and untapped potential of suboptimal experiments. Experiment simulation can help bridge this gap, allowing model and experiment to better inform each other by providing a low-cost testbed for experiment design, model validation, and methods engineering. Specifically, this can be achieved by incorporating the simulation of the experimental interface into our models, but no existing tool integrates optogenetics, two-photon calcium imaging, electrode recording, and flexible closed-loop processing with neural population simulations. To address this need, we have developed Cleo: the Closed-Loop, Electrophysiology, and Optophysiology experiment simulation testbed. Cleo is a Python package enabling injection of recording and stimulation devices as well as closed-loop control with realistic latency into a Brian spiking neural network model. It is the only publicly available tool currently supporting two-photon and multi-opsin/wavelength optogenetics. To facilitate adoption and extension by the community, Cleo is open-source, modular, tested, and documented, and can export results to various data formats. Here we describe the design and features of Cleo, validate output of individual components and integrated experiments, and demonstrate its utility for advancing optogenetic techniques in prospective experiments using previously published systems neuroscience models.
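Cleo's closed-loop machinery plugs into Brian models, but the core idea the abstract describes (a controller acting on measurements that arrive with realistic latency) can be illustrated in plain Python. The sketch below is purely conceptual and does not use Cleo's actual API; the plant dynamics, gain, and latency values are invented for the demo.

```python
from collections import deque

def run_closed_loop(target=10.0, steps=500, dt=1.0, latency_steps=5, gain=0.005):
    """Drive a toy firing-rate variable toward a target using integral
    feedback computed on delayed measurements, mimicking the control
    latency that a closed-loop experiment simulator must model."""
    rate, stim = 0.0, 0.0
    in_transit = deque([0.0] * latency_steps)  # measurements still "in the wire"
    for _ in range(steps):
        rate += dt * (-0.1 * rate + stim)   # leaky dynamics plus stimulation drive
        in_transit.append(rate)
        delayed = in_transit.popleft()      # controller sees an old measurement
        stim += gain * (target - delayed)   # integral control on the delayed error
    return rate

final_rate = run_closed_loop()
```

With this small gain the loop stays stable despite the five-step delay and the rate settles at the target; cranking the gain up would let the latency destabilize the loop, which is exactly the kind of effect a latency-aware testbed lets you probe before running the real experiment.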
Affiliation(s)
- Kyle A. Johnsen
- Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA, USA
- Zachary C. Menard
- Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA, USA
- Adam A. Willats
- Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA, USA
- Adam S. Charles
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore, MD, USA
- Jeffrey E. Markowitz
- Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, GA, USA
2. Elbaz M, Buterman R, Ezra Tsur E. NeuroConstruct-based implementation of structured-light stimulated retinal circuitry. BMC Neurosci 2020; 21:28. [PMID: 32580768] [PMCID: PMC7315481] [DOI: 10.1186/s12868-020-00578-0]
Abstract
BACKGROUND Retinal circuitry provides a fundamental window to neural networks, featuring widely investigated visual phenomena ranging from direction selectivity to fast detection of approaching motion. As the divide between experimental and theoretical visual neuroscience is fading, neuronal modeling has proven to be important for retinal research. In neuronal modeling a delicate balance is maintained between bio-plausibility and model tractability, giving rise to myriad modeling frameworks. One biologically detailed framework for neuro modeling is NeuroConstruct, which facilitates the creation, visualization and analysis of neural networks in 3D. RESULTS Here, we extended NeuroConstruct to support the generation of structured visual stimuli, to feature different synaptic dynamics, to allow for heterogeneous synapse distribution and to enable rule-based synaptic connectivity between cell populations. We utilized this framework to demonstrate a simulation of a dense plexus of biologically realistic and morphologically detailed starburst amacrine cells. The amacrine cells were connected to a ganglion cell and stimulated with expanding and collapsing rings of light. CONCLUSIONS This framework provides a powerful toolset for the investigation of the yet elusive underlying mechanisms of retinal computations such as direction selectivity. Particularly, we showcased the way NeuroConstruct can be extended to support advanced field-specific neuro-modeling.
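To give a flavor of what "structured visual stimuli" such as expanding rings of light mean in practice, here is a pure-Python sketch of such a stimulus on a pixel grid. The grid size, expansion speed, and ring width below are made-up illustrative parameters; the actual stimulus generation described in the paper lives inside the extended NeuroConstruct toolchain, not in this code.

```python
def ring_stimulus(t, grid=21, r0=1.0, speed=2.0, width=1.5):
    """Boolean mask (grid x grid) lit where an expanding ring of light
    of radius r0 + speed*t and the given width falls at time t."""
    center = (grid - 1) / 2
    radius = r0 + speed * t
    mask = []
    for y in range(grid):
        row = []
        for x in range(grid):
            dist = ((x - center) ** 2 + (y - center) ** 2) ** 0.5
            row.append(abs(dist - radius) <= width / 2)
        mask.append(row)
    return mask

lit = lambda t: sum(sum(row) for row in ring_stimulus(t))
```

Playing the mask forward in time yields an expanding annulus (and reversing time a collapsing one), the kind of spatiotemporal pattern used to probe direction selectivity in the modeled starburst amacrine cell plexus.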
Affiliation(s)
- Miriam Elbaz
- Jerusalem College of Technology, Jerusalem, Israel
- Elishai Ezra Tsur
- Jerusalem College of Technology, Jerusalem, Israel; Neuro-Biomorphic Engineering Lab, Department of Mathematics and Computer Science, Open University of Israel, Raanana, Israel
3. Stimberg M, Brette R, Goodman DFM. Brian 2, an intuitive and efficient neural simulator. eLife 2019; 8:e47314. [PMID: 31429824] [PMCID: PMC6786860] [DOI: 10.7554/eLife.47314]
Abstract
Brian 2 allows scientists to simply and efficiently simulate spiking neural network models. These models can feature novel dynamical equations, their interactions with the environment, and experimental protocols. To preserve high performance when defining new models, most simulators offer two options: low-level programming or description languages. The first option requires expertise, is prone to errors, and is problematic for reproducibility. The second option cannot describe all aspects of a computational experiment, such as the potentially complex logic of a stimulation protocol. Brian addresses these issues using runtime code generation. Scientists write code with simple and concise high-level descriptions, and Brian transforms them into efficient low-level code that can run interleaved with their code. We illustrate this with several challenging examples: a plastic model of the pyloric network, a closed-loop sensorimotor model, a programmatic exploration of a neuron model, and an auditory model with real-time input.
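The "runtime code generation" idea can be caricatured in a few lines: take a model equation written as a string and compile it into an update function. Brian's real pipeline parses units and emits optimized C++/Cython rather than calling `eval`, and the leaky-integrator equation below is an arbitrary example invented for this sketch, not code from the paper.

```python
import math

def make_stepper(equation, dt):
    """'Compile' the right-hand side of an equation string like
    'dv/dt = (I - v)/tau' into a forward-Euler update function,
    a toy version of Brian 2's code-generation pipeline."""
    rhs = equation.split("=", 1)[1].strip()
    code = compile(rhs, "<equation>", "eval")
    def step(state):
        # evaluate the user's expression against the current state variables
        state["v"] += dt * eval(code, {"math": math}, state)
    return step

step = make_stepper("dv/dt = (I - v)/tau", dt=0.1)
state = {"v": 0.0, "I": 2.0, "tau": 10.0}
for _ in range(1000):
    step(state)
# v relaxes toward I
```

The payoff of the real approach is that the scientist keeps the readable equation string while the generated low-level loop runs at compiled speed, and arbitrary experiment logic can be interleaved between simulation runs.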
Affiliation(s)
- Marcel Stimberg
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France
- Romain Brette
- Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France
- Dan FM Goodman
- Department of Electrical and Electronic Engineering, Imperial College London, London, United Kingdom
4. Khalil R, Karim AA, Khedr E, Moftah M, Moustafa AA. Dynamic Communications Between GABAA Switch, Local Connectivity, and Synapses During Cortical Development: A Computational Study. Front Cell Neurosci 2018; 12:468. [PMID: 30618625] [PMCID: PMC6304749] [DOI: 10.3389/fncel.2018.00468]
Abstract
Several factors regulate cortical development, such as changes in local connectivity and the influences of dynamical synapses. In this study, we simulated various factors affecting the regulation of neural network activity during cortical development. Previous studies have shown that during early cortical development, the reversal potential of GABAA shifts from depolarizing to hyperpolarizing. Here we provide the first integrative computational model to simulate the combined effects of these factors in a unified framework (building on our prior work: Khalil et al., 2017a,b). In the current study, we extend our model to monitor firing activity in response to the excitatory action of GABAA. Specifically, we created a spiking neural network model that included certain biophysical parameters for lateral connectivity (distance between adjacent neurons) and nearby local connectivity (complex connections involving those between neuronal groups). We simulated different network scenarios (for immature and mature conditions) based on these biophysical parameters. Then, we implemented two forms of short-term synaptic plasticity (STP): depression and facilitation. Each form has two distinct kinds according to its synaptic time constant value. Finally, in both sets of networks, we compared firing rate activity responses before and after simulating dynamical synapses. Based on simulation results, we found that the modulation effect of dynamical synapses for evaluating and shaping the firing activity of the neural network is strongly dependent on the physiological state of GABAA. Moreover, the STP mechanism acts differently in every network scenario, mirroring the crucial modulating roles of these critical parameters during cortical development. Clinical implications for pathological alterations of GABAergic signaling in neurological and psychiatric disorders are discussed.
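The "GABAA switch" in the abstract boils down to the sign of the ohmic synaptic current I = g * (V - E_GABA): with the same conductance, the synapse depolarizes or hyperpolarizes depending on where the chloride reversal potential sits relative to rest. The millivolt values below are illustrative round numbers consistent with this standard physiology, not parameters taken from the paper.

```python
def gabaa_current(v, g=1.0, e_gaba=-45.0):
    """Ohmic GABA-A current; positive = outward (hyperpolarizing),
    negative = inward (depolarizing)."""
    return g * (v - e_gaba)

V_REST = -65.0  # mV, typical resting potential
# Immature network: E_GABA sits above rest, so GABA-A is depolarizing.
immature = gabaa_current(V_REST, e_gaba=-45.0)
# Mature network: the chloride switch pulls E_GABA below rest; same
# conductance, now hyperpolarizing.
mature = gabaa_current(V_REST, e_gaba=-75.0)
```

The model in the paper explores how this single sign flip interacts with connectivity and STP; the point of the sketch is just that the flip requires no change in the synapse itself, only in the reversal potential.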
Affiliation(s)
- Radwa Khalil
- Department of Psychology and Methods, Jacobs University Bremen, Bremen, Germany
- Ahmed A Karim
- Department of Psychology and Methods, Jacobs University Bremen, Bremen, Germany; University Clinic of Psychiatry and Psychotherapy, Tübingen, Germany
- Eman Khedr
- Department of Neuropsychiatry, Faculty of Medicine, Assiut University, Assiut, Egypt
- Marie Moftah
- Zoology Department, Faculty of Science, Alexandria University, Alexandria, Egypt
- Ahmed A Moustafa
- MARCS Institute for Brain and Behaviour, Western Sydney University, Sydney, NSW, Australia; Department of Social Sciences, College of Arts and Sciences, Qatar University, Doha, Qatar
5. Gleeson P, Lung D, Grosu R, Hasani R, Larson SD. c302: a multiscale framework for modelling the nervous system of Caenorhabditis elegans. Philos Trans R Soc Lond B Biol Sci 2018; 373:20170379. [PMID: 30201842] [PMCID: PMC6158223] [DOI: 10.1098/rstb.2017.0379]
Abstract
The OpenWorm project has the ambitious goal of producing a highly detailed in silico model of the nematode Caenorhabditis elegans. A crucial part of this work will be a model of the nervous system encompassing all known cell types and connections. The appropriate level of biophysical detail required in the neuronal model to reproduce observed high-level behaviours in the worm has yet to be determined. For this reason, we have developed a framework, c302, that allows different instances of neuronal networks to be generated incorporating varying levels of anatomical and physiological detail, which can be investigated and refined independently or linked to other tools developed in the OpenWorm modelling toolchain. This article is part of a discussion meeting issue ‘Connectome to behaviour: modelling C. elegans at cellular resolution’.
Affiliation(s)
- Padraig Gleeson
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- David Lung
- Cyber-Physical Systems, Technische Universität Wien, Vienna, Austria
- Radu Grosu
- Cyber-Physical Systems, Technische Universität Wien, Vienna, Austria
- Ramin Hasani
- Cyber-Physical Systems, Technische Universität Wien, Vienna, Austria
6. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow. Neuroinformatics 2018; 15:333-342. [PMID: 28770487] [DOI: 10.1007/s12021-017-9337-x]
Abstract
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures, our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
7. Khalil R, Moftah MZ, Moustafa AA. The effects of dynamical synapses on firing rate activity: a spiking neural network model. Eur J Neurosci 2017; 46:2445-2470. [PMID: 28921686] [DOI: 10.1111/ejn.13712]
Abstract
Accumulating evidence relates the fine-tuning of synaptic maturation and regulation of neural network activity to several key factors, including GABAA signaling and a lateral spread length between neighboring neurons (i.e., local connectivity). Furthermore, a number of studies consider short-term synaptic plasticity (STP) as an essential element in the instant modification of synaptic efficacy in the neuronal network and in modulating responses to sustained ranges of external Poisson input frequency (IF). Nevertheless, evaluating the firing activity in response to the dynamical interaction between STP (triggered by ranges of IF) and these key parameters in vitro remains elusive. Therefore, we designed a spiking neural network (SNN) model in which we incorporated the following parameters: local density of arbor essences and a lateral spread length between neighboring neurons. We also created several network scenarios based on these key parameters. Then, we implemented two classes of STP: (1) short-term synaptic depression (STD) and (2) short-term synaptic facilitation (STF). Each class has two differential forms based on the parametric value of its synaptic time constant (either for depressing or facilitating synapses). Lastly, we compared the neural firing responses before and after the treatment with STP. We found that dynamical synapses (STP) have a critical differential role on evaluating and modulating the firing rate activity in each network scenario. Moreover, we investigated the impact of changing the balance between excitation (E) and inhibition (I) on stabilizing this firing activity.
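A standard way to express the two STP classes the authors describe is the Tsodyks-Markram scheme, sketched below in plain Python. The time constants and U values are illustrative, and this particular update ordering (facilitate, then release) is one common variant of the model rather than the paper's exact implementation.

```python
import math

def tm_efficacies(spike_times, tau_rec, tau_facil, U):
    """Relative synaptic efficacy u*x at each spike of a train under
    Tsodyks-Markram dynamics: x = available resources (depression),
    u = utilization / release probability (facilitation)."""
    x, u = 1.0, 0.0
    last = None
    out = []
    for t in spike_times:
        if last is not None:
            dt = t - last
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
            u *= math.exp(-dt / tau_facil)                 # facilitation decays
        u += U * (1.0 - u)   # per-spike jump in release probability
        out.append(u * x)    # fraction of resources released by this spike
        x -= u * x           # released resources become unavailable
        last = t
    return out

train = [i * 50.0 for i in range(10)]  # 20 Hz spike train, times in ms
depressing = tm_efficacies(train, tau_rec=800.0, tau_facil=20.0, U=0.5)
facilitating = tm_efficacies(train, tau_rec=100.0, tau_facil=1000.0, U=0.1)
```

With slow resource recovery and fast facilitation decay the efficacy runs down across the train (STD); with the opposite time constants it grows (STF), which is the qualitative contrast the paper's network scenarios exploit.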
Affiliation(s)
- Radwa Khalil
- Institute for Pharmacology and Toxicology, Faculty of Medicine, Otto-von-Guericke University, Magdeburg, Germany
- Marie Z Moftah
- Zoology Department, Faculty of Science, Alexandria University, Alexandria, Egypt
- Ahmed A Moustafa
- MARCS Institute for Brain and Behaviour, Western Sydney University, Sydney, NSW, Australia
8. McDougal RA, Bulanova AS, Lytton WW. Reproducibility in Computational Neuroscience Models and Simulations. IEEE Trans Biomed Eng 2016; 63:2021-2035. [PMID: 27046845] [PMCID: PMC5016202] [DOI: 10.1109/TBME.2016.2539602]
Abstract
OBJECTIVE Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. METHODS Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. RESULTS Building on these standard practices, model-sharing sites and tools have been developed that fit into several categories: 1) standardized neural simulators; 2) shared computational resources; 3) declarative model descriptors, ontologies, and standardized annotations; and 4) model-sharing repositories and sharing standards. CONCLUSION A number of complementary innovations have been proposed to enhance sharing, transparency, and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation, and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. SIGNIFICANCE Model management will become increasingly important as multiscale models become larger, more detailed, and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment.
9. Hinkel G, Groenda H, Krach S, Vannucci L, Denninger O, Cauli N, Ulbrich S, Roennau A, Falotico E, Gewaltig MO, Knoll A, Dillmann R, Laschi C, Reussner R. A Framework for Coupled Simulations of Robots and Spiking Neuronal Networks. J Intell Robot Syst 2016. [DOI: 10.1007/s10846-016-0412-6]
10. Lytton WW, Seidenstein AH, Dura-Bernal S, McDougal RA, Schürmann F, Hines ML. Simulation Neurotechnologies for Advancing Brain Research: Parallelizing Large Networks in NEURON. Neural Comput 2016; 28:2063-2090. [PMID: 27557104] [DOI: 10.1162/NECO_a_00876]
Abstract
Large multiscale neuronal network simulations are of increasing value as more big data are gathered about brain wiring and organization under the auspices of a current major research initiative, such as Brain Research through Advancing Innovative Neurotechnologies. The development of these models requires new simulation technologies. We describe here the current use of the NEURON simulator with message passing interface (MPI) for simulation in the domain of moderately large networks on commonly available high-performance computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike-passing paradigm, and postsimulation data storage and data management approaches. Using the Neuroscience Gateway, a portal for computational neuroscience that provides access to large HPCs, we benchmark simulations of neuronal networks of different sizes (500-100,000 cells), and using different numbers of nodes (1-256). We compare three types of networks, composed of either Izhikevich integrate-and-fire neurons (I&F), single-compartment Hodgkin-Huxley (HH) cells, or a hybrid network with half of each. Results show simulation run time increased approximately linearly with network size and decreased almost linearly with the number of nodes. Networks with I&F neurons were faster than HH networks, although differences were small since all tested cells were point neurons with a single compartment.
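In NEURON's MPI scheme each cell gets a global identifier (gid), and a round-robin assignment is the usual way to spread cells across ranks; spikes are then exchanged between nodes by gid. The helper below reproduces only that partitioning arithmetic in plain Python as a sketch; a real model would do the equivalent through `h.ParallelContext`.

```python
def round_robin_gids(n_cells, n_hosts):
    """Assign gids 0..n_cells-1 to hosts round-robin, the common
    load-balancing idiom for point-neuron networks: gid g lives on
    rank g % n_hosts."""
    return {rank: list(range(rank, n_cells, n_hosts))
            for rank in range(n_hosts)}

parts = round_robin_gids(n_cells=10, n_hosts=4)
```

For homogeneous point neurons like those benchmarked here, round-robin keeps per-rank cell counts within one of each other, which is why run time falls almost linearly as nodes are added until communication overhead dominates.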
Affiliation(s)
- William W Lytton
- Departments of Physiology, Pharmacology, Biomedical Engineering, and Neurology, SUNY Downstate Medical Center, Brooklyn 11023, New York, and Kings County Hospital Center, Brooklyn 11203, New York, U.S.A.
- Alexandra H Seidenstein
- Departments of Physiology, Pharmacology, Biomedical Engineering, and Neurology, SUNY Downstate Medical Center, Brooklyn, NY 11023, and Department of Chemical and Biomolecular Engineering, Tandon School of Engineering, New York University, Brooklyn, NY 11201, U.S.A.
- Salvador Dura-Bernal
- Departments of Physiology, Pharmacology, Biomedical Engineering, and Neurology, SUNY Downstate Medical Center, Brooklyn, NY 11023, U.S.A.
- Robert A McDougal
- Department of Neuroscience, Yale University, New Haven, CT 06520, U.S.A.
- Felix Schürmann
- Blue Brain Project, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, 1015 Geneva, Switzerland
- Michael L Hines
- Blue Brain Project, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, 1015 Geneva, Switzerland, and Department of Neuroscience, Yale University, New Haven, CT 06520, U.S.A.
11. Muller E, Bednar JA, Diesmann M, Gewaltig MO, Hines M, Davison AP. Python in neuroscience. Front Neuroinform 2015; 9:11. [PMID: 25926788] [PMCID: PMC4396193] [DOI: 10.3389/fninf.2015.00011]
Affiliation(s)
- Eilif Muller
- Center for Brain Simulation, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- James A Bednar
- Institute for Adaptive and Neural Computation, University of Edinburgh, Edinburgh, UK
- Markus Diesmann
- Jülich Research Center and Jülich Aachen Research Alliance, Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Jülich, Germany; Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany; Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Marc-Oliver Gewaltig
- Center for Brain Simulation, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Michael Hines
- Department of Neurobiology, Yale University, New Haven, CT, USA
- Andrew P Davison
- Neuroinformatics Group, Unité de Neurosciences, Information et Complexité, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France
12. Subramaniyam S, Solinas S, Perin P, Locatelli F, Masetto S, D'Angelo E. Computational modeling predicts the ionic mechanism of late-onset responses in unipolar brush cells. Front Cell Neurosci 2014; 8:237. [PMID: 25191224] [PMCID: PMC4138490] [DOI: 10.3389/fncel.2014.00237]
Abstract
Unipolar Brush Cells (UBCs) have been suggested to play a critical role in cerebellar functioning, yet the corresponding cellular mechanisms remain poorly understood. UBCs have recently been reported to generate, in addition to early-onset glutamate receptor-dependent synaptic responses, a late-onset response (LOR) composed of a slow depolarizing ramp followed by a spike burst (Locatelli et al., 2013). The LOR activates as a consequence of synaptic activity and involves an intracellular cascade modulating H- and TRP-current gating. In order to assess the LOR mechanisms, we have developed a UBC multi-compartmental model (including soma, dendrite, initial segment, and axon) incorporating biologically realistic representations of ionic currents and a cytoplasmic coupling mechanism regulating TRP and H channel gating. The model finely reproduced UBC responses to current injection, including a burst triggered by a low-threshold spike (LTS) sustained by CaLVA currents, a persistent discharge sustained by CaHVA currents, and a rebound burst following hyperpolarization sustained by H- and CaLVA-currents. Moreover, the model predicted that H- and TRP-current regulation was necessary and sufficient to generate the LOR and its dependence on the intensity and duration of mossy fiber activity. Therefore, the model showed that, using a basic set of ionic channels, UBCs generate a rich repertoire of bursts, which could effectively implement tunable delay-lines in the local microcircuit.
Affiliation(s)
- Sathyaa Subramaniyam
- Neurophysiology Unit, Department of Brain and Behavioral Science, University of Pavia, Pavia, Italy; Consorzio Interuniversitario per le Scienze Fisiche della Materia (CNISM), Pavia, Italy
- Sergio Solinas
- Neurophysiology Unit, Brain Connectivity Center, Istituto Neurologico IRCCS C. Mondino, Pavia, Italy
- Paola Perin
- Neurophysiology Unit, Department of Brain and Behavioral Science, University of Pavia, Pavia, Italy
- Francesca Locatelli
- Neurophysiology Unit, Department of Brain and Behavioral Science, University of Pavia, Pavia, Italy
- Sergio Masetto
- Neurophysiology Unit, Department of Brain and Behavioral Science, University of Pavia, Pavia, Italy
- Egidio D'Angelo
- Neurophysiology Unit, Department of Brain and Behavioral Science, University of Pavia, Pavia, Italy; Neurophysiology Unit, Brain Connectivity Center, Istituto Neurologico IRCCS C. Mondino, Pavia, Italy
13. Pecevski D, Kappel D, Jonke Z. NEVESIM: event-driven neural simulation framework with a Python interface. Front Neuroinform 2014; 8:70. [PMID: 25177291] [PMCID: PMC4132371] [DOI: 10.3389/fninf.2014.00070]
Abstract
NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
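The event-driven strategy can be boiled down to a priority queue of spike events: the simulator jumps from event to event instead of advancing a fixed-step clock, and network-level event routing is decoupled from neuron internals. The toy network below (instantaneous integration, fixed delay, reset on threshold) is a deliberately minimal illustration of that idea, not NEVESIM's actual neuron model or API.

```python
import heapq

def simulate(spikes, synapses, delay=1.0, threshold=1.0, t_max=10.0):
    """Event-driven toy network. Each queue entry is a spike (time, neuron);
    targets integrate the synaptic weight instantly and schedule their own
    spike after `delay` once their potential crosses threshold."""
    queue = list(spikes)            # externally injected spikes
    heapq.heapify(queue)
    potential = {}
    fired = []
    while queue:
        t, src = heapq.heappop(queue)   # next event in time order
        if t > t_max:
            break
        fired.append((t, src))
        for tgt, w in synapses.get(src, []):
            potential[tgt] = potential.get(tgt, 0.0) + w
            if potential[tgt] >= threshold:
                potential[tgt] = 0.0                      # reset on firing
                heapq.heappush(queue, (t + delay, tgt))   # schedule its spike
    return fired

log = simulate(spikes=[(0.0, 0), (0.5, 0)],
               synapses={0: [(1, 0.6)], 1: [(2, 1.0)]})
```

Note that between events nothing is computed at all, which is where event-driven simulators gain their efficiency for sparsely active networks.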
Affiliation(s)
- Dejan Pecevski
- Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria
- David Kappel
- Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Zeno Jonke
- Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria
14. Guzman SJ, Schlögl A, Schmidt-Hieber C. Stimfit: quantifying electrophysiological data with Python. Front Neuroinform 2014; 8:16. [PMID: 24600389] [PMCID: PMC3931263] [DOI: 10.3389/fninf.2014.00016]
Abstract
Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals.
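As a minimal example of the kind of routine described (detecting spontaneously occurring events in a recording), here is an upward threshold-crossing detector with a refractory gap in plain Python. Stimfit's own detectors are more sophisticated, including template-matching criteria; this sketch is only the simplest member of that family, with invented parameters and a synthetic trace.

```python
def detect_events(trace, threshold, min_gap=5):
    """Return sample indices where the trace crosses `threshold` upward,
    ignoring crossings closer than `min_gap` samples to the last event
    (a crude refractory period to avoid double-counting one event)."""
    events, last = [], -min_gap
    for i in range(1, len(trace)):
        if trace[i - 1] < threshold <= trace[i] and i - last >= min_gap:
            events.append(i)
            last = i
    return events

# Synthetic trace with three upstrokes; the third is within the refractory
# gap of the second and is suppressed.
trace = [0, 0, 2, 2, 0, 0, 0, 2, 0, 0, 2]
hits = detect_events(trace, threshold=1.0)
```

Real detectors additionally fit event kinetics (rise time, decay) around each candidate index, which is exactly what the validated algorithms in Stimfit provide on top of detection.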
Affiliation(s)
- Segundo J Guzman
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- Alois Schlögl
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- Christoph Schmidt-Hieber
- Wolfson Institute for Biomedical Research, University College London, London, UK; Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
15. McDougal RA, Hines ML, Lytton WW. Reaction-diffusion in the NEURON simulator. Front Neuroinform 2013; 7:28. [PMID: 24298253] [PMCID: PMC3828620] [DOI: 10.3389/fninf.2013.00028]
Abstract
In order to support research on the role of cell biological principles (genomics, proteomics, signaling cascades and reaction dynamics) on the dynamics of neuronal response in health and disease, NEURON's Reaction-Diffusion (rxd) module in Python provides specification and simulation for these dynamics, coupled with the electrophysiological dynamics of the cell membrane. Arithmetic operations on species and parameters are overloaded, allowing arbitrary reaction formulas to be specified using Python syntax. These expressions are then transparently compiled into bytecode that uses NumPy for fast vectorized calculations. At each time step, rxd combines NEURON's integrators with SciPy's sparse linear algebra library.
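The "overloaded arithmetic" mechanism is worth seeing concretely. The miniature classes below (invented for this note, not rxd's real types) show how multiplying species objects builds an expression tree that is evaluated later; rxd applies the same trick but compiles the resulting tree to vectorized NumPy bytecode instead of walking it interpretively.

```python
class Expr:
    """Base class whose overloaded operators build an expression tree,
    mimicking how a reaction rate written as `2 * ca * buf` in Python
    becomes a deferred, compilable formula."""
    def __mul__(self, other):
        return BinOp("*", self, other)
    def __rmul__(self, other):
        return BinOp("*", other, self)
    def __add__(self, other):
        return BinOp("+", self, other)

class Species(Expr):
    def __init__(self, name):
        self.name = name
    def evaluate(self, conc):
        return conc[self.name]      # look up current concentration
    def __repr__(self):
        return self.name

class BinOp(Expr):
    def __init__(self, op, a, b):
        self.op, self.a, self.b = op, a, b
    def evaluate(self, conc):
        ev = lambda x: x.evaluate(conc) if isinstance(x, Expr) else x
        a, b = ev(self.a), ev(self.b)
        return a * b if self.op == "*" else a + b
    def __repr__(self):
        return f"({self.a} {self.op} {self.b})"

ca, buf = Species("ca"), Species("buf")
rate = 2 * ca * buf   # builds a tree; nothing is computed yet
```

Deferring evaluation this way is what lets a simulator inspect, simplify, and compile the user's reaction formula once, then apply it to every spatial compartment at every time step.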
Affiliation(s)
- William W. Lytton
- Department of Physiology and Pharmacology, SUNY Downstate, Brooklyn, NY, USA
- Department of Neurology, SUNY Downstate, Brooklyn, NY, USA
- Kings County Hospital, Brooklyn, NY, USA
16
Majka P, Kowalski JM, Chlodzinska N, Wójcik DK. 3D brain atlas reconstructor service--online repository of three-dimensional models of brain structures. Neuroinformatics 2013; 11:507-18. [PMID: 23943281 PMCID: PMC3824210 DOI: 10.1007/s12021-013-9199-9] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
Brain atlases are important tools of neuroscience. Traditionally prepared as paper books, they increasingly take digital form, which extends their utility. To simplify work with different atlases and to lay the groundwork for universal tools that abstract from the origin of the atlas, efforts are being made to provide common interfaces to these atlases. The 3D Brain Atlas Reconstructor service (3dBARs) described here is a repository of digital representations of different brain atlases in the recently proposed CAF format, together with a repository of 3D models of brain structures. A graphical front-end is provided for creating and viewing the reconstructed models as well as the underlying 2D atlas data. An application programming interface (API) provides programmatic access to the service contents from other websites. For a typical user, 3dBARs offers an accessible way to mine publicly available atlasing data through a convenient browser-based interface, without the need to install extra software. For a developer of services related to brain atlases, 3dBARs supplies mechanisms for enhancing the functionality of other software. The policy of the service is to accept new datasets as delivered by interested parties, and we work with researchers who obtain original data to make them available to the neuroscience community at large. The functionality offered by 3dBARs situates it at the core of present and future general atlasing services, tying it strongly to the global atlasing neuroinformatics infrastructure.
Affiliation(s)
- Piotr Majka
- Nencki Institute of Experimental Biology, 3 Pasteur Street, 02-093, Warsaw, Poland
17
Abstract
In neural network simulators, models are specified according to a language, either specific or based on a general programming language (e.g. Python). There are also ongoing efforts to develop standardized languages, for example NeuroML. When designing these languages, efforts are often focused on expressivity, that is, on maximizing the number of model types that can be described and simulated. I argue that a complementary goal should be to minimize the cognitive effort required on the part of the user to use the language. I try to formalize this notion with the concept of "language entropy", and I propose a few practical guidelines to minimize the entropy of languages for neural simulation.
Affiliation(s)
- Romain Brette
- Laboratoire Psychologie de la Perception, CNRS, Université Paris Descartes, Paris, France.
18
Gold MG. A frontier in the understanding of synaptic plasticity: solving the structure of the postsynaptic density. Bioessays 2012; 34:599-608. [PMID: 22528972 PMCID: PMC3492911 DOI: 10.1002/bies.201200009] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022]
Abstract
The postsynaptic density (PSD) is a massive multi-protein complex whose functions include positioning signalling molecules for induction of long-term potentiation (LTP) and depression (LTD) of synaptic strength. These processes are thought to underlie memory formation. To understand how the PSD coordinates bidirectional synaptic plasticity with different synaptic activation patterns, it is necessary to determine its three-dimensional structure. A structural model of the PSD is emerging from investigation of its molecular composition and connectivity, in addition to structural studies at different levels of resolution. Technical innovations including mass spectrometry of cross-linked proteins and super-resolution light microscopy can drive progress. Integrating different information relating to PSD structure is challenging since the structure is so large and complex. The reconstruction of a PSD subcomplex anchored by AKAP79 exemplifies on a small scale how integration can be achieved. With its entire molecular structure coming into focus, this is a unique opportunity to study the PSD.
Affiliation(s)
- Matthew G Gold
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK.
19
Majka P, Kublik E, Furga G, Wójcik DK. Common atlas format and 3D brain atlas reconstructor: infrastructure for constructing 3D brain atlases. Neuroinformatics 2012; 10:181-97. [PMID: 22227717 PMCID: PMC3325030 DOI: 10.1007/s12021-011-9138-6] [Citation(s) in RCA: 35] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
One of the challenges of modern neuroscience is integrating voluminous data of different modalities derived from a variety of specimens. This task requires a common spatial framework that can be provided by brain atlases. The first atlases were limited to two-dimensional presentation of structural data. Recently, attempts at creating 3D atlases have been made to offer navigation within non-standard anatomical planes and to improve the localization of different types of data within the brain volume. The 3D atlases available so far have been created using frameworks which make it difficult for other researchers to replicate the results. To facilitate reproducible research and data sharing in the field, we propose an SVG-based Common Atlas Format (CAF) to store 2D atlas delineations or other compatible data, and 3D Brain Atlas Reconstructor (3dBAR), software dedicated to the automated reconstruction of three-dimensional brain structures from 2D atlas data. The basic functionality is provided by (1) a set of parsers which translate various atlases from a number of formats into the CAF, and (2) a module generating 3D models from CAF datasets. The whole reconstruction process is reproducible and can easily be configured, tracked, and reviewed, which facilitates fixing errors. Manual corrections can be made when automatic reconstruction is not sufficient. The software was designed to simplify interoperability with other neuroinformatics tools by using open file formats. The content can easily be exchanged at any stage of data processing. The framework allows for the addition of new public or proprietary content.
Affiliation(s)
- Piotr Majka
- Department of Neurophysiology, Nencki Institute of Experimental Biology, 3 Pasteur Street, 02-093, Warsaw, Poland.
20
Grassia F, Buhry L, Lévi T, Tomas J, Destexhe A, Saïghi S. Tunable neuromimetic integrated system for emulating cortical neuron models. Front Neurosci 2011; 5:134. [PMID: 22163213 PMCID: PMC3233664 DOI: 10.3389/fnins.2011.00134] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2011] [Accepted: 11/18/2011] [Indexed: 11/13/2022] Open
Abstract
Many software solutions are currently available for simulating neuron models. Less conventional than software-based systems, hardware-based solutions generally combine digital and analog forms of computation. In previous work, we designed several neuromimetic chips, including the Galway chip used for this paper. These silicon neurons are based on the Hodgkin–Huxley formalism and are optimized to reproduce a large variety of neuron behaviors thanks to tunable parameters. Because of process variation and device mismatch in analog chips, we use a full-custom fitting method in voltage-clamp mode to tune our neuromimetic integrated circuits. By comparing them with experimental electrophysiological data from these cells, we show that the circuits can reproduce the main firing features of cortical cell types. In this paper, we present experimental measurements from our analog neuromimetic integrated circuit, dedicated to cortical neuron simulations, mimicking the four most prominent biological cell classes: fast spiking, regular spiking, intrinsically bursting, and low-threshold spiking neurons. This hardware and software platform will make it possible to improve the hybrid technique, also called "dynamic-clamp," which consists of connecting artificial and biological neurons to study the function of neuronal circuits.
Affiliation(s)
- Filippo Grassia
- Laboratoire d'Intégration du Matériau au Système, UMR CNRS 5218, Université de Bordeaux, Talence, France
21
Gerhard S, Daducci A, Lemkaddem A, Meuli R, Thiran JP, Hagmann P. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes. Front Neuroinform 2011; 5:3. [PMID: 21713110 PMCID: PMC3112315 DOI: 10.3389/fninf.2011.00003] [Citation(s) in RCA: 76] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2011] [Accepted: 05/18/2011] [Indexed: 01/04/2023] Open
Abstract
Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
Affiliation(s)
- Stephan Gerhard
- Signal Processing Laboratory 5, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
22
Democratic population decisions result in robust policy-gradient learning: a parametric study with GPU simulations. PLoS One 2011; 6:e18539. [PMID: 21572529 PMCID: PMC3087717 DOI: 10.1371/journal.pone.0018539] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2010] [Accepted: 03/03/2011] [Indexed: 11/28/2022] Open
Abstract
High performance computing on the Graphics Processing Unit (GPU) is an emerging field driven by the promise of high computational power at a low cost. However, GPU programming is a non-trivial task and moreover architectural limitations raise the question of whether investing effort in this direction may be worthwhile. In this work, we use GPU programming to simulate a two-layer network of Integrate-and-Fire neurons with varying degrees of recurrent connectivity and investigate its ability to learn a simplified navigation task using a policy-gradient learning rule stemming from Reinforcement Learning. The purpose of this paper is twofold. First, we want to support the use of GPUs in the field of Computational Neuroscience. Second, using GPU computing power, we investigate the conditions under which the said architecture and learning rule demonstrate best performance. Our work indicates that networks featuring strong Mexican-Hat-shaped recurrent connections in the top layer, where decision making is governed by the formation of a stable activity bump in the neural population (a "non-democratic" mechanism), achieve mediocre learning results at best. In absence of recurrent connections, where all neurons "vote" independently ("democratic") for a decision via population vector readout, the task is generally learned better and more robustly. Our study would have been extremely difficult on a desktop computer without the use of GPU programming. We present the routines developed for this purpose and show that a speed improvement of 5x up to 42x is provided versus optimised Python code. The higher speed is achieved when we exploit the parallelism of the GPU in the search for learning parameters. This suggests that efficient GPU programming can significantly reduce the time needed for simulating networks of spiking neurons, particularly when multiple parameter configurations are investigated.
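The "democratic" population-vector readout the abstract refers to can be sketched compactly: each neuron votes for its preferred direction, weighted by its firing rate, and the decoded direction is the angle of the summed vector. The tuning curve and parameters below are illustrative choices, not taken from the paper:

```python
import math

# Sketch of a population-vector ("democratic") readout: each neuron
# contributes a vector along its preferred angle, scaled by its rate.
def population_vector(preferred_angles, rates):
    x = sum(r * math.cos(a) for a, r in zip(preferred_angles, rates))
    y = sum(r * math.sin(a) for a, r in zip(preferred_angles, rates))
    return math.atan2(y, x)  # decoded direction in radians

# 8 neurons with evenly spaced preferred directions, cosine-tuned
# (von-Mises-like) to a stimulus at 45 degrees.
angles = [2 * math.pi * i / 8 for i in range(8)]
rates = [math.exp(math.cos(a - math.pi / 4)) for a in angles]
print(round(math.degrees(population_vector(angles, rates)), 1))
```

Because every neuron's vote enters the sum independently, the readout degrades gracefully when individual rates are noisy, which is one intuition for the robustness the study reports for the recurrence-free network.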
23
Kochubey S, Semyanov A, Savtchenko L. Network with shunting synapses as a non-linear frequency modulator. Neural Netw 2011; 24:407-16. [PMID: 21444192 DOI: 10.1016/j.neunet.2011.03.005] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2010] [Revised: 03/01/2011] [Accepted: 03/02/2011] [Indexed: 11/25/2022]
Abstract
The role of 'noisy' excitation in synchronizing interneuron networks with shunting synapses was studied. The excitatory input was simulated as a Poisson pattern of presynaptic conductance with varying frequencies and amplitudes. We find that higher excitation frequencies induce stronger synchronisation of the network. Within the range of 1-10000 Hz, only frequencies between 20 Hz and 200 Hz affected network synchronisation. No detectable network synchronisation was found at excitation frequencies below 20 Hz, and above 200 Hz the network's synchronisation was either almost independent of the external input or fell to zero. Thus the network transformed input signals with frequencies above 20 Hz into output signals at the network's synchronisation frequency, which in our model ranged from 20 to 68 Hz depending on the frequency of the excitatory input. We conclude that a network of interconnected interneurons is capable of converting an asynchronous excitatory input into a synchronous inhibitory output, acting as a frequency amplifier with an amplification coefficient dependent on the number of converging excitatory inputs. Another important result of our work is that the external frequency may affect, in opposite ways, the frequency of the network with shunting synapses, depending on the excitatory synaptic conductance and the magnitude of the leak conductance.