1
Dura-Bernal S, Griffith EY, Barczak A, O'Connell MN, McGinnis T, Moreira JVS, Schroeder CE, Lytton WW, Lakatos P, Neymotin SA. Data-driven multiscale model of macaque auditory thalamocortical circuits reproduces in vivo dynamics. Cell Rep 2023; 42:113378. [PMID: 37925640] [PMCID: PMC10727489] [DOI: 10.1016/j.celrep.2023.113378]
Abstract
We developed a detailed model of macaque auditory thalamocortical circuits, including primary auditory cortex (A1), medial geniculate body (MGB), and thalamic reticular nucleus, using the NEURON simulator and the NetPyNE tool. The A1 model simulates a cortical column with over 12,000 neurons and 25 million synapses, incorporating data on cell-type-specific neuron densities, morphology, and connectivity across six cortical layers. It is reciprocally connected to the MGB thalamus, which includes interneurons as well as core and matrix populations with layer-specific projections to A1. The model simulates multiscale measures, including physiological firing rates, local field potentials (LFPs), current source densities (CSDs), and electroencephalography (EEG) signals. Laminar CSD patterns, during spontaneous activity and in response to broadband noise stimulus trains, mirror experimental findings. Physiological oscillations emerge spontaneously across frequency bands comparable to those recorded in vivo. We elucidate population-specific contributions to observed oscillation events and relate them to firing and presynaptic input patterns. The model offers a quantitative theoretical framework to integrate and interpret experimental data and to predict the underlying cellular and circuit mechanisms.
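Of the multiscale measures listed, the CSD is conventionally derived from the laminar LFP as its negative second spatial derivative across equally spaced electrode channels. The following sketch illustrates that standard estimator only; it is not this paper's analysis code, and the LFP profile, spacing, and conductivity values are invented for illustration.

```python
import numpy as np

# Toy laminar LFP: 10 channels spaced h mm apart, 200 time samples.
h = 0.1                                   # electrode spacing (assumed, mm)
depths = np.arange(10) * h
t = np.linspace(0.0, 1.0, 200)
# Dipole-like sink/source depth profile modulated by a slow oscillation.
profile = np.exp(-(depths - 0.4) ** 2 / 0.02) - np.exp(-(depths - 0.6) ** 2 / 0.02)
lfp = np.outer(profile, np.sin(2 * np.pi * 5 * t))

# Standard CSD estimate: CSD_i = -sigma * (phi_{i-1} - 2 phi_i + phi_{i+1}) / h^2
sigma = 0.3                               # tissue conductivity (assumed, S/m)
csd = -sigma * (lfp[:-2] - 2.0 * lfp[1:-1] + lfp[2:]) / h ** 2
print(csd.shape)                          # interior channels only: (8, 200)
```

The estimate is only defined on interior channels, which is why laminar CSD plots have two fewer rows than the LFP they derive from.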
Affiliation(s)
- Salvador Dura-Bernal
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Health Sciences University, Brooklyn, NY, USA; Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA
- Erica Y Griffith
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Health Sciences University, Brooklyn, NY, USA; Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA
- Annamaria Barczak
- Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA
- Monica N O'Connell
- Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA
- Tammy McGinnis
- Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA
- Joao V S Moreira
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Health Sciences University, Brooklyn, NY, USA
- Charles E Schroeder
- Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA; Departments of Psychiatry and Neurology, Columbia University Medical Center, New York, NY, USA
- William W Lytton
- Department of Physiology and Pharmacology, State University of New York (SUNY) Downstate Health Sciences University, Brooklyn, NY, USA; Kings County Hospital Center, Brooklyn, NY, USA
- Peter Lakatos
- Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA; Department of Psychiatry, NYU Grossman School of Medicine, New York, NY, USA
- Samuel A Neymotin
- Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY, USA; Department of Psychiatry, NYU Grossman School of Medicine, New York, NY, USA
2
Zhang Y, He G, Ma L, Liu X, Hjorth JJJ, Kozlov A, He Y, Zhang S, Kotaleski JH, Tian Y, Grillner S, Du K, Huang T. A GPU-based computational framework that bridges neuron simulation and artificial intelligence. Nat Commun 2023; 14:5798. [PMID: 37723170] [PMCID: PMC10507119] [DOI: 10.1038/s41467-023-41553-7]
Abstract
Biophysically detailed multi-compartment models are powerful tools for exploring the computational principles of the brain and also serve as a theoretical framework for generating algorithms for artificial intelligence (AI) systems. However, their high computational cost severely limits applications in both neuroscience and AI. The major bottleneck in simulating detailed compartmental models is solving the large systems of linear equations that arise at each time step. Here, we present a novel Dendritic Hierarchical Scheduling (DHS) method that markedly accelerates this process. We prove theoretically that the DHS implementation is computationally optimal and accurate. This GPU-based method runs two to three orders of magnitude faster than the classic serial Hines method on a conventional CPU platform. We build the DeepDendrite framework, which integrates the DHS method with the GPU computing engine of the NEURON simulator, and demonstrate applications of DeepDendrite in neuroscience tasks. We investigate how spatial patterns of spine inputs affect neuronal excitability in a detailed human pyramidal neuron model with 25,000 spines. Furthermore, we briefly discuss the potential of DeepDendrite for AI, specifically highlighting its ability to enable efficient training of biophysically detailed models in typical image classification tasks.
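The linear systems in question are quasi-tridiagonal: each compartment couples only to its parent in the morphology tree, so serial Gaussian elimination runs in O(n) with no fill-in (the classic Hines method). A minimal sketch of that serial kernel follows; DHS parallelizes this elimination across GPU threads, which is not shown here, and the example matrix values are invented.

```python
import numpy as np

def hines_solve(parent, d, l, u, rhs):
    """Serial Hines elimination for a tree-structured system.

    Node i (i > 0) couples only to parent[i] < i. Row i holds diagonal d[i]
    and off-diagonal l[i] (coupling to the parent); the parent's row holds
    u[i] (coupling back to i). Sweep leaves-to-root to eliminate, then
    substitute root-to-leaves: O(n), no fill-in.
    """
    d, rhs = d.copy(), rhs.copy()
    n = len(d)
    for i in range(n - 1, 0, -1):        # eliminate child i from its parent's row
        p = parent[i]
        f = u[i] / d[i]
        d[p] -= f * l[i]
        rhs[p] -= f * rhs[i]
    x = np.empty(n)
    x[0] = rhs[0] / d[0]                 # root
    for i in range(1, n):                # back-substitute down the tree
        x[i] = (rhs[i] - l[i] * x[parent[i]]) / d[i]
    return x

# Tiny Y-shaped morphology: 0 is the root; 1 and 3 branch off it; 2 follows 1.
parent = np.array([-1, 0, 1, 0])
d = np.array([4.0, 4.0, 3.0, 3.0])
l = np.array([0.0, -1.0, -1.0, -1.0])    # child -> parent coupling
u = np.array([0.0, -1.0, -1.0, -1.0])    # parent -> child coupling
rhs = np.array([1.0, 2.0, 3.0, 4.0])
x = hines_solve(parent, d, l, u, rhs)

# Check against a dense solve of the same matrix.
A = np.diag(d)
for i in range(1, 4):
    A[i, parent[i]] = l[i]
    A[parent[i], i] = u[i]
assert np.allclose(A @ x, rhs)
```

The key constraint exploited by GPU schemes like DHS is that subtrees below different branch points can be eliminated independently; only the final root solve is inherently serial.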
Affiliation(s)
- Yichen Zhang
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- Gan He
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- Lei Ma
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- Beijing Academy of Artificial Intelligence (BAAI), Beijing, 100084, China
- Xiaofei Liu
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- School of Information Science and Engineering, Yunnan University, Kunming, 650500, China
- J J Johannes Hjorth
- Science for Life Laboratory, School of Electrical Engineering and Computer Science, Royal Institute of Technology KTH, Stockholm, SE-10044, Sweden
- Alexander Kozlov
- Science for Life Laboratory, School of Electrical Engineering and Computer Science, Royal Institute of Technology KTH, Stockholm, SE-10044, Sweden
- Department of Neuroscience, Karolinska Institute, Stockholm, SE-17165, Sweden
- Yutao He
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- Shenjian Zhang
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- Jeanette Hellgren Kotaleski
- Science for Life Laboratory, School of Electrical Engineering and Computer Science, Royal Institute of Technology KTH, Stockholm, SE-10044, Sweden
- Department of Neuroscience, Karolinska Institute, Stockholm, SE-17165, Sweden
- Yonghong Tian
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- School of Electrical and Computer Engineering, Shenzhen Graduate School, Peking University, Shenzhen, 518055, China
- Sten Grillner
- Department of Neuroscience, Karolinska Institute, Stockholm, SE-17165, Sweden
- Kai Du
- Institute for Artificial Intelligence, Peking University, Beijing, 100871, China
- Tiejun Huang
- National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University, Beijing, 100871, China
- Beijing Academy of Artificial Intelligence (BAAI), Beijing, 100084, China
- Institute for Artificial Intelligence, Peking University, Beijing, 100871, China
3
Milstein AD, Tran S, Ng G, Soltesz I. Offline memory replay in recurrent neuronal networks emerges from constraints on online dynamics. J Physiol 2023; 601:3241-3264. [PMID: 35907087] [PMCID: PMC9885000] [DOI: 10.1113/jp283216]
Abstract
During spatial exploration, neural circuits in the hippocampus store memories of sequences of sensory events encountered in the environment. When sensory information is absent during 'offline' resting periods, brief neuronal population bursts can 'replay' sequences of activity that resemble bouts of sensory experience. These sequences can occur in either forward or reverse order, and can even include spatial trajectories that have not been experienced, but are consistent with the topology of the environment. The neural circuit mechanisms underlying this variable and flexible sequence generation are unknown. Here we demonstrate in a recurrent spiking network model of hippocampal area CA3 that experimental constraints on network dynamics such as population sparsity, stimulus selectivity, rhythmicity and spike rate adaptation, as well as associative synaptic connectivity, enable additional emergent properties, including variable offline memory replay. In an online stimulus-driven state, we observed the emergence of neuronal sequences that swept from representations of past to future stimuli on the timescale of the theta rhythm. In an offline state driven only by noise, the network generated both forward and reverse neuronal sequences, and recapitulated the experimental observation that offline memory replay events tend to include salient locations like the site of a reward. These results demonstrate that biological constraints on the dynamics of recurrent neural circuits are sufficient to enable memories of sensory events stored in the strengths of synaptic connections to be flexibly read out during rest and sleep, which is thought to be important for memory consolidation and planning of future behaviour.
Key points:
- A recurrent spiking network model of hippocampal area CA3 was optimized to recapitulate experimentally observed network dynamics during simulated spatial exploration.
- During simulated offline rest, the network exhibited the emergent property of generating flexible forward, reverse and mixed direction memory replay events.
- Network perturbations and analysis of model diversity and degeneracy identified associative synaptic connectivity and key features of network dynamics as important for offline sequence generation.
- Network simulations demonstrate that population over-representation of salient positions like the site of reward results in biased memory replay.
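Two of the constraints named above, recurrent connectivity and spike-rate adaptation, can be caricatured in a few lines. This is an invented integrate-and-fire toy with made-up parameters, not the paper's optimized CA3 model; it only illustrates how adaptation keeps noise-driven recurrent spiking sparse.

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps = 100, 1000                       # neurons, 1 ms time steps (illustrative)
tau_v, tau_w = 20.0, 200.0                 # membrane / adaptation time constants (ms)
v_th, b = 1.0, 0.1                         # spike threshold, adaptation increment
W = rng.normal(0.0, 0.02, (n, n))          # weak recurrent weights (assumed scale)
np.fill_diagonal(W, 0.0)

v = np.zeros(n)                            # membrane potential
w = np.zeros(n)                            # spike-rate adaptation variable
spikes = np.zeros((steps, n), dtype=bool)
for t in range(steps):
    drive = 1.1 + 0.3 * rng.standard_normal(n)       # noisy external input
    rec = W @ spikes[t - 1] if t > 0 else np.zeros(n)
    v += (-v + drive - w) / tau_v + rec
    w += -w / tau_w
    fired = v >= v_th
    spikes[t] = fired
    v[fired] = 0.0                         # reset after a spike
    w[fired] += b                          # adaptation builds with each spike

print(f"population sparsity: {spikes.mean():.3f}")
```

Because each spike increments the slowly decaying adaptation variable, neurons that fire recently become harder to recruit, which is one ingredient behind sequential rather than synchronous population activity.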
Affiliation(s)
- Aaron D. Milstein
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
- Department of Neuroscience and Cell Biology, Robert Wood Johnson Medical School and Center for Advanced Biotechnology and Medicine, Rutgers University, Piscataway, NJ
- Sarah Tran
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
- Grace Ng
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
- Ivan Soltesz
- Department of Neurosurgery, Stanford University School of Medicine, Stanford, CA
4
Fietkiewicz C, McDougal RA, Corrales Marco D, Chiel HJ, Thomas PJ. Tutorial: using NEURON for neuromechanical simulations. Front Comput Neurosci 2023; 17:1143323. [PMID: 37583894] [PMCID: PMC10424731] [DOI: 10.3389/fncom.2023.1143323]
Abstract
The dynamical properties of the brain and the dynamics of the body strongly influence one another. Their interaction generates complex adaptive behavior. While a wide variety of simulation tools exist for neural dynamics or biomechanics separately, there are few options for integrated brain-body modeling. Here, we provide a tutorial to demonstrate how the widely-used NEURON simulation platform can support integrated neuromechanical modeling. As a first step toward incorporating biomechanics into a NEURON simulation, we provide a framework for integrating inputs from a "periphery" and outputs to that periphery. In other words, "body" dynamics are driven in part by "brain" variables, such as voltages or firing rates, and "brain" dynamics are influenced by "body" variables through sensory feedback. To couple the "brain" and "body" components, we use NEURON's pointer construct to share information between "brain" and "body" modules. This approach allows separate specification of brain and body dynamics and code reuse. Though simple in concept, the use of pointers can be challenging due to a complicated syntax and several different programming options. In this paper, we present five different computational models, with increasing levels of complexity, to demonstrate the concepts of code modularity using pointers and the integration of neural and biomechanical modeling within NEURON. The models include: (1) a neuromuscular model of calcium dynamics and muscle force, (2) a neuromechanical, closed-loop model of a half-center oscillator coupled to a rudimentary motor system, (3) a closed-loop model of neural control for respiration, (4) a pedagogical model of a non-smooth "brain/body" system, and (5) a closed-loop model of feeding behavior in the sea hare Aplysia californica that incorporates biologically-motivated non-smooth dynamics. This tutorial illustrates how NEURON can be integrated with a broad range of neuromechanical models. 
Code available at https://github.com/fietkiewicz/PointerBuilder.
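The pointer-based coupling can be caricatured outside NEURON as two modules sharing mutable state: a "brain" variable drives the "body", and sensory feedback closes the loop. The sketch below is plain Python, not NEURON's pointer syntax, and every equation and parameter in it is invented for illustration.

```python
import numpy as np

# A shared state dict plays the role of NEURON's pointer mechanism: the
# "brain" and "body" modules read/write the same variables without either
# needing to know the other's internals.
state = {"rate": 0.0, "position": 0.0}

def brain_step(dt, tau=50.0, gain=2.0):
    # Firing rate relaxes toward a target set by sensory feedback
    # (a stretch-reflex-like sigmoid of body position).
    target = 1.0 / (1.0 + np.exp(gain * state["position"]))
    state["rate"] += dt / tau * (target - state["rate"])

def body_step(dt, tau=100.0):
    # Body position is driven by the brain's firing rate and relaxes to rest.
    state["position"] += dt / tau * (state["rate"] - state["position"])

dt, steps = 1.0, 2000
trace = np.empty(steps)
for t in range(steps):
    brain_step(dt)
    body_step(dt)
    trace[t] = state["position"]

print(f"final position: {trace[-1]:.3f}")
```

The closed loop settles at the fixed point where the rate and position agree with the feedback sigmoid, which is the same kind of self-consistent brain-body equilibrium the tutorial's NEURON models reach via pointers.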
Affiliation(s)
- Chris Fietkiewicz
- Department of Mathematics and Computer Science, Hobart and William Smith Colleges, Geneva, NY, United States
- Robert A. McDougal
- Department of Biostatistics, Yale School of Public Health, New Haven, CT, United States
- Wu Tsai Institute, Yale University, New Haven, CT, United States
- Program in Computational Biology and Bioinformatics, Yale University, New Haven, CT, United States
- Section for Biomedical Informatics, Yale School of Medicine, New Haven, CT, United States
- David Corrales Marco
- Department of Mathematics and Computer Science, Hobart and William Smith Colleges, Geneva, NY, United States
- Hillel J. Chiel
- Department of Biology, Case Western Reserve University, Cleveland, OH, United States
- Department of Neurosciences, Case Western Reserve University, Cleveland, OH, United States
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, United States
- Peter J. Thomas
- Department of Biology, Case Western Reserve University, Cleveland, OH, United States
- Department of Mathematics, Applied Mathematics and Statistics, Case Western Reserve University, Cleveland, OH, United States
- Department of Cognitive Science, Case Western Reserve University, Cleveland, OH, United States
- Department of Electrical, Control, and Systems Engineering, Case Western Reserve University, Cleveland, OH, United States
- Department of Computer and Data Sciences, Case Western Reserve University, Cleveland, OH, United States
5
Zhang Y, Du K, Huang T. Heuristic Tree-Partition-Based Parallel Method for Biophysically Detailed Neuron Simulation. Neural Comput 2023; 35:627-644. [PMID: 36746142] [DOI: 10.1162/neco_a_01565]
Abstract
Biophysically detailed neuron simulation is a powerful tool for exploring the mechanisms behind biological experiments and for bridging the gap between scales in neuroscience research. However, the extremely high computational complexity of detailed neuron simulation restricts the modeling and exploration of detailed network models. The bottleneck is solving the system of linear equations. To accelerate detailed simulation, we propose a heuristic tree-partition-based parallel method (HTP) to parallelize the computation of the Hines algorithm, the kernel for solving the linear equations, and leverage the strong parallel capability of the graphics processing unit (GPU) for further speedup. We formulate the search for an efficient parallelization as a tree-partition problem and present a heuristic partition algorithm that yields an effective partition for parallelizing the equation-solving process in detailed simulation. With further optimization on GPU, our HTP method achieves a 2.2- to 8.5-fold speedup over the state-of-the-art GPU method and a 36- to 660-fold speedup over the typical serial Hines algorithm.
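To make the tree-partition idea concrete, here is a deliberately simplified stand-in, not the paper's HTP algorithm: cut the morphology tree into unbranched paths at branch points, then greedily balance those paths across workers. The tree and worker count are made up.

```python
from collections import defaultdict

def partition_branches(parent, k):
    """Split a compartment tree into branches (maximal unbranched paths)
    and greedily assign them to k workers, balancing compartment counts.
    A toy heuristic: branches hanging off different branch points can be
    eliminated independently before the root of the tree is solved.
    """
    n = len(parent)
    children = defaultdict(list)
    for i in range(1, n):
        children[parent[i]].append(i)
    # Cut the tree at branch points into maximal unbranched paths.
    branches, stack = [], [0]
    while stack:
        node = stack.pop()
        branch = [node]
        while len(children[branch[-1]]) == 1:
            branch.append(children[branch[-1]][0])
        branches.append(branch)
        stack.extend(children[branch[-1]])
    # Greedy longest-first assignment to the least-loaded worker.
    workers = [[] for _ in range(k)]
    loads = [0] * k
    for br in sorted(branches, key=len, reverse=True):
        w = loads.index(min(loads))
        workers[w].extend(br)
        loads[w] += len(br)
    return workers

parent = [-1, 0, 1, 2, 1, 4, 0, 6, 6]    # small tree with two branch points
workers = partition_branches(parent, 2)
assert sorted(sum(workers, [])) == list(range(9))  # disjoint, complete cover
```

The real HTP method optimizes a much finer-grained cost model on GPU threads, but the underlying question is the same: how to cut the tree so that the parallel pieces are balanced.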
Affiliation(s)
- Yichen Zhang
- School of Computer Science, Peking University, Beijing 100871, China
- Kai Du
- School of Computer Science and Institute for Artificial Intelligence, Peking University, Beijing 100871, China
- Tiejun Huang
- School of Computer Science and Institute for Artificial Intelligence, Peking University, Beijing 100871, China
6
Cobb EAW, Petroccione MA, Scimemi A. NRN-EZ: an application to streamline biophysical modeling of synaptic integration using NEURON. Sci Rep 2023; 13:464. [PMID: 36627356] [PMCID: PMC9832141] [DOI: 10.1038/s41598-022-27302-8]
Abstract
One of the fundamental goals in neuroscience is to determine how the brain processes information and ultimately controls the execution of complex behaviors. Over the past four decades, there has been steady growth in our knowledge of the morphological and functional diversity of neurons, the building blocks of the brain. These cells differ not only in their anatomy and ion channel distribution, but also in the type, strength, location, and temporal pattern of activity of the many synaptic inputs they receive. Compartmental modeling programs like NEURON have become widely used in the neuroscience community to address a broad range of research questions, including how neurons integrate synaptic inputs and propagate information through complex neural networks. One of the main strengths of NEURON is its ability to incorporate user-defined information about the realistic morphology and biophysical properties of different cell types. Although the program's graphical user interface can be used to run initial exploratory simulations, introducing a stochastic representation of synaptic weights, locations, and activation times typically requires users to write their own code, a task that can be overwhelming for some beginner users. Here we describe NRN-EZ, an interactive application that allows users to specify complex patterns of synaptic input activity that can be integrated as part of NEURON simulations. Through its graphical user interface, NRN-EZ aims to ease the learning curve for running computational models in NEURON for users who do not necessarily have a computer science background.
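The kind of stochastic input specification NRN-EZ automates (random locations, log-normal weights, Poisson activation times) can be sketched in a few lines of NumPy. This is not NRN-EZ's code or file format; section count, rates, and distribution parameters are invented, and only the location convention (a section plus a normalized 0-1 position, mirroring NEURON's `section(x)`) follows NEURON.

```python
import numpy as np

rng = np.random.default_rng(1)
n_syn = 50
rate_hz, t_stop_ms = 5.0, 1000.0

# Random dendritic locations: a section index plus a normalized position,
# mirroring NEURON's section(x) convention.
sections = rng.integers(0, 8, n_syn)          # 8 hypothetical dendritic sections
positions = rng.uniform(0.0, 1.0, n_syn)

# Log-normal weights, a common empirical choice for synaptic strengths.
weights = rng.lognormal(mean=-3.0, sigma=0.5, size=n_syn)

# Poisson activation times per synapse: exponential inter-event intervals.
def poisson_times(rate_hz, t_stop_ms, rng):
    times, t = [], 0.0
    while True:
        t += rng.exponential(1000.0 / rate_hz)   # interval in ms
        if t >= t_stop_ms:
            return np.array(times)
        times.append(t)

events = [poisson_times(rate_hz, t_stop_ms, rng) for _ in range(n_syn)]
total = sum(len(e) for e in events)
print(f"{total} activation events across {n_syn} synapses")
```

In a NEURON workflow, each (section, position, weight, event train) tuple would then be realized as a synaptic point process driven by a spike source.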
Affiliation(s)
- Evan A. W. Cobb
- Department of Biology, SUNY Albany, 1400 Washington Avenue, Albany, NY 12222-0100, USA; Department of Computer Science, SUNY Albany, 1400 Washington Avenue, Albany, NY 12222-0100, USA
- Maurice A. Petroccione
- Department of Biology, SUNY Albany, 1400 Washington Avenue, Albany, NY 12222-0100, USA
- Annalisa Scimemi
- Department of Biology, SUNY Albany, 1400 Washington Avenue, Albany, NY 12222-0100, USA
7
Oláh VJ, Pedersen NP, Rowan MJM. Ultrafast simulation of large-scale neocortical microcircuitry with biophysically realistic neurons. eLife 2022; 11:e79535. [PMID: 36341568] [PMCID: PMC9640191] [DOI: 10.7554/elife.79535]
Abstract
Understanding the activity of the mammalian brain requires an integrative knowledge of circuits at distinct scales, ranging from ion channel gating to circuit connectomics. Computational models are regularly employed to understand how multiple parameters contribute synergistically to circuit behavior. However, traditional models of anatomically and biophysically realistic neurons are computationally demanding, especially when scaled to model local circuits. To overcome this limitation, we trained several artificial neural network (ANN) architectures to model the activity of realistic multicompartmental cortical neurons. We identified an ANN architecture that accurately predicted subthreshold activity and action potential firing. The ANN could correctly generalize to previously unobserved synaptic input, including in models containing nonlinear dendritic properties. When scaled, processing times were orders of magnitude faster compared with traditional approaches, allowing for rapid parameter-space mapping in a circuit model of Rett syndrome. Thus, we present a novel ANN approach allowing for rapid, detailed network experiments using inexpensive and commonly available computational resources.
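The surrogate-modeling idea, learning a cheap input-output map from traces generated by an expensive mechanistic model, can be shown at toy scale. Here the "neuron" is a one-variable leaky integrator and the "ANN" is a one-step linear least-squares predictor; the paper trained deep networks on full multicompartment dynamics, which this does not attempt.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, tau = 0.1, 10.0

# Ground-truth "neuron": leaky integrator v' = (-v + I) / tau.
steps = 5000
I = rng.standard_normal(steps)
v = np.zeros(steps + 1)
for t in range(steps):
    v[t + 1] = v[t] + dt / tau * (-v[t] + I[t])

# Surrogate: predict v[t+1] from (v[t], I[t]) by least squares.
X = np.column_stack([v[:-1], I])
coef, *_ = np.linalg.lstsq(X, v[1:], rcond=None)
pred = X @ coef

# The dynamics are linear, so the fit recovers them almost exactly:
# coef ~ [1 - dt/tau, dt/tau] = [0.99, 0.01]
print(f"recovered coefficients: {coef}")
```

Real neurons are nonlinear (spikes, dendritic nonlinearities), which is exactly why the paper needed a nonlinear ANN architecture rather than a linear map; the regression-on-simulated-traces recipe, however, is the same.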
Affiliation(s)
- Viktor J Oláh
- Department of Cell Biology, Emory University School of Medicine, Atlanta, United States
- Nigel P Pedersen
- Department of Neurology, Emory University School of Medicine, Atlanta, United States
- Matthew JM Rowan
- Department of Cell Biology, Emory University School of Medicine, Atlanta, United States
8
Borges FS, Moreira JVS, Takarabe LM, Lytton WW, Dura-Bernal S. Large-scale biophysically detailed model of somatosensory thalamocortical circuits in NetPyNE. Front Neuroinform 2022; 16:884245. [PMID: 36213546] [PMCID: PMC9536213] [DOI: 10.3389/fninf.2022.884245]
Abstract
The primary somatosensory cortex (S1) of mammals is critically important in the perception of touch and related sensorimotor behaviors. In 2015, the Blue Brain Project (BBP) developed a groundbreaking rat S1 microcircuit simulation comprising over 31,000 neurons of 207 morpho-electrical types and 37 million synapses, incorporating anatomical and physiological information from a wide range of experimental studies. We have implemented this highly detailed and complex S1 model in NetPyNE, using the data available in the Neocortical Microcircuit Collaboration Portal. NetPyNE provides a high-level Python interface to NEURON and allows complicated multiscale models to be defined using an intuitive declarative standardized language. It also facilitates running parallel simulations, automates the optimization and exploration of parameters using supercomputers, and provides a wide range of built-in analysis functions. This makes the S1 model more accessible and simpler to scale, modify, and extend, whether to explore research questions or to interconnect it with other existing models. Despite some implementation differences, the NetPyNE model preserved the original cell morphologies, electrophysiological responses, and spatial distribution for all 207 cell types, as well as the connectivity properties of all 1,941 pathways, including synaptic dynamics and short-term plasticity (STP). The NetPyNE S1 simulations produced reasonable physiological firing rates and activity patterns across all populations. When STP was included, the network generated a 1 Hz oscillation comparable to that of the original model in its in vitro-like state. By then reducing the extracellular calcium concentration, the model reproduced the original S1 in vivo-like states with asynchronous activity. These results validate the original study using a new modeling tool. Simulated local field potentials (LFPs) exhibited realistic oscillatory patterns and features, including distance- and frequency-dependent attenuation.
The model was extended by adding thalamic circuits, including six distinct thalamic populations with intrathalamic, thalamocortical (TC), and corticothalamic connectivity derived from experimental data. The thalamic model reproduced known single-cell and circuit-level dynamics, including burst and tonic firing modes and oscillatory patterns, providing more realistic input to cortex and enabling the study of TC interactions. Overall, our work provides a widely accessible, data-driven, and biophysically detailed model of the somatosensory TC circuits that can be employed as a community tool for researchers to study neural dynamics, function, and disease.
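NetPyNE's declarative language describes populations, synaptic mechanisms, and connectivity rules as nested parameter dictionaries. The fragment below shows the shape of such a specification using plain dicts for illustration; the real API wraps these in a `specs.NetParams` object, and every population name, cell count, and parameter value here is made up rather than taken from the S1 model.

```python
# Shape of a NetPyNE-style declarative specification, as plain dicts.
# All names and numbers are invented for illustration.
netParams = {
    "popParams": {
        "L4_exc": {"cellType": "PYR", "numCells": 100},
        "L4_inh": {"cellType": "PV", "numCells": 25},
    },
    "synMechParams": {
        "AMPA": {"mod": "Exp2Syn", "tau1": 0.2, "tau2": 1.7, "e": 0},
        "GABA": {"mod": "Exp2Syn", "tau1": 0.5, "tau2": 8.0, "e": -75},
    },
    "connParams": {
        "E->I": {
            "preConds": {"pop": "L4_exc"},
            "postConds": {"pop": "L4_inh"},
            "probability": 0.1,
            "weight": 0.005,
            "delay": "0.5 + dist_3D/propVelocity",  # string-based rule
            "synMech": "AMPA",
        },
    },
}

n_total = sum(p["numCells"] for p in netParams["popParams"].values())
print(f"{len(netParams['connParams'])} projection rule(s), {n_total} cells")
```

Because the whole network is data, not procedural code, the same specification can be scaled, modified, or handed to NEURON/CoreNEURON backends without rewriting model logic, which is what made porting the BBP S1 model tractable.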
Affiliation(s)
- Fernando S. Borges
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, Brooklyn, NY, United States
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Paulo, Brazil
- Joao V. S. Moreira
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, Brooklyn, NY, United States
- Lavinia M. Takarabe
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Paulo, Brazil
- William W. Lytton
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, Brooklyn, NY, United States
- Department of Neurology, Kings County Hospital Center, Brooklyn, NY, United States
- Aligning Science Across Parkinson’s (ASAP) Collaborative Research Network, Chevy Chase, MD, United States
- Salvador Dura-Bernal
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, Brooklyn, NY, United States
- Nathan Kline Institute for Psychiatric Research, Orangeburg, NY, United States
9
Awile O, Kumbhar P, Cornu N, Dura-Bernal S, King JG, Lupton O, Magkanaris I, McDougal RA, Newton AJH, Pereira F, Săvulescu A, Carnevale NT, Lytton WW, Hines ML, Schürmann F. Modernizing the NEURON Simulator for Sustainability, Portability, and Performance. Front Neuroinform 2022; 16:884046. [PMID: 35832575] [PMCID: PMC9272742] [DOI: 10.3389/fninf.2022.884046]
Abstract
The need for reproducible, credible, multiscale biological modeling has led to the development of standardized simulation platforms, such as the widely used NEURON environment for computational neuroscience. Developing and maintaining NEURON over several decades has required attention to the competing needs of backwards compatibility, evolving computer architectures, the addition of new scales and physical processes, accessibility to new users, and efficiency and flexibility for specialists. To meet these challenges, we have now substantially modernized NEURON, providing continuous integration, an improved build system and release workflow, and better documentation. With the help of a new source-to-source compiler for the NMODL domain-specific language, we have enhanced NEURON's ability to run efficiently, via the CoreNEURON simulation engine, on a variety of hardware platforms, including GPUs. Through an optimized in-memory transfer mechanism, this performance-optimized backend is made easily accessible to users, providing training and model-development paths from laptop to workstation to supercomputer and cloud platforms. Similarly, we have accelerated NEURON's reaction-diffusion simulation performance through just-in-time compilation. We show that these efforts have led to a growing developer base, a simpler and more robust software distribution, a wider range of supported computer architectures, better integration of NEURON with other scientific workflows, and substantially improved performance for the simulation of biophysical and biochemical models.
Affiliation(s)
- Omar Awile
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Pramod Kumbhar
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Nicolas Cornu
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Salvador Dura-Bernal
- Department of Physiology and Pharmacology, SUNY Downstate, Brooklyn, NY, United States
- Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute for Psychiatric Research, Orangeburg, NY, United States
- James Gonzalo King
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Olli Lupton
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Ioannis Magkanaris
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Robert A. McDougal
- Department of Biostatistics, Yale School of Public Health, New Haven, CT, United States
- Program in Computational Biology and Bioinformatics, Yale University, New Haven, CT, United States
- Yale Center for Medical Informatics, Yale University, New Haven, CT, United States
- Adam J. H. Newton
- Department of Physiology and Pharmacology, SUNY Downstate, Brooklyn, NY, United States
- Department of Biostatistics, Yale School of Public Health, New Haven, CT, United States
- Fernando Pereira
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- Alexandru Săvulescu
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
- William W. Lytton
- Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute for Psychiatric Research, Orangeburg, NY, United States
- Michael L. Hines
- Department of Neuroscience, Yale University, New Haven, CT, United States
- Felix Schürmann
- Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Geneva, Switzerland
10
Yu Y, Han F, Wang Q. Exploring phase–amplitude coupling from primary motor cortex-basal ganglia-thalamus network model. Neural Netw 2022; 153:130-141. [DOI: 10.1016/j.neunet.2022.05.027]
|
11
|
Albers J, Pronold J, Kurth AC, Vennemo SB, Haghighi Mood K, Patronis A, Terhorst D, Jordan J, Kunkel S, Tetzlaff T, Diesmann M, Senk J. A Modular Workflow for Performance Benchmarking of Neuronal Network Simulations. Front Neuroinform 2022; 16:837549. PMID: 35645755; PMCID: PMC9131021; DOI: 10.3389/fninf.2022.837549.
Abstract
Modern computational neuroscience strives to develop complex network models to explain dynamics and function of brains in health and disease. This process goes hand in hand with advancements in the theory of neuronal networks and increasing availability of detailed anatomical data on brain connectivity. Large-scale models that study interactions between multiple brain areas with intricate connectivity and investigate phenomena on long time scales such as system-level learning require progress in simulation speed. The corresponding development of state-of-the-art simulation engines relies on information provided by benchmark simulations which assess the time-to-solution for scientifically relevant, complementary network models using various combinations of hardware and software revisions. However, maintaining comparability of benchmark results is difficult due to a lack of standardized specifications for measuring the scaling performance of simulators on high-performance computing (HPC) systems. Motivated by the challenging complexity of benchmarking, we define a generic workflow that decomposes the endeavor into unique segments consisting of separate modules. As a reference implementation for the conceptual workflow, we develop beNNch: an open-source software framework for the configuration, execution, and analysis of benchmarks for neuronal network simulations. The framework records benchmarking data and metadata in a unified way to foster reproducibility. For illustration, we measure the performance of various versions of the NEST simulator across network models with different levels of complexity on a contemporary HPC system, demonstrating how performance bottlenecks can be identified, ultimately guiding the development toward more efficient simulation technology.
Affiliation(s)
- Jasper Albers
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- *Correspondence: Jasper Albers
- Jari Pronold
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Anno Christopher Kurth
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Stine Brekke Vennemo
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Alexander Patronis
- Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Dennis Terhorst
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Jakob Jordan
- Department of Physiology, University of Bern, Bern, Switzerland
- Susanne Kunkel
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, School of Medicine, RWTH Aachen University, Aachen, Germany
- Johanna Senk
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
|
12
|
Romaro C, Najman FA, Lytton WW, Roque AC, Dura-Bernal S. NetPyNE Implementation and Scaling of the Potjans-Diesmann Cortical Microcircuit Model. Neural Comput 2021; 33:1993-2032. PMID: 34411272; PMCID: PMC8382011; DOI: 10.1162/neco_a_01400.
Abstract
The Potjans-Diesmann cortical microcircuit model is a widely used model originally implemented in NEST. Here, we reimplemented the model using NetPyNE, a high-level Python interface to the NEURON simulator, and reproduced the findings of the original publication. We also implemented a method for scaling the network size that preserves first- and second-order statistics, building on existing work on network theory. Our new implementation enabled the use of more detailed neuron models with multicompartmental morphologies and multiple biophysically realistic ion channels. This opens the model to new research, including the study of dendritic processing, the influence of individual channel parameters, the relation to local field potentials, and other multiscale interactions. The scaling method we used provides flexibility to increase or decrease the network size as needed when running these CPU-intensive detailed simulations. Finally, NetPyNE facilitates modifying or extending the model using its declarative language; optimizing model parameters; running efficient, large-scale parallelized simulations; and analyzing the model through built-in methods, including local field potential calculation and information flow measures.
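The scaling approach draws on known network-theory results: if in-degrees are reduced by a factor k, rescaling synaptic weights by 1/√k preserves the variance of the summed synaptic input, and a constant drive can offset the resulting change in its mean. A minimal sketch of this compensation (illustrative Python; the names and the mean-field formula follow the standard argument, not the paper's code):

```python
import math

def scale_compensation(K, J, rate, tau_m, k):
    """Compensate a reduction of in-degree K -> k*K so that the mean and
    variance of the total synaptic input are preserved (mean-field sketch).

    Weights are scaled by 1/sqrt(k), which keeps the input variance
    (proportional to K * J**2) unchanged; the resulting change in the mean
    input (proportional to K * J) is offset by a constant DC drive.
    K: original in-degree, J: synaptic weight, rate: presynaptic rate (Hz),
    tau_m: membrane time constant (s), k: scaling factor in (0, 1].
    """
    K_scaled = k * K
    J_scaled = J / math.sqrt(k)
    mu_full = K * J * rate * tau_m            # mean input, full network
    mu_scaled = K_scaled * J_scaled * rate * tau_m  # = sqrt(k) * mu_full
    dc_drive = mu_full - mu_scaled            # = (1 - sqrt(k)) * mu_full
    return K_scaled, J_scaled, dc_drive
```

For example, downscaling in-degrees to a quarter (k = 0.25) doubles each weight and adds a DC drive equal to half the original mean input.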
Affiliation(s)
- Cecilia Romaro
- Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP 14049, Brazil
- Fernando Araujo Najman
- Institute of Mathematics and Statistics, University of São Paulo, São Paulo, SP 05508, Brazil
- William W Lytton
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, New York, NY 11203, U.S.A.
- Antonio C Roque
- Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP 14049, Brazil
- Salvador Dura-Bernal
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, New York, NY 11203, U.S.A., and Nathan Kline Institute for Psychiatric Research, New York, NY 10962, U.S.A.
|
13
|
Stapmanns J, Hahne J, Helias M, Bolten M, Diesmann M, Dahmen D. Event-Based Update of Synapses in Voltage-Based Learning Rules. Front Neuroinform 2021; 15:609147. PMID: 34177505; PMCID: PMC8222618; DOI: 10.3389/fninf.2021.609147.
Abstract
Due to the point-like nature of neuronal spiking, efficient neural network simulators often employ event-based simulation schemes for synapses. Yet many types of synaptic plasticity rely on the membrane potential of the postsynaptic cell as a third factor in addition to pre- and postsynaptic spike times. In some learning rules, membrane potentials influence synaptic weight changes not only at the time points of spike events but also in a continuous manner. In these cases, synapses require information on the full time course of membrane potentials to update their strength, which a priori suggests a continuous update in a time-driven manner. The latter hinders scaling of simulations to realistic cortical network sizes and relevant time scales for learning. Here, we derive two efficient algorithms for archiving postsynaptic membrane potentials, both compatible with modern simulation engines based on event-based synapse updates. We theoretically contrast the two algorithms with a time-driven synapse update scheme to analyze advantages in terms of memory and computations. We further present a reference implementation in the spiking neural network simulator NEST for two prototypical voltage-based plasticity rules: the Clopath rule and the Urbanczik-Senn rule. For both rules, the two event-based algorithms significantly outperform the time-driven scheme. Depending on the amount of data to be stored for plasticity, which differs heavily between the rules, a strong performance increase can be achieved by compressing or sampling the membrane-potential information. Our results on the computational efficiency of archiving provide guidelines for the design of learning rules, making them practically usable in large-scale networks.
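A toy sketch of the event-based idea described above: the postsynaptic neuron archives its membrane potential, and a synapse consumes the archive only when a presynaptic spike arrives, integrating the voltage-dependent weight change over the whole interval since its last event (illustrative Python; class names, the plasticity function, and all parameters are invented for exposition, not taken from NEST or the paper):

```python
from collections import deque

class ArchivingNeuron:
    """Stores a history of (time, V) samples so that a plastic synapse can
    be updated lazily, only when a presynaptic spike event arrives."""
    def __init__(self):
        self.history = deque()  # (t, V) pairs, oldest first

    def record(self, t, v):
        self.history.append((t, v))

    def read_and_trim(self, t_from, t_to):
        """Return samples in (t_from, t_to] and drop everything at or before
        t_from, which is no longer needed (single-synapse sketch)."""
        while self.history and self.history[0][0] <= t_from:
            self.history.popleft()
        return [(t, v) for (t, v) in self.history if t <= t_to]

class EventDrivenSynapse:
    """Voltage-based plasticity evaluated per presynaptic event: the weight
    change integrates a function of the postsynaptic voltage over the whole
    interval since the last event, not just at the event time."""
    def __init__(self, post, w=1.0, lr=0.01, theta=-55.0, dt=0.1):
        self.post, self.w, self.lr, self.theta, self.dt = post, w, lr, theta, dt
        self.last_event = 0.0

    def on_pre_spike(self, t):
        # toy rule: potentiate in proportion to depolarization above theta
        for (_, v) in self.post.read_and_trim(self.last_event, t):
            self.w += self.lr * max(0.0, v - self.theta) * self.dt
        self.last_event = t
        return self.w
```

Between presynaptic events no synapse code runs at all; the cost of the continuous voltage dependence is shifted into the neuron's archive, which is trimmed as soon as its entries have been consumed.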
Affiliation(s)
- Jonas Stapmanns
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Institute for Theoretical Solid State Physics, RWTH Aachen University, Aachen, Germany
- Jan Hahne
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Institute for Theoretical Solid State Physics, RWTH Aachen University, Aachen, Germany
- Matthias Bolten
- School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA Institute Brain Structure Function Relationship (INM-10), Jülich Research Centre, Jülich, Germany
|
14
|
Crone JC, Vindiola MM, Yu AB, Boothe DL, Beeman D, Oie KS, Franaszczuk PJ. Enabling Large-Scale Simulations With the GENESIS Neuronal Simulator. Front Neuroinform 2019; 13:69. PMID: 31803040; PMCID: PMC6873326; DOI: 10.3389/fninf.2019.00069.
Abstract
In this paper, we evaluate the computational performance of the GEneral NEural SImulation System (GENESIS) for large scale simulations of neural networks. While many benchmark studies have been performed for large scale simulations with leaky integrate-and-fire neurons or neuronal models with only a few compartments, this work focuses on higher fidelity neuronal models represented by 50–74 compartments per neuron. After making some modifications to the source code for GENESIS and its parallel implementation, PGENESIS, particularly to improve memory usage, we find that PGENESIS is able to efficiently scale on supercomputing resources to network sizes as large as 9 × 106 neurons with 18 × 109 synapses and 2.2 × 106 neurons with 45 × 109 synapses. The modifications to GENESIS that enabled these large scale simulations have been incorporated into the May 2019 Official Release of PGENESIS 2.4 available for download from the GENESIS web site (genesis-sim.org).
Affiliation(s)
- Joshua C Crone
- Computational and Information Sciences Directorate, Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- Manuel M Vindiola
- Computational and Information Sciences Directorate, Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- Alfred B Yu
- Human Research and Engineering Directorate, Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- David L Boothe
- Human Research and Engineering Directorate, Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- David Beeman
- Department of Electrical, Computer, and Energy Engineering, University of Colorado, Boulder, CO, United States
- Kelvin S Oie
- Human Research and Engineering Directorate, Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- Piotr J Franaszczuk
- Human Research and Engineering Directorate, Army Research Laboratory, Aberdeen Proving Ground, MD, United States
- Department of Neurology, Johns Hopkins University School of Medicine, Baltimore, MD, United States
|
15
|
Dura-Bernal S, Suter BA, Gleeson P, Cantarelli M, Quintana A, Rodriguez F, Kedziora DJ, Chadderdon GL, Kerr CC, Neymotin SA, McDougal RA, Hines M, Shepherd GMG, Lytton WW. NetPyNE, a tool for data-driven multiscale modeling of brain circuits. eLife 2019; 8:e44494. PMID: 31025934; PMCID: PMC6534378; DOI: 10.7554/elife.44494.
Abstract
Biophysical modeling of neuronal networks helps to integrate and interpret rapidly growing and disparate experimental datasets at multiple scales. The NetPyNE tool (www.netpyne.org) provides both programmatic and graphical interfaces to develop data-driven multiscale network models in NEURON. NetPyNE clearly separates model parameters from implementation code. Users provide specifications at a high level via a standardized declarative language, for example connectivity rules, to create millions of cell-to-cell connections. NetPyNE then enables users to generate the NEURON network, run efficiently parallelized simulations, optimize and explore network parameters through automated batch runs, and use built-in functions for visualization and analysis - connectivity matrices, voltage traces, spike raster plots, local field potentials, and information theoretic measures. NetPyNE also facilitates model sharing by exporting and importing standardized formats (NeuroML and SONATA). NetPyNE is already being used to teach computational neuroscience students and by modelers to investigate brain regions and phenomena.
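To illustrate the declarative style described above, the toy function below expands a single probability-based connectivity rule into explicit cell-to-cell connections (plain Python; the rule keys loosely mimic NetPyNE's connParams convention, but this is an invented sketch, not NetPyNE's actual implementation):

```python
import random

def expand_conn_rule(pops, rule, seed=0):
    """Expand one declarative connectivity rule into explicit connections.
    pops: {'popName': [cell ids]}; rule: dict with 'preConds', 'postConds',
    'probability', and 'weight' keys (naming loosely mimics NetPyNE's
    connParams; illustrative toy, not the real implementation)."""
    rng = random.Random(seed)  # fixed seed -> reproducible network instance
    pre_cells = pops[rule['preConds']['pop']]
    post_cells = pops[rule['postConds']['pop']]
    conns = []
    for pre in pre_cells:
        for post in post_cells:
            if rng.random() < rule['probability']:
                conns.append({'preGid': pre, 'postGid': post,
                              'weight': rule['weight']})
    return conns

# one high-level rule replaces thousands of hand-written connections
pops = {'E': list(range(80)), 'I': list(range(80, 100))}
rule = {'preConds': {'pop': 'E'}, 'postConds': {'pop': 'I'},
        'probability': 0.1, 'weight': 0.5}
conns = expand_conn_rule(pops, rule)
```

The point of the declarative separation is that the rule, not the expanded connection list, is the model specification; the same rule scales unchanged from hundreds to millions of connections.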
Affiliation(s)
- Salvador Dura-Bernal
- Department of Physiology & Pharmacology, State University of New York Downstate Medical Center, Brooklyn, United States
- Benjamin A Suter
- Department of Physiology, Northwestern University, Chicago, United States
- Padraig Gleeson
- Department of Neuroscience, Physiology and Pharmacology, University College London, London, United Kingdom
- Facundo Rodriguez
- Department of Physiology & Pharmacology, State University of New York Downstate Medical Center, Brooklyn, United States
- MetaCell LLC, Boston, United States
- David J Kedziora
- Complex Systems Group, School of Physics, University of Sydney, Sydney, Australia
- George L Chadderdon
- Department of Physiology & Pharmacology, State University of New York Downstate Medical Center, Brooklyn, United States
- Cliff C Kerr
- Complex Systems Group, School of Physics, University of Sydney, Sydney, Australia
- Samuel A Neymotin
- Department of Physiology & Pharmacology, State University of New York Downstate Medical Center, Brooklyn, United States
- Nathan Kline Institute for Psychiatric Research, Orangeburg, United States
- Robert A McDougal
- Department of Neuroscience and School of Medicine, Yale University, New Haven, United States
- Center for Medical Informatics, Yale University, New Haven, United States
- Michael Hines
- Department of Neuroscience and School of Medicine, Yale University, New Haven, United States
- William W Lytton
- Department of Physiology & Pharmacology, State University of New York Downstate Medical Center, Brooklyn, United States
- Department of Neurology, Kings County Hospital, Brooklyn, United States
|
16
|
Fernandez-Musoles C, Coca D, Richmond P. Communication Sparsity in Distributed Spiking Neural Network Simulations to Improve Scalability. Front Neuroinform 2019; 13:19. PMID: 31001102; PMCID: PMC6454199; DOI: 10.3389/fninf.2019.00019.
Abstract
In the last decade there has been a surge in the number of big science projects interested in achieving a comprehensive understanding of the functions of the brain, using Spiking Neuronal Network (SNN) simulations to aid discovery and experimentation. Such an approach increases the computational demands on SNN simulators: if natural scale brain-size simulations are to be realized, it is necessary to use parallel and distributed models of computing. Communication is recognized as the dominant part of distributed SNN simulations. As the number of computational nodes increases, the proportion of time the simulation spends in useful computing (computational efficiency) is reduced and therefore applies a limit to scalability. This work targets the three phases of communication to improve overall computational efficiency in distributed simulations: implicit synchronization, process handshake and data exchange. We introduce a connectivity-aware allocation of neurons to compute nodes by modeling the SNN as a hypergraph. Partitioning the hypergraph to reduce interprocess communication increases the sparsity of the communication graph. We propose dynamic sparse exchange as an improvement over simple point-to-point exchange on sparse communications. Results show a combined gain when using hypergraph-based allocation and dynamic sparse communication, increasing computational efficiency by up to 40.8 percentage points and reducing simulation time by up to 73%. The findings are applicable to other distributed complex system simulations in which communication is modeled as a graph network.
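The notion of a sparse communication graph can be made concrete with a small sketch: given an allocation of neurons to compute nodes and the connection list, count which directed node pairs must actually exchange spikes (illustrative Python; the metric and names are invented to mirror the abstract's idea, and hypergraph partitioning would be the step that chooses the allocation to maximize this sparsity):

```python
def comm_pairs(allocation, connections):
    """Given allocation: {neuron: node} and connections: [(pre, post), ...],
    return the set of directed node pairs that must exchange spikes."""
    pairs = set()
    for pre, post in connections:
        a, b = allocation[pre], allocation[post]
        if a != b:                 # intra-node connections need no messages
            pairs.add((a, b))
    return pairs

def comm_sparsity(allocation, connections):
    """Fraction of possible directed node pairs that carry no traffic;
    connectivity-aware allocation aims to drive this toward 1."""
    nodes = set(allocation.values())
    possible = len(nodes) * (len(nodes) - 1)
    return 1.0 - len(comm_pairs(allocation, connections)) / possible
```

With such a metric in hand, point-to-point exchange can be restricted to the pairs returned by `comm_pairs` instead of an all-to-all collective.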
Affiliation(s)
- Daniel Coca
- Automatic Control and Systems Engineering, University of Sheffield, Sheffield, United Kingdom
- Paul Richmond
- Computer Science, University of Sheffield, Sheffield, United Kingdom
|
17
|
Continuing progress of spike sorting in the era of big data. Curr Opin Neurobiol 2019; 55:90-96. PMID: 30856552; DOI: 10.1016/j.conb.2019.02.007.
Abstract
Engineering efforts are currently attempting to build devices capable of collecting neural activity from one million neurons in the brain. Part of this effort focuses on developing dense multiple-electrode arrays, which require post-processing via 'spike sorting' to extract neural spike trains from the raw signal. Gathering information at this scale will facilitate fascinating science, but these dreams are only realizable if the spike sorting procedure and data pipeline are computationally scalable, at or superior to hand processing, and scientifically reproducible. These challenges are all being amplified as the data scale continues to increase. In this review, recent efforts to attack these challenges are discussed, which have primarily focused on increasing accuracy and reliability while being computationally scalable. These goals are addressed by adding additional stages to the data processing pipeline and using divide-and-conquer algorithmic approaches. These recent developments should prove useful to most research groups regardless of data scale, not just for cutting-edge devices.
|
18
|
Chatzikonstantis G, Sidiropoulos H, Strydis C, Negrello M, Smaragdos G, De Zeeuw C, Soudris D. Multinode implementation of an extended Hodgkin–Huxley simulator. Neurocomputing 2019. DOI: 10.1016/j.neucom.2018.10.062.
|
19
|
Cantarelli M, Marin B, Quintana A, Earnshaw M, Court R, Gleeson P, Dura-Bernal S, Silver RA, Idili G. Geppetto: a reusable modular open platform for exploring neuroscience data and models. Philos Trans R Soc Lond B Biol Sci 2018; 373:rstb.2017.0380. PMID: 30201843; PMCID: PMC6158222; DOI: 10.1098/rstb.2017.0380.
Abstract
Geppetto is an open-source platform that provides generic middleware infrastructure for building both online and desktop tools for visualizing neuroscience models and data and managing simulations. Geppetto underpins a number of neuroscience applications, including Open Source Brain (OSB), Virtual Fly Brain (VFB), NEURON-UI and NetPyNE-UI. OSB is used by researchers to create and visualize computational neuroscience models described in NeuroML and simulate them through the browser. VFB is the reference hub for Drosophila melanogaster neural anatomy and imaging data including neuropil, segmented neurons, microscopy stacks and gene expression pattern data. Geppetto is also being used to build a new user interface for NEURON, a widely used neuronal simulation environment, and for NetPyNE, a Python package for network modelling using NEURON. Geppetto defines domain agnostic abstractions used by all these applications to represent their models and data and offers a set of modules and components to integrate, visualize and control simulations in a highly accessible way. The platform comprises a backend which can connect to external data sources, model repositories and simulators together with a highly customizable frontend. This article is part of a discussion meeting issue ‘Connectome to behaviour: modelling C. elegans at cellular resolution’.
Affiliation(s)
- Matteo Cantarelli
- OpenWorm Foundation, USA
- MetaCell Limited, UK
- Department of Neuroscience, Physiology and Pharmacology, University College London, UK
- Boris Marin
- Department of Neuroscience, Physiology and Pharmacology, University College London, UK
- Departamento de Física, Faculdade de Filosofia, Ciências e Letras de Ribeirão Preto, Universidade de São Paulo, Brazil
- Adrian Quintana
- MetaCell Limited, UK
- Department of Neuroscience, Physiology and Pharmacology, University College London, UK
- EyeSeeTea Limited, UK
- Matt Earnshaw
- Department of Neuroscience, Physiology and Pharmacology, University College London, UK
- Robert Court
- Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, UK
- Padraig Gleeson
- Department of Neuroscience, Physiology and Pharmacology, University College London, UK
- R Angus Silver
- Department of Neuroscience, Physiology and Pharmacology, University College London, UK
|
20
|
Mulugeta L, Drach A, Erdemir A, Hunt CA, Horner M, Ku JP, Myers JG, Vadigepalli R, Lytton WW. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience. Front Neuroinform 2018; 12:18. PMID: 29713272; PMCID: PMC5911506; DOI: 10.3389/fninf.2018.00018.
Abstract
Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations.
Affiliation(s)
- Andrew Drach
- The Institute for Computational Engineering and Sciences, The University of Texas at Austin, Austin, TX, United States
- Ahmet Erdemir
- Department of Biomedical Engineering and Computational Biomodeling (CoBi) Core, Lerner Research Institute, Cleveland Clinic, Cleveland, OH, United States
- C A Hunt
- Department of Bioengineering and Therapeutic Sciences, University of California, San Francisco, San Francisco, CA, United States
- Joy P Ku
- Department of Bioengineering, Stanford University, Stanford, CA, United States
- Jerry G Myers
- NASA Glenn Research Center, Cleveland, OH, United States
- Rajanikanth Vadigepalli
- Department of Pathology, Anatomy and Cell Biology, Daniel Baugh Institute for Functional Genomics and Computational Biology, Thomas Jefferson University, Philadelphia, PA, United States
- William W Lytton
- Department of Neurology, SUNY Downstate Medical Center, The State University of New York, New York, NY, United States
- Department of Physiology and Pharmacology, SUNY Downstate Medical Center, The State University of New York, New York, NY, United States
- Department of Neurology, Kings County Hospital Center, New York, NY, United States
|
21
|
Jordan J, Ippen T, Helias M, Kitayama I, Sato M, Igarashi J, Diesmann M, Kunkel S. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers. Front Neuroinform 2018; 12:2. PMID: 29503613; PMCID: PMC5820465; DOI: 10.3389/fninf.2018.00002.
Abstract
State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
Affiliation(s)
- Jakob Jordan
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Tammo Ippen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Itaru Kitayama
- Advanced Institute for Computational Science, RIKEN, Kobe, Japan
- Mitsuhisa Sato
- Advanced Institute for Computational Science, RIKEN, Kobe, Japan
- Jun Igarashi
- Computational Engineering Applications Unit, RIKEN, Wako, Japan
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Susanne Kunkel
- Department of Computational Science and Technology, School of Computer Science and Communication, KTH Royal Institute of Technology, Stockholm, Sweden
- Simulation Laboratory Neuroscience - Bernstein Facility for Simulation and Database Technology, Jülich Research Centre, Jülich, Germany
|
22
|
Dura-Bernal S, Neymotin SA, Kerr CC, Sivagnanam S, Majumdar A, Francis JT, Lytton WW. Evolutionary algorithm optimization of biological learning parameters in a biomimetic neuroprosthesis. IBM J Res Dev 2017; 61:6.1-6.14. PMID: 29200477; PMCID: PMC5708558; DOI: 10.1147/jrd.2017.2656758.
Abstract
Biomimetic simulation permits neuroscientists to better understand the complex neuronal dynamics of the brain. Embedding a biomimetic simulation in a closed-loop neuroprosthesis, which can read and write signals from the brain, will permit applications for amelioration of motor, psychiatric, and memory-related brain disorders. Biomimetic neuroprostheses require real-time adaptation to changes in the external environment, thus constituting an example of a dynamic data-driven application system. As model fidelity increases, so does the number of parameters and the complexity of finding appropriate parameter configurations. Instead of adapting synaptic weights via machine learning, we employed major biological learning methods: spike-timing dependent plasticity and reinforcement learning. We optimized the learning metaparameters using evolutionary algorithms, which were implemented in parallel and which used an island model approach to obtain sufficient speed. We employed these methods to train a cortical spiking model to utilize macaque brain activity, indicating a selected target, to drive a virtual musculoskeletal arm with realistic anatomical and biomechanical properties to reach to that target. The optimized system was able to reproduce macaque data from a comparable experimental motor task. These techniques can be used to efficiently tune the parameters of multiscale systems, linking realistic neuronal dynamics to behavior, and thus providing a useful tool for neuroscience and neuroprosthetics.
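A toy island-model optimizer conveys the parallel scheme mentioned above: independent populations evolve separately and periodically migrate their best individuals around a ring (illustrative Python; the operators, rates, and test function are invented for exposition and differ from the paper's setup):

```python
import random

def evolve_islands(fitness, n_islands=4, pop_size=10, gens=50,
                   migrate_every=10, seed=1):
    """Toy island-model evolutionary minimization of a 2D real genome.
    Each island evolves independently (Gaussian mutation + truncation
    selection); every migrate_every generations, each island's best
    individual is exported to the next island in a ring."""
    rng = random.Random(seed)
    islands = [[[rng.uniform(-5, 5) for _ in range(2)] for _ in range(pop_size)]
               for _ in range(n_islands)]
    for g in range(gens):
        for pop in islands:
            pop.sort(key=fitness)
            survivors = pop[:pop_size // 2]          # truncation selection
            children = [[x + rng.gauss(0, 0.3) for x in rng.choice(survivors)]
                        for _ in range(pop_size - len(survivors))]
            pop[:] = survivors + children
        if (g + 1) % migrate_every == 0:             # ring migration
            bests = [min(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop[pop.index(max(pop, key=fitness))] = bests[i - 1]
    return min((min(pop, key=fitness) for pop in islands), key=fitness)

# minimize a simple quadratic with optimum at (1, -2)
best = evolve_islands(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2)
```

Because the islands only synchronize at migration points, the scheme maps naturally onto parallel hardware, which is the property the paper exploits to make the metaparameter search tractable.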
Collapse
|
23
|
Naveros F, Garrido JA, Carrillo RR, Ros E, Luque NR. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks. Front Neuroinform 2017; 11:7. [PMID: 28223930 PMCID: PMC5293783 DOI: 10.3389/fninf.2017.00007] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/09/2016] [Accepted: 01/18/2017] [Indexed: 12/12/2022] Open
Abstract
Modeling and simulating the neural structures that make up our central nervous system is instrumental for deciphering the neural computations beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from the neural to the behavioral level. This paper focuses on overcoming the simulation problems (accuracy and performance) that derive from using higher levels of mathematical complexity at the neural level. This study proposes different techniques for simulating neural models with incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranging from low to high neural complexity). The studied techniques fall into two main families depending on how the neural-model dynamics are evaluated: event-driven or time-driven. Whereas event-driven techniques pre-compile and store the neural dynamics in look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation. We propose two modifications for the event-driven family: a look-up-table recombination to better cope with the incremental neural complexity, together with better handling of synchronous input activity. For the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running on CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications, which constitute the main contribution of this study, systematically outperform the traditional event- and time-driven techniques under increasing levels of neural complexity.
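The core idea of a bi-fixed-step scheme can be sketched in a few lines: a coarse fixed step is used while the membrane dynamics are slow, and the integrator switches to a finer fixed step in the stiff region near the firing threshold. The LIF parameters, the threshold-margin heuristic for switching steps, and the forward-Euler update below are illustrative assumptions for a toy sketch, not the paper's actual integration scheme.

```python
def lif_derivative(v, i_ext, tau=20.0, v_rest=-65.0, r_m=10.0):
    # Leaky integrate-and-fire membrane equation:
    # dv/dt = (-(v - v_rest) + R_m * I_ext) / tau   (mV/ms)
    return (-(v - v_rest) + r_m * i_ext) / tau

def bi_fixed_step_lif(i_ext, t_end=100.0, dt_coarse=1.0, dt_fine=0.1,
                      v_thresh=-50.0, v_reset=-65.0, margin=5.0):
    """Toy bi-fixed-step integration of a LIF neuron: use the coarse step while
    the voltage is far from threshold, and subdivide into fine fixed steps once
    it enters a margin below threshold, where the dynamics matter most."""
    v, t, spikes = v_reset, 0.0, []
    while t < t_end:
        # Assumed switching heuristic: fine step within `margin` mV of threshold.
        dt = dt_fine if v > v_thresh - margin else dt_coarse
        v += dt * lif_derivative(v, i_ext)  # forward-Euler update
        t += dt
        if v >= v_thresh:
            spikes.append(t)   # record the spike time
            v = v_reset        # reset after the spike
    return spikes
```

With a suprathreshold input the fine step is only active during the short pre-spike window, so most of the simulation advances at the coarse step; a subthreshold input never triggers the fine step at all. That trade is what lets the method keep accuracy near spikes without paying the cost of a uniformly small step.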
Collapse
Affiliation(s)
- Francisco Naveros
- Department of Computer Architecture and Technology, Research Centre for Information and Communication Technologies, University of Granada, Granada, Spain
| | - Jesus A Garrido
- Department of Computer Architecture and Technology, Research Centre for Information and Communication Technologies, University of Granada, Granada, Spain
| | - Richard R Carrillo
- Department of Computer Architecture and Technology, Research Centre for Information and Communication Technologies, University of Granada, Granada, Spain
| | - Eduardo Ros
- Department of Computer Architecture and Technology, Research Centre for Information and Communication Technologies, University of Granada, Granada, Spain
| | - Niceto R Luque
- Vision Institute, Aging in Vision and Action Lab, Paris, France; CNRS, INSERM, Pierre and Marie Curie University, Paris, France
| |
Collapse
|