1
Lu W, Zeng L, Wang J, Xiang S, Qi Y, Zheng Q, Xu N, Feng J. Imitating and exploring the human brain's resting and task-performing states via brain computing: scaling and architecture. Natl Sci Rev 2024; 11:nwae080. PMID: 38803564; PMCID: PMC11129584; DOI: 10.1093/nsr/nwae080.
Abstract
A computational human brain model with a voxel-wise assimilation method was established based on individual structural and functional imaging data. By quantitative metrics, we found that the more closely the brain model matches its biological counterpart in both scale and architecture, the more closely the assimilated model resembles the biological brain, both in the resting state and during tasks. The hypothesis that resting-state activity reflects internal body states was validated by the interoceptive circuit's capacity to enhance the similarity between the simulated model and the biological brain. We found that removing the connections from the primary visual cortex (V1) to downstream visual pathways significantly decreased the similarity between the model and its biological counterpart at the hippocampus, while only slightly affecting the whole brain. In conclusion, the model and methodology provide a solid quantitative framework for a digital twin brain, both for discovering the relationship between brain architecture and function and for digitally trying and testing diverse cognitive, medical and lesioning interventions that would otherwise be infeasible in real subjects.
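The paper's claims rest on quantitative similarity metrics between simulated and empirical brain activity. As a rough illustration of one common metric of this kind (our choice for this sketch, not necessarily the authors' exact metric), the functional-connectivity matrices of the two signals can be correlated:

```python
import numpy as np

def fc_similarity(sim_bold, emp_bold):
    """Pearson correlation between the upper triangles of the
    functional-connectivity (FC) matrices of two BOLD time series.

    sim_bold, emp_bold: arrays of shape (n_regions, n_timepoints).
    Returns a scalar in [-1, 1]; higher means the simulated brain
    reproduces the empirical FC more faithfully.
    """
    fc_sim = np.corrcoef(sim_bold)
    fc_emp = np.corrcoef(emp_bold)
    iu = np.triu_indices_from(fc_sim, k=1)  # off-diagonal entries only
    return float(np.corrcoef(fc_sim[iu], fc_emp[iu])[0, 1])
```

The same scalar can be computed per region (e.g. restricted to hippocampal rows of the FC matrix) to reproduce the kind of region-level comparison described in the abstract.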
Affiliation(s)
- Wenlian Lu
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Fudan University, Shanghai 200433, China
  - Shanghai Center for Mathematical Sciences, Fudan University, Shanghai 200433, China
- Longbin Zeng
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Jiexiang Wang
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Shitong Xiang
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Fudan University, Shanghai 200433, China
- Yang Qi
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Fudan University, Shanghai 200433, China
- Qibao Zheng
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Fudan University, Shanghai 200433, China
- Ningsheng Xu
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Jianfeng Feng
  - Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
  - Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (Fudan University), Ministry of Education, Fudan University, Shanghai 200433, China
  - Shanghai Center for Mathematical Sciences, Fudan University, Shanghai 200433, China
  - Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
  - Zhangjiang Fudan International Innovation Center, Fudan University, Shanghai 200433, China
2
Shinji Y, Okuno H, Hirata Y. Artificial cerebellum on FPGA: realistic real-time cerebellar spiking neural network model capable of real-world adaptive motor control. Front Neurosci 2024; 18:1220908. PMID: 38726031; PMCID: PMC11079192; DOI: 10.3389/fnins.2024.1220908.
Abstract
The cerebellum plays a central role in motor control and learning. Its neuronal network architecture, the firing characteristics of its component neurons, and the learning rules at their synapses are well understood anatomically and physiologically. A realistic artificial cerebellum with mimetic network architecture and synaptic plasticity mechanisms may allow us to analyze cerebellar information processing in the real world by applying it to the adaptive control of actual machines. Several artificial cerebellums have previously been constructed, but they require high-performance hardware to run in real time for real-world machine control. Here, we implemented an artificial cerebellum of 10⁴ spiking neuron models on a field-programmable gate array (FPGA), which is compact, lightweight, portable and low in power consumption. Three novel techniques are employed in the implementation: (1) 16-bit fixed-point operations with randomized rounding, (2) fully connected spike information transmission, and (3) alternative memory that uses pseudo-random number generators. We demonstrate that the FPGA artificial cerebellum runs in real time and that its component neuron models behave like those of the corresponding artificial cerebellum configured on a personal computer in Python. We applied the FPGA artificial cerebellum to the adaptive control of a machine in the real world and demonstrated that it adaptively reduces control error after sudden load changes. This is the first implementation and demonstration of a spiking artificial cerebellum on an FPGA applicable to real-world adaptive control. The FPGA artificial cerebellum may provide neuroscientific insights into cerebellar information processing in adaptive motor control and may be applied to various neuro-devices to augment and extend human motor control capabilities.
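Technique (1), randomized (stochastic) rounding, is what keeps 16-bit fixed-point arithmetic from accumulating systematic bias: a value is rounded up with probability equal to its fractional part, so the rounding error is zero on average. A minimal Python sketch of the idea (parameter names and defaults are ours, not the paper's):

```python
import math
import random

def quantize_stochastic(x, frac_bits=8, total_bits=16, rng=None):
    """Quantize x to a signed fixed-point grid with stochastic rounding.

    The scaled value is rounded up with probability equal to its
    fractional part, so quantization is unbiased in expectation --
    the property that makes low-precision neural simulation viable.
    """
    rng = rng or random.Random()
    scale = 1 << frac_bits
    scaled = x * scale
    lo = math.floor(scaled)
    q = lo + (1 if rng.random() < scaled - lo else 0)
    # saturate to the representable signed range
    q = max(-(1 << (total_bits - 1)), min((1 << (total_bits - 1)) - 1, q))
    return q / scale
```

Averaged over many calls, the quantized values converge to the original value even though each individual result sits on the coarse 16-bit grid.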
Affiliation(s)
- Yusuke Shinji
  - Department of Computer Science, Graduate School of Engineering, Chubu University, Kasugai, Japan
- Hirotsugu Okuno
  - Faculty of Information Science and Technology, Osaka Institute of Technology, Hirakata, Japan
- Yutaka Hirata
  - Department of Artificial Intelligence and Robotics, College of Engineering, Chubu University, Kasugai, Japan
  - Center for Mathematical Science and Artificial Intelligence, Chubu University, Kasugai, Japan
  - Academy of Emerging Sciences, Chubu University, Kasugai, Japan
3
Jimbo T, Matsuo H, Imoto Y, Sodemura T, Nishimori M, Fukui Y, Hayashi T, Furuyashiki T, Yokoyama R. Accelerated preprocessing of large numbers of brain images by parallel computing on supercomputers. Sci Rep 2023; 13:19901. PMID: 37963952; PMCID: PMC10646110; DOI: 10.1038/s41598-023-46073-4.
Abstract
Preprocessing is the first step in brain image analysis; it improves the overall quality and reliability of the results. However, it is computationally demanding and time-consuming, particularly when handling and parcellating the intricately folded cortical ribbon of the human brain. In this study, we aimed to shorten the analysis time by preprocessing 1410 brain images simultaneously on one of the world's highest-performing supercomputers, "Fugaku". FreeSurfer was used as a benchmark preprocessing software for cortical surface reconstruction. All the brain images were processed simultaneously and successfully analyzed in a computation time of 17.33 h. This result indicates that using a supercomputer for brain image preprocessing allows big-data analysis to be completed quickly and flexibly, suggesting that supercomputers could be used to expand large-scale data analysis and parameter optimization of preprocessing in the future.
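The approach works because cortical-surface preprocessing is independent per subject, so the 1410 images can be farmed out in parallel. The paper distributes jobs across Fugaku compute nodes; the single-machine sketch below conveys the same idea with a local worker pool (`recon-all` is FreeSurfer's standard reconstruction command; the `raw/<id>.nii.gz` layout is our own assumption):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def preprocess(subject_id: str) -> int:
    """Run FreeSurfer cortical reconstruction for one subject."""
    cmd = ["recon-all", "-s", subject_id,
           "-i", f"raw/{subject_id}.nii.gz", "-all"]
    return subprocess.run(cmd).returncode

def run_all(subjects, fn=preprocess, workers=8):
    """Fan the independent per-subject jobs out across workers and
    collect a per-subject result. On a supercomputer the same pattern
    is realized with many compute nodes instead of local threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(subjects, pool.map(fn, subjects)))
```

Because each job is subprocess-bound, threads suffice here; the per-subject function is injectable (`fn`) so the fan-out logic can be exercised without FreeSurfer installed.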
Affiliation(s)
- Takehiro Jimbo
  - Japan Research Activity Support Inc., Kobe, Japan
  - Department of Urology, Kobe University Graduate School of Medicine, Kobe, Japan
  - Laboratory for Brain Connectomics Imaging, RIKEN Center for Biosystems Dynamics Research, Kobe, Japan
- Hidetoshi Matsuo
  - Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Japan
  - Mediest Co., Kobe, Japan
- Yuya Imoto
  - Japan Research Activity Support Inc., Kobe, Japan
- Makoto Nishimori
  - Mediest Co., Kobe, Japan
  - Division of Molecular Epidemiology, Kobe University Graduate School of Medicine, Kobe, Japan
- Yoshinari Fukui
  - Department of Mathematics, Faculty of Science, Tokyo University of Science, Tokyo, Japan
- Takuya Hayashi
  - Laboratory for Brain Connectomics Imaging, RIKEN Center for Biosystems Dynamics Research, Kobe, Japan
  - Department of Brain Connectomics, Kyoto University Graduate School of Medicine, Kyoto, Japan
- Tomoyuki Furuyashiki
  - Division of Pharmacology, Graduate School of Medicine, Kobe University, Kobe, Japan
- Ryoichi Yokoyama
  - Department of Extended Intelligence for Medicine, The Ishii-Ishibashi Laboratory, Keio University, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582, Japan
  - Yokoyama Lab, Tokyo, Japan
4
Schmitt FJ, Rostami V, Nawrot MP. Efficient parameter calibration and real-time simulation of large-scale spiking neural networks with GeNN and NEST. Front Neuroinform 2023; 17:941696. PMID: 36844916; PMCID: PMC9950635; DOI: 10.3389/fninf.2023.941696.
Abstract
Spiking neural networks (SNNs) represent the state-of-the-art approach to biologically realistic modeling of nervous system function. Systematic calibration of multiple free model parameters is necessary to achieve robust network function, and it demands high computing power and large memory resources. Special requirements arise from closed-loop model simulation in virtual environments and from real-time simulation in robotic applications. Here, we compare two complementary approaches to efficient large-scale and real-time SNN simulation. The widely used NEural Simulation Tool (NEST) parallelizes simulation across multiple CPU cores. The GPU-enhanced Neural Network (GeNN) simulator uses the highly parallel GPU architecture to gain simulation speed. We quantify fixed and variable simulation costs on single machines with different hardware configurations. As a benchmark model, we use a spiking cortical attractor network with a topology of densely connected excitatory and inhibitory neuron clusters, with homogeneous or distributed synaptic time constants, and compare it to a random balanced network. We show that simulation time scales linearly with the simulated biological model time and, for large networks, approximately linearly with model size, as dominated by the number of synaptic connections. Additional fixed costs with GeNN are almost independent of model size, while fixed costs with NEST increase linearly with model size. We demonstrate how GeNN can be used to simulate networks with up to 3.5 × 10⁶ neurons (> 3 × 10¹² synapses) on a high-end GPU, and up to 250,000 neurons (25 × 10⁹ synapses) on a low-cost GPU. Real-time simulation was achieved for networks with 100,000 neurons. Network calibration and parameter grid search can be achieved efficiently using batch processing. We discuss the advantages and disadvantages of both approaches for different use cases.
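The reported scaling behaviour (a fixed setup cost plus a variable cost linear in both biological time and synapse count) can be summarized in a simple cost model. The coefficients below are hypothetical placeholders for illustration, not measured values from the paper:

```python
def wall_clock(n_synapses, t_bio, fixed=10.0, k_var=1e-9):
    """Illustrative cost model for the scaling the paper reports:
    runtime = fixed setup cost + variable cost that grows linearly
    with simulated biological time and with synapse count.
    `fixed` and `k_var` (seconds, seconds per synapse-second) are
    hypothetical and would be fitted per simulator and hardware.
    """
    return fixed + k_var * n_synapses * t_bio

def realtime_capable(n_synapses, t_bio, **kw):
    """True if the modeled run would finish faster than real time."""
    return wall_clock(n_synapses, t_bio, **kw) < t_bio
```

Fitting `fixed` and `k_var` separately per simulator captures the paper's key contrast: with GeNN the fixed cost is nearly size-independent, while with NEST it grows with model size.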
Affiliation(s)
- Felix Johannes Schmitt
  - Computational Systems Neuroscience, Institute of Zoology, University of Cologne, Cologne, Germany
- Vahid Rostami
  - Computational Systems Neuroscience, Institute of Zoology, University of Cologne, Cologne, Germany
5
Kumar G, Ma CHE. Toward a cerebello-thalamo-cortical computational model of spinocerebellar ataxia. Neural Netw 2023; 162:541-556. PMID: 37023628; DOI: 10.1016/j.neunet.2023.01.045.
Abstract
Computational neural network modelling is an emerging approach for optimizing drug treatment of neurological disorders and fine-tuning rehabilitation strategies. In the current study, we constructed a cerebello-thalamo-cortical computational neural network model to simulate a mouse model of cerebellar ataxia (pcd5J mice) by manipulating cerebellar bursts through reduction of GABAergic inhibitory input. Cerebellar output neurons projected to the thalamus and were bidirectionally connected with the cortical network. Our results showed that reduced inhibitory input in the cerebellum shaped cortical local field potential (LFP) dynamics, generating specific motor outputs with oscillations in the theta, alpha, and beta bands in the computational model as well as in mouse motor cortical neurons. The therapeutic potential of deep brain stimulation (DBS) was tested in the computational model by increasing the sensory input to restore cortical output. Ataxia mice showed normalization of the motor cortex LFP after cerebellar DBS. We provide a novel computational modelling approach to investigate the effect of DBS by mimicking cerebellar ataxia involving degeneration of Purkinje cells. Simulated neural activity coincides with findings from neural recordings of ataxia mice. Our computational model could thus represent cerebellar pathologies and provide insight into how to improve disease symptoms by restoring neuronal electrophysiological properties using DBS.
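The comparison between simulated and recorded LFPs hinges on band-limited spectral power. A minimal sketch of how theta/alpha/beta power can be extracted from an LFP trace with Welch's method (band edges follow common conventions and are our assumption, not necessarily the paper's exact definitions):

```python
import numpy as np
from scipy.signal import welch

# conventional band edges in Hz; exact definitions vary across studies
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_power(lfp, fs, bands=BANDS):
    """Mean spectral power of an LFP trace within each frequency band,
    estimated from Welch's power spectral density."""
    f, psd = welch(lfp, fs=fs, nperseg=min(len(lfp), 1024))
    return {name: float(psd[(f >= lo) & (f < hi)].mean())
            for name, (lo, hi) in bands.items()}
```

Applying the same function to the model's cortical LFP before and after simulated DBS is one way to quantify the "normalization" the abstract describes.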
Affiliation(s)
- Gajendra Kumar
  - Department of Neuroscience, City University of Hong Kong, Tat Chee Avenue, Hong Kong Special Administrative Region
- Chi Him Eddie Ma
  - Department of Neuroscience, City University of Hong Kong, Tat Chee Avenue, Hong Kong Special Administrative Region
6
Haufler D, Ito S, Koch C, Arkhipov A. Simulations of cortical networks using spatially extended conductance-based neuronal models. J Physiol 2022. PMID: 36567262; PMCID: PMC10290729; DOI: 10.1113/jp284030.
Abstract
The Hodgkin-Huxley model of action potential generation and propagation, published in the Journal of Physiology in 1952, initiated the field of biophysically detailed computational modelling in neuroscience, which has expanded to encompass a variety of species and components of the nervous system. Here we review the developments in this area with a focus on efforts in the community towards modelling the mammalian neocortex using spatially extended conductance-based neuronal models. The Hodgkin-Huxley formalism and related foundational contributions, such as Rall's cable theory, remain widely used in these efforts to the current day. We argue that at present the field is undergoing a qualitative change due to new very rich datasets describing the composition, connectivity and functional activity of cortical circuits, which are being integrated systematically into large-scale network models. This trend, combined with the accelerating development of convenient software tools supporting such complex modelling projects, is giving rise to highly detailed models of the cortex that are extensively constrained by the data, enabling computational investigation of a multitude of questions about cortical structure and function.
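The Hodgkin-Huxley formalism the review takes as its starting point can be stated compactly. The membrane potential V obeys a current-balance equation with sodium, potassium and leak conductances, and each gating variable relaxes with voltage-dependent rates:

```latex
C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h\,(V - E_{\mathrm{Na}})
                    - \bar{g}_{\mathrm{K}}\, n^4\,(V - E_{\mathrm{K}})
                    - g_L\,(V - E_L) + I_{\mathrm{ext}},
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x,
\quad x \in \{m, h, n\}.
```

In the spatially extended models the review surveys, Rall's cable theory adds a spatial coupling term, so that V becomes a function of position along the dendrite as well as of time.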
Affiliation(s)
- Darrell Haufler
  - Mindscope Program, Allen Institute, Seattle, Washington, USA
- Shinya Ito
  - Mindscope Program, Allen Institute, Seattle, Washington, USA
- Christof Koch
  - Mindscope Program, Allen Institute, Seattle, Washington, USA
- Anton Arkhipov
  - Mindscope Program, Allen Institute, Seattle, Washington, USA
7
Vijayan A, Diwakar S. A cerebellum inspired spiking neural network as a multi-model for pattern classification and robotic trajectory prediction. Front Neurosci 2022; 16:909146. DOI: 10.3389/fnins.2022.909146.
Abstract
Spiking neural networks were introduced to understand spatiotemporal information processing in neurons and have found application in pattern encoding, data discrimination, and classification. Bio-inspired network architectures are considered for event-driven tasks, and scientists have explored different theories based on architecture and function. Motor tasks, for example, have networks inspired by cerebellar architecture, where the granular layer recodes sparse representations of the mossy fiber (MF) inputs and plays a major role in motor learning. Using abstractions from cerebellar connections and the learning rules of a deep learning network (DLN), patterns were discriminated within datasets, and the same algorithm was used for trajectory optimization. In the current work, we implemented a cerebellum-inspired spiking neural network with the dynamics of cerebellar neurons and the learning mechanisms attributed to the granular layer, the Purkinje cell (PC) layer, and the cerebellar nuclei, interconnected by excitatory and inhibitory synapses. The model's pattern discrimination capability was tested on two tasks using standard machine learning (ML) datasets and on following the trajectory of a low-cost, sensor-free robotic articulator. Tuned for supervised learning, the cerebellum-inspired network algorithm produced more generalized models than data-specific precision models on smaller training datasets. The model showed an accuracy of 72%, comparable to standard ML algorithms such as MLP (78%), Dl4jMlpClassifier (64%), RBFNetwork (71.4%), and libSVM-linear (85.7%). The cerebellar model increased the network's capability and decreased storage, enabling faster computation. Additionally, the network model could implicitly reconstruct the trajectory of a 6-degree-of-freedom (DOF) robotic arm with a low error rate by reconstructing the kinematic parameters. The variability between the actual and predicted trajectory points was ±3 cm (while moving to a position in a cuboid space of 25 × 30 × 40 cm). Although only a few of the known types of plasticity in the cerebellum were implemented as learning rules, the network model showed generalized processing capability for a range of signals, modulating the data through the interconnected neural populations. In addition to potential use in sensor-free or feed-forward controllers for robotic arms and as a generalized pattern classification algorithm, this model has implications for motor learning theory.
8
Mascart C, Scarella G, Reynaud-Bouret P, Muzy A. Scalability of Large Neural Network Simulations via Activity Tracking With Time Asynchrony and Procedural Connectivity. Neural Comput 2022; 34:1915-1943. PMID: 35896155; DOI: 10.1162/neco_a_01524.
Abstract
We present a new algorithm to efficiently simulate random models of large neural networks satisfying the property of time asynchrony. The model parameters (average firing rate, number of neurons, synaptic connection probability, and postsynaptic duration) are of the order of magnitude of a small mammalian brain or of human brain areas. Through the use of activity tracking and procedural connectivity (dynamical regeneration of synapses), the computational and memory complexities of this algorithm are proved to be theoretically linear in the number of neurons. These results are experimentally validated by sequential simulations of millions of neurons and billions of synapses, running in a few minutes on a single thread of a standard desktop computer.
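Procedural connectivity is what makes the memory footprint linear in the number of neurons: a neuron's synapse list is never stored, but deterministically regenerated from a per-neuron reseeded pseudo-random number generator each time it is needed. A minimal sketch of the idea (function name and seeding scheme are illustrative, not the paper's code):

```python
import random

def outgoing_targets(pre_id, n_neurons, p_connect, seed=42):
    """Regenerate neuron `pre_id`'s postsynaptic targets on demand.

    The PRNG is reseeded deterministically per neuron, so every spike
    event sees exactly the same Erdos-Renyi-style target list while
    memory stays O(1) per neuron instead of O(n * p) synapses stored.
    """
    rng = random.Random(seed * 1_000_003 + pre_id)
    return [post for post in range(n_neurons)
            if post != pre_id and rng.random() < p_connect]
```

The O(n) regeneration cost per spike is paid only for neurons that actually fire, which is where the paper's complementary activity-tracking technique comes in.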
Affiliation(s)
- Gilles Scarella
  - Université Côte d'Azur, CNRS, I3S, France
  - Université Côte d'Azur, CNRS, LJAD, 06103 Nice, France
9
Feldotto B, Eppler JM, Jimenez-Romero C, Bignamini C, Gutierrez CE, Albanese U, Retamino E, Vorobev V, Zolfaghari V, Upton A, Sun Z, Yamaura H, Heidarinejad M, Klijn W, Morrison A, Cruz F, McMurtrie C, Knoll AC, Igarashi J, Yamazaki T, Doya K, Morin FO. Deploying and Optimizing Embodied Simulations of Large-Scale Spiking Neural Networks on HPC Infrastructure. Front Neuroinform 2022; 16:884180. PMID: 35662903; PMCID: PMC9160925; DOI: 10.3389/fninf.2022.884180.
Abstract
Simulating the brain-body-environment trinity in closed loop is an attractive proposal for investigating how perception, motor activity and interactions with the environment shape brain activity, and vice versa. The relevance of this embodied approach, however, hinges entirely on the modeled complexity of the various simulated phenomena. In this article, we introduce a software framework that is capable of simulating large-scale, biologically realistic networks of spiking neurons embodied in a biomechanically accurate musculoskeletal system that interacts with a physically realistic virtual environment. We deploy this framework on the high-performance computing resources of the EBRAINS research infrastructure and investigate its scaling performance by distributing computation across an increasing number of interconnected compute nodes. Our architecture is based on on-demand compute nodes as well as persistent virtual machines; this provides a high-performance simulation environment that is accessible to multi-domain users without expert knowledge, with a view to enabling users to instantiate and control simulations at custom scale via a web-based graphical user interface. Our simulation environment, entirely open source, is based on the Neurorobotics Platform developed in the context of the Human Brain Project and on the NEST simulator. We characterize the capabilities of our parallelized architecture for large-scale embodied brain simulations through two benchmark experiments, investigating the effects of scaling compute resources on performance defined in terms of experiment runtime, brain instantiation time and simulation time. The first benchmark is based on a large-scale balanced network; the second is a multi-region embodied brain simulation consisting of more than a million neurons and a billion synapses. Both benchmarks clearly show that scaling compute resources improves the aforementioned performance metrics in a near-linear fashion. The second benchmark in particular is indicative of both the potential and the limitations of a highly distributed simulation in terms of the trade-off between computation speed and resource cost. Our simulation architecture is being prepared for general access as an EBRAINS service, thereby offering a community-wide tool with a unique workflow that should provide momentum to the investigation of closed-loop embodiment within the computational neuroscience community.
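"Near-linear scaling" in such benchmarks is conventionally quantified as speedup and parallel efficiency relative to a baseline node count; efficiency close to 1.0 means adding nodes keeps paying off. A small helper to compute both from benchmark timings (the metric definitions are the standard ones, not taken from the paper's code):

```python
def scaling_metrics(runtimes):
    """Speedup and parallel efficiency relative to the smallest node
    count benchmarked. runtimes: {n_nodes: wall_clock_seconds}.
    Near-linear scaling corresponds to efficiency staying near 1.0;
    the speed/cost trade-off shows up as efficiency dropping while
    speedup still grows.
    """
    base_n = min(runtimes)
    base_t = runtimes[base_n]
    return {n: {"speedup": base_t / t,
                "efficiency": (base_t / t) / (n / base_n)}
            for n, t in runtimes.items()}
```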
Affiliation(s)
- Benedikt Feldotto
  - Robotics, Artificial Intelligence and Real-Time Systems, Faculty of Informatics, Technical University of Munich, Munich, Germany
  - *Correspondence: Benedikt Feldotto
- Jochen Martin Eppler
  - Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
- Cristian Jimenez-Romero
  - Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
- Carlos Enrique Gutierrez
  - Neural Computation Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
- Ugo Albanese
  - Department of Excellence in Robotics and AI, The BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera, Italy
- Eloy Retamino
  - Department of Computer Architecture and Technology, Research Centre for Information and Communication Technologies, University of Granada, Granada, Spain
- Viktor Vorobev
  - Robotics, Artificial Intelligence and Real-Time Systems, Faculty of Informatics, Technical University of Munich, Munich, Germany
- Vahid Zolfaghari
  - Robotics, Artificial Intelligence and Real-Time Systems, Faculty of Informatics, Technical University of Munich, Munich, Germany
- Alex Upton
  - Swiss National Supercomputing Centre (CSCS), ETH Zurich, Lugano, Switzerland
- Zhe Sun
  - Image Processing Research Team, Center for Advanced Photonics, RIKEN, Wako, Japan
  - Computational Engineering Applications Unit, Head Office for Information Systems and Cybersecurity, RIKEN, Wako, Japan
- Hiroshi Yamaura
  - Graduate School of Informatics and Engineering, The University of Electro-Communications, Tokyo, Japan
- Morteza Heidarinejad
  - Computational Engineering Applications Unit, Head Office for Information Systems and Cybersecurity, RIKEN, Wako, Japan
- Wouter Klijn
  - Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
- Abigail Morrison
  - Simulation and Data Lab Neuroscience, Jülich Supercomputing Centre (JSC), Institute for Advanced Simulation, JARA, Forschungszentrum Jülich GmbH, Jülich, Germany
  - Jülich Research Centre, Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), JARA BRAIN Institute I, Jülich, Germany
  - Computer Science 3-Software Engineering, RWTH Aachen University, Aachen, Germany
- Felipe Cruz
  - Swiss National Supercomputing Centre (CSCS), ETH Zurich, Lugano, Switzerland
- Colin McMurtrie
  - Swiss National Supercomputing Centre (CSCS), ETH Zurich, Lugano, Switzerland
- Alois C. Knoll
  - Robotics, Artificial Intelligence and Real-Time Systems, Faculty of Informatics, Technical University of Munich, Munich, Germany
- Jun Igarashi
  - Computational Engineering Applications Unit, Head Office for Information Systems and Cybersecurity, RIKEN, Wako, Japan
  - Center for Computational Science, RIKEN, Kobe, Japan
- Tadashi Yamazaki
  - Graduate School of Informatics and Engineering, The University of Electro-Communications, Tokyo, Japan
- Kenji Doya
  - Neural Computation Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
- Fabrice O. Morin
  - Robotics, Artificial Intelligence and Real-Time Systems, Faculty of Informatics, Technical University of Munich, Munich, Germany
10
Bouattour Y, Sautou V, Hmede R, El Ouadhi Y, Gouot D, Chennell P, Lapusta Y, Chapelle F, Lemaire JJ. A Minireview on Brain Models Simulating Geometrical, Physical, and Biochemical Properties of the Human Brain. Front Bioeng Biotechnol 2022; 10:818201. PMID: 35419353; PMCID: PMC8996142; DOI: 10.3389/fbioe.2022.818201.
Abstract
There is a growing body of evidence that brain surrogates will be of great interest to researchers and physicians in the medical field. They are currently used mainly for education and training purposes, or to verify the appropriate functionality of medical devices. Depending on the purpose, a variety of materials have been used, with specific and accurate mechanical and biophysical properties. More recently, surrogates have been used to assess the biocompatibility of implantable devices, but they are still not validated for studying the migration of leaching components from devices. This minireview shows the large diversity of approaches to and uses of brain phantoms, which occasionally converge. All these phantoms are complementary to numerical models, and each benefits reciprocally from the other's advances. The review also suggests avenues of research for the analysis of leaching components from implantable devices.
Affiliation(s)
- Yassine Bouattour
  - Université Clermont Auvergne, CHU Clermont Ferrand, Clermont Auvergne INP, CNRS, ICCF, F-63000, Clermont-Ferrand, France
  - *Correspondence: Yassine Bouattour; Jean-Jacques Lemaire
- Valérie Sautou
  - Université Clermont Auvergne, CHU Clermont Ferrand, Clermont Auvergne INP, CNRS, ICCF, F-63000, Clermont-Ferrand, France
- Rodayna Hmede
  - Université Clermont Auvergne, CNRS, Clermont Auvergne INP, Institut Pascal, F-63000, Clermont-Ferrand, France
- Youssef El Ouadhi
  - Université Clermont Auvergne, CNRS, Clermont Auvergne INP, Institut Pascal, F-63000, Clermont-Ferrand, France
  - Service de Neurochirurgie, CHU Clermont Ferrand, F-63000, Clermont-Ferrand, France
- Dimitri Gouot
  - Université Clermont Auvergne, CNRS, Clermont Auvergne INP, Institut Pascal, F-63000, Clermont-Ferrand, France
- Philip Chennell
  - Université Clermont Auvergne, CHU Clermont Ferrand, Clermont Auvergne INP, CNRS, ICCF, F-63000, Clermont-Ferrand, France
- Yuri Lapusta
  - Université Clermont Auvergne, CNRS, Clermont Auvergne INP, Institut Pascal, F-63000, Clermont-Ferrand, France
- Frédéric Chapelle
  - Université Clermont Auvergne, CNRS, Clermont Auvergne INP, Institut Pascal, F-63000, Clermont-Ferrand, France
- Jean-Jacques Lemaire
  - Université Clermont Auvergne, CNRS, Clermont Auvergne INP, Institut Pascal, F-63000, Clermont-Ferrand, France
  - Service de Neurochirurgie, CHU Clermont Ferrand, F-63000, Clermont-Ferrand, France
  - *Correspondence: Yassine Bouattour; Jean-Jacques Lemaire
11
Kobayashi T, Kuriyama R, Yamazaki T. Testing an Explicit Method for Multi-compartment Neuron Model Simulation on a GPU. Cognit Comput 2021. DOI: 10.1007/s12559-021-09942-6.
12
Kuriyama R, Casellato C, D'Angelo E, Yamazaki T. Real-Time Simulation of a Cerebellar Scaffold Model on Graphics Processing Units. Front Cell Neurosci 2021; 15:623552. PMID: 33897369; PMCID: PMC8058369; DOI: 10.3389/fncel.2021.623552.
Abstract
Large-scale simulation of detailed computational models of neuronal microcircuits plays a prominent role in reproducing and predicting the dynamics of the microcircuits. To reconstruct a microcircuit, one must choose neuron and synapse models, placements, connectivity, and numerical simulation methods according to anatomical and physiological constraints. For reconstruction and refinement, it is useful to be able to replace one module easily while leaving the others as they are. One way to achieve this is via a scaffolding approach, in which a simulation code is built on independent modules for placements, connections, and network simulations. Owing to this modularity, the approach enables researchers to improve the performance of the entire simulation by simply replacing a problematic module with an improved one. Casali et al. (2019) developed a spiking network model of the cerebellar microcircuit using this approach; while it reproduces the electrophysiological properties of cerebellar neurons, it requires excessive computational time. Here, we followed this scaffolding approach and replaced the simulation module with an accelerated version on graphics processing units (GPUs). Our cerebellar scaffold model ran roughly 100 times faster than the original version. In fact, our model is able to run faster than real time, with good weak and strong scaling properties. To demonstrate an application of real-time simulation, we implemented synaptic plasticity mechanisms at parallel fiber-Purkinje cell synapses and carried out simulation of behavioral experiments known as gain adaptation of the optokinetic response. We confirmed that the computer simulation reproduced the experimental findings while completing in real time: simulating 2 s of biological time completed within 750 ms. These results suggest that the scaffolding approach is a promising concept for gradual development and refactoring of simulation codes for large-scale elaborate microcircuits. Moreover, a real-time version of the cerebellar scaffold model, which is enabled by parallel computing technology owing to GPUs, may be useful for large-scale simulations and engineering applications that require real-time signal processing and motor control.
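The real-time claim quoted in the abstract above can be made concrete with a small sketch. This is illustrative only; the function name and interface are not from the cited paper, only the figures (2 s of biological time simulated within 750 ms, versus roughly 100x speedup over the original code) come from the abstract:

```python
def realtime_factor(wall_clock_s: float, biological_s: float) -> float:
    """Wall-clock seconds of computation needed per second of biological time.

    A value below 1.0 means the simulation runs faster than real time.
    """
    return wall_clock_s / biological_s

# Figures reported in the abstract: 2 s of biological time in 750 ms.
factor = realtime_factor(wall_clock_s=0.75, biological_s=2.0)
print(factor)        # 0.375
print(factor < 1.0)  # True -> faster than real time
```

By this measure the GPU version needs only 0.375 s of computation per biological second, which is what makes closed-loop applications such as the optokinetic-response experiment feasible.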
Affiliation(s)
- Rin Kuriyama
- Graduate School of Informatics and Engineering, The University of Electro-Communications, Tokyo, Japan
- Claudia Casellato
- Neurophysiology Unit, Neurocomputational Laboratory, Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Egidio D'Angelo
- Neurophysiology Unit, Neurocomputational Laboratory, Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- IRCCS Mondino Foundation, Pavia, Italy
- Tadashi Yamazaki
- Graduate School of Informatics and Engineering, The University of Electro-Communications, Tokyo, Japan
13
Yamazaki T, Igarashi J, Yamaura H. Human-scale Brain Simulation via Supercomputer: A Case Study on the Cerebellum. Neuroscience 2021; 462:235-246. [PMID: 33482329 DOI: 10.1016/j.neuroscience.2021.01.014] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2020] [Revised: 12/30/2020] [Accepted: 01/06/2021] [Indexed: 01/03/2023]
Abstract
Performance of supercomputers has been steadily and exponentially increasing for the past 20 years, and is expected to increase further. This unprecedented computational power enables us to build and simulate large-scale neural network models composed of tens of billions of neurons and tens of trillions of synapses with detailed anatomical connections and realistic physiological parameters. Such "human-scale" brain simulation could be considered a milestone in computational neuroscience and even in general neuroscience. Towards this milestone, it is mandatory to introduce modern high-performance computing technology into neuroscience research. In this article, we provide an introductory landscape about large-scale brain simulation on supercomputers from the viewpoints of computational neuroscience and modern high-performance computing technology for specialists in experimental as well as computational neurosciences. This introduction to modeling and simulation methods is followed by a review of various representative large-scale simulation studies conducted to date. Then, we direct our attention to the cerebellum, with a review of more simulation studies specific to that region. Furthermore, we present recent simulation results of a human-scale cerebellar network model composed of 86 billion neurons on the Japanese flagship supercomputer K (now retired). Finally, we discuss the necessity and importance of human-scale brain simulation, and suggest future directions of such large-scale brain simulation research.
Affiliation(s)
- Tadashi Yamazaki
- Graduate School of Informatics and Engineering, The University of Electro-Communications, Japan
- Hiroshi Yamaura
- Graduate School of Informatics and Engineering, The University of Electro-Communications, Japan