1. Virtual Intelligence: A Systematic Review of the Development of Neural Networks in Brain Simulation Units. Brain Sci 2022; 12:brainsci12111552. DOI: 10.3390/brainsci12111552
Abstract
The functioning of the brain remains a complex and enigmatic phenomenon. From Descartes' early view of this organ as the vehicle of the mind to contemporary studies that treat the brain as a system with emergent activities of primary and higher order, it has been the object of continuous exploration. Deeper study of brain function has become possible through imaging techniques, the implementation of digital platforms and simulators in different programming languages, and the use of multiple processors to approach the speed at which synaptic processes execute in the brain. The use of diverse computational architectures raises many questions about the possible scope of disciplines such as computational neuroscience in the study of the brain, and about how far such knowledge can be embedded in devices with the support of information technology (IT). One of the main interests of cognitive science is the possibility of realizing human-like intelligence in an artificial system or mechanism. This paper draws on the principal articles from three databases oriented toward the computational sciences (EBSCOhost Web, IEEE Xplore and Compendex Engineering Village) to characterize the current objectives of neural-network research on the brain. A possible use of this kind of technology is to develop artificial intelligence (AI) systems that can replicate more complex human brain tasks, such as those involving consciousness. The results summarize the principal findings and topics in studies of neural networks in computational neuroscience. One notable development is the use of neural networks as the basis of many computational architectures, combined with techniques such as neuromorphic chips, MRI imaging and brain-computer interfaces (BCI), to enhance the capacity to simulate brain activity. This article reviews and analyzes studies on the development of computational architectures that address various brain activities through neural networks, with the aim of determining the orientation and main lines of research on this topic and identifying routes for interdisciplinary collaboration.
2. Dabelow L, Ueda M. Three learning stages and accuracy–efficiency tradeoff of restricted Boltzmann machines. Nat Commun 2022; 13:5474. PMID: 36115845; PMCID: PMC9482660; DOI: 10.1038/s41467-022-33126-x
Abstract
Restricted Boltzmann Machines (RBMs) offer a versatile architecture for unsupervised machine learning that can in principle approximate any target probability distribution with arbitrary accuracy. However, the learned probability distribution is usually not directly accessible due to the model's computational complexity, and Markov-chain sampling is invoked to analyze it. For training and eventual applications, it is thus desirable to have a sampler that is both accurate and efficient. We highlight that these two goals generally compete with each other and cannot be achieved simultaneously. More specifically, we identify and quantitatively characterize three regimes of RBM learning: independent learning, where the accuracy improves without losing efficiency; correlation learning, where higher accuracy entails lower efficiency; and degradation, where both accuracy and efficiency no longer improve or even deteriorate. These findings are based on numerical experiments and heuristic arguments. Restricted Boltzmann Machines are unsupervised machine learning models that have been applied to various tasks, from image analysis to many-body physics. The authors elaborate on the interplay between the accuracy and efficiency of this model and identify possible balance regimes for applications.
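The sampling bottleneck described above arises in the standard block Gibbs chain used to draw samples from an RBM. As a point of reference, here is a minimal sketch of such a chain; it is not from the paper, and all sizes and parameter values are illustrative.

```python
# Minimal sketch: block Gibbs sampling in a binary RBM (numpy only).
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 8, 4
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # weights (illustrative)
b = np.zeros(n_visible)                                # visible biases
c = np.zeros(n_hidden)                                 # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One block-Gibbs update: sample hidden given visible, then visible given hidden."""
    h = (rng.random(n_hidden) < sigmoid(v @ W + c)).astype(float)
    return (rng.random(n_visible) < sigmoid(W @ h + b)).astype(float)

# Successive samples are correlated; strongly correlated chains are exactly
# the "efficiency" cost that the paper trades off against accuracy.
v = rng.integers(0, 2, n_visible).astype(float)
samples = [v := gibbs_step(v) for _ in range(1000)]
```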
3. Müller E, Schmitt S, Mauch C, Billaudelle S, Grübl A, Güttler M, Husmann D, Ilmberger J, Jeltsch S, Kaiser J, Klähn J, Kleider M, Koke C, Montes J, Müller P, Partzsch J, Passenberg F, Schmidt H, Vogginger B, Weidner J, Mayr C, Schemmel J. The operating system of the neuromorphic BrainScaleS-1 system. Neurocomputing 2022. DOI: 10.1016/j.neucom.2022.05.081
4. Klassert R, Baumbach A, Petrovici MA, Gärttner M. Variational learning of quantum ground states on spiking neuromorphic hardware. iScience 2022; 25:104707. PMID: 35992070; PMCID: PMC9386107; DOI: 10.1016/j.isci.2022.104707
Abstract
Recent research has demonstrated the usefulness of neural networks as variational ansatz functions for quantum many-body states. However, high-dimensional sampling spaces and transient autocorrelations confront these approaches with a challenging computational bottleneck. Compared to conventional neural networks, physical model devices offer a fast, efficient and inherently parallel substrate capable of related forms of Markov chain Monte Carlo sampling. Here, we demonstrate the ability of a neuromorphic chip to represent the ground states of quantum spin models by variational energy minimization. We develop a training algorithm and apply it to the transverse field Ising model, showing good performance at moderate system sizes (N ≤ 10). A systematic hyperparameter study shows that performance depends on sample quality, which is limited by temporal parameter variations on the analog neuromorphic chip. Our work thus provides an important step towards harnessing the capabilities of neuromorphic hardware for tackling the curse of dimensionality in quantum many-body problems.
Highlights:
- Variational scheme for representing quantum ground states with neuromorphic hardware
- Accelerated physical system yields system-size independent sample generation time
- Accurate learning of ground states across a quantum phase transition
- Detailed analysis of algorithmic and technical limitations
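To make the variational scheme concrete, the sketch below estimates the variational energy of the transverse-field Ising model by Monte Carlo sampling. It is a hedged illustration, not the paper's method: the toy log-linear ansatz stands in for the spiking-network sampler, and all couplings and parameters are made up for the example.

```python
# Hedged sketch: Monte Carlo estimate of <E> = sum_s |psi(s)|^2 E_loc(s)
# for the transverse-field Ising chain, with a positive ansatz
# psi(s) = exp(f(s)/2). Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(1)
N, J, h = 6, 1.0, 0.5               # chain length and couplings (illustrative)
a = rng.normal(scale=0.1, size=N)   # toy variational parameters

def log_psi(s):
    # Log-amplitude of a simple product ansatz; the paper uses a spiking
    # neuromorphic sampler instead of this stand-in.
    return np.dot(a, s)

def local_energy(s):
    e = -J * np.sum(s[:-1] * s[1:])          # diagonal term, open boundary
    for i in range(N):                        # off-diagonal: -h * psi(s_i flipped)/psi(s)
        s_flip = s.copy()
        s_flip[i] *= -1
        e += -h * np.exp(log_psi(s_flip) - log_psi(s))
    return e

# Metropolis sampling of |psi|^2, then average the local energies.
s = rng.choice([-1.0, 1.0], size=N)
energies = []
for step in range(5000):
    i = rng.integers(N)
    s_new = s.copy()
    s_new[i] *= -1
    if rng.random() < np.exp(2 * (log_psi(s_new) - log_psi(s))):
        s = s_new
    if step > 500:                            # discard burn-in
        energies.append(local_energy(s))
print("variational energy estimate:", np.mean(energies))
```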
5. Müller E, Arnold E, Breitwieser O, Czierlinski M, Emmel A, Kaiser J, Mauch C, Schmitt S, Spilger P, Stock R, Stradmann Y, Weis J, Baumbach A, Billaudelle S, Cramer B, Ebert F, Göltz J, Ilmberger J, Karasenko V, Kleider M, Leibfried A, Pehle C, Schemmel J. A Scalable Approach to Modeling on Accelerated Neuromorphic Hardware. Front Neurosci 2022; 16:884128. PMID: 35663548; PMCID: PMC9157770; DOI: 10.3389/fnins.2022.884128
Abstract
Neuromorphic systems open up opportunities to enlarge the explorative space for computational research. However, it is often challenging to unite efficiency and usability. This work presents the software aspects of this endeavor for the BrainScaleS-2 system, a hybrid accelerated neuromorphic hardware architecture based on physical modeling. We introduce key aspects of the BrainScaleS-2 Operating System: experiment workflow, API layering, software design, and platform operation. We present use cases to discuss and derive requirements for the software and showcase the implementation. The focus lies on novel system and software features such as multi-compartmental neurons, fast re-configuration for hardware-in-the-loop training, applications for the embedded processors, the non-spiking operation mode, interactive platform access, and sustainable hardware/software co-development. Finally, we discuss further developments in terms of hardware scale-up, system usability, and efficiency.
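As context for the experiment workflow and API layering mentioned above, here is a hedged sketch of a PyNN-style experiment of the kind BrainScaleS systems support. The backend module name, cell types, and all parameter values are assumptions for illustration; the actual API should be checked against the BrainScaleS-2 documentation.

```python
# Hedged sketch of a PyNN-style experiment workflow; names are assumptions.
import pynn_brainscales.brainscales2 as pynn  # assumed backend module

pynn.setup()

# A spike source driving a small population of on-chip neurons.
src = pynn.Population(16, pynn.cells.SpikeSourceArray(spike_times=[0.1, 0.2]))
pop = pynn.Population(32, pynn.cells.HXNeuron())

# All-to-all excitatory projection with a fixed synaptic weight.
pynn.Projection(src, pop,
                pynn.AllToAllConnector(),
                synapse_type=pynn.synapses.StaticSynapse(weight=32))

pop.record(["spikes"])
pynn.run(1.0)                     # wall-clock runtime is much shorter on
spikes = pop.get_data("spikes")   # the accelerated analog substrate
pynn.end()
```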
Affiliation(s)
- Eric Müller
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Elias Arnold
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Oliver Breitwieser
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Milena Czierlinski
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Arne Emmel
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Jakob Kaiser
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Christian Mauch
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Sebastian Schmitt
- Third Institute of Physics, University of Göttingen, Göttingen, Germany
- Philipp Spilger
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Raphael Stock
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Yannik Stradmann
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Johannes Weis
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Andreas Baumbach
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Department of Physiology, University of Bern, Bern, Switzerland
- Benjamin Cramer
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Falk Ebert
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Julian Göltz
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Department of Physiology, University of Bern, Bern, Switzerland
- Joscha Ilmberger
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Vitali Karasenko
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Mitja Kleider
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Aron Leibfried
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Christian Pehle
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Johannes Schemmel
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
6. Korcsak-Gorzo A, Müller MG, Baumbach A, Leng L, Breitwieser OJ, van Albada SJ, Senn W, Meier K, Legenstein R, Petrovici MA. Cortical oscillations support sampling-based computations in spiking neural networks. PLoS Comput Biol 2022; 18:e1009753. PMID: 35324886; PMCID: PMC8947809; DOI: 10.1371/journal.pcbi.1009753
Abstract
Being permanently confronted with an uncertain world, brains have faced evolutionary pressure to represent this uncertainty in order to respond appropriately. Often, this requires visiting multiple interpretations of the available information or multiple solutions to an encountered problem. This gives rise to the so-called mixing problem: since all of these "valid" states represent powerful attractors yet can be very dissimilar from one another, switching between them can be difficult. We propose that cortical oscillations can be effectively used to overcome this challenge. By acting as an effective temperature, background spiking activity modulates exploration. Rhythmic changes induced by cortical oscillations can then be interpreted as a form of simulated tempering. We provide a rigorous mathematical discussion of this link and study some of its phenomenological implications in computer simulations. This identifies a new computational role of cortical oscillations and connects them to various phenomena in the brain, such as sampling-based probabilistic inference, memory replay, multisensory cue combination, and place cell flickering.
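The tempering idea can be illustrated independently of the spiking substrate. The sketch below (illustrative, not from the paper) runs a Metropolis sampler over a double-well energy landscape while sinusoidally modulating the inverse temperature; the oscillation lets the chain escape one attractor and visit the other, which is exactly the mixing benefit described above.

```python
# Hedged sketch: an oscillating "effective temperature" helps a sampler mix
# between dissimilar attractor states. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def energy(x):
    # Double-well potential: two "valid" states near x = -2 and x = +2.
    return (x**2 - 4.0)**2 / 8.0

x, xs = -2.0, []
for t in range(20000):
    beta = 1.0 / (1.0 + 0.8 * np.sin(2 * np.pi * t / 1000))  # oscillating 1/T
    x_new = x + rng.normal(scale=0.5)
    if rng.random() < np.exp(-beta * (energy(x_new) - energy(x))):
        x = x_new
    xs.append(x)

# With the oscillation the chain visits both wells; at a fixed low
# temperature it would stay trapped near its starting mode.
print("fraction of time in right well:", np.mean(np.array(xs) > 0))
```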
Affiliation(s)
- Agnes Korcsak-Gorzo
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Michael G. Müller
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Andreas Baumbach
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Department of Physiology, University of Bern, Bern, Switzerland
- Luziwei Leng
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Sacha J. van Albada
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Institute of Zoology, University of Cologne, Cologne, Germany
- Walter Senn
- Department of Physiology, University of Bern, Bern, Switzerland
- Karlheinz Meier
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Mihai A. Petrovici
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Department of Physiology, University of Bern, Bern, Switzerland
7. Bagheriye L, Kwisthout J. Brain-Inspired Hardware Solutions for Inference in Bayesian Networks. Front Neurosci 2021; 15:728086. PMID: 34924925; PMCID: PMC8677599; DOI: 10.3389/fnins.2021.728086
Abstract
The implementation of inference (i.e., computing posterior probabilities) in Bayesian networks using a conventional computing paradigm turns out to be inefficient in terms of energy, time, and space, due to the substantial resources required by floating-point operations. A departure from conventional computing systems that exploits the high parallelism of Bayesian inference has attracted recent attention, particularly in the hardware implementation of Bayesian networks. These efforts have led to several implementations, ranging from digital and mixed-signal circuits to analog circuits that leverage emerging nonvolatile devices. Several stochastic computing architectures using Bayesian stochastic variables have been proposed, from FPGA-like architectures to brain-inspired architectures such as crossbar arrays. This comprehensive review discusses different hardware implementations of Bayesian networks across devices, circuits, and architectures, and offers a forward-looking overview of how to solve the remaining hardware implementation problems.
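To illustrate the contrast the review draws between conventional floating-point inference and the sample-based style that stochastic-computing hardware natively supports, here is a hedged sketch on a toy two-node network; the network structure and all probabilities are illustrative, not from the paper.

```python
# Toy network Rain -> WetGrass: exact posterior via Bayes' rule vs. a
# sample-based estimate of the kind stochastic hardware computes natively.
import numpy as np

p_rain = 0.2
p_wet_given = {True: 0.9, False: 0.1}   # P(WetGrass=1 | Rain)

# Exact (floating point): P(Rain | Wet) = P(Wet|Rain)P(Rain) / P(Wet)
p_wet = p_wet_given[True] * p_rain + p_wet_given[False] * (1 - p_rain)
exact = p_wet_given[True] * p_rain / p_wet

# Stochastic: Bernoulli bitstreams conditioned on the evidence, mimicking
# how stochastic-computing circuits estimate posteriors.
rng = np.random.default_rng(3)
n = 100_000
rain = rng.random(n) < p_rain
wet = rng.random(n) < np.where(rain, p_wet_given[True], p_wet_given[False])
estimate = rain[wet].mean()             # rejection sampling on WetGrass = 1

print(f"exact P(Rain|Wet) = {exact:.4f}, sampled = {estimate:.4f}")
```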
Affiliation(s)
- Leila Bagheriye
- Foundations of Natural and Stochastic Computing, Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands