1. Mittal D, Narayanan R. Network motifs in cellular neurophysiology. Trends Neurosci 2024;47:506-521. PMID: 38806296. DOI: 10.1016/j.tins.2024.04.008.
Abstract
Concepts from network science and graph theory, including the framework of network motifs, have been frequently applied in studying neuronal networks and other biological complex systems. Network-based approaches can also be used to study the functions of individual neurons, where cellular elements such as ion channels and membrane voltage are conceptualized as nodes within a network, and their interactions are denoted by edges. Network motifs in this context provide functional building blocks that help to illuminate the principles of cellular neurophysiology. In this review we build a case that network motifs operating within neurons provide tools for defining the functional architecture of single-neuron physiology and neuronal adaptations. We highlight the presence of such computational motifs in the cellular mechanisms underlying action potential generation, neuronal oscillations, dendritic integration, and neuronal plasticity. Future work applying the network motifs perspective may help to decipher the functional complexities of neurons and their adaptation during health and disease.
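As a toy illustration of this node-and-edge framing (the element names and edge signs below are our own, not taken from the review), interactions between membrane voltage and ion-channel gating can be encoded as a signed directed graph and scanned for two-node feedback motifs:

```python
# Hypothetical illustration: cellular elements (membrane voltage, ion-channel
# gating variables) as nodes, their interactions as signed edges, and a scan
# for two-node feedback motifs. Node names and edge signs are illustrative.

edges = {
    ("V", "m_K"): +1,   # depolarisation opens K+ channel gates
    ("m_K", "V"): -1,   # the K+ current repolarises the membrane
    ("V", "m_Na"): +1,  # depolarisation opens Na+ channel gates
    ("m_Na", "V"): +1,  # the Na+ current depolarises further
}

def two_node_loops(edges):
    """Return each reciprocal node pair with the product of its edge signs:
    -1 marks a negative-feedback loop, +1 a positive-feedback loop."""
    loops = {}
    for (a, b), s1 in edges.items():
        s2 = edges.get((b, a))
        if s2 is not None and (b, a) not in loops:
            loops[(a, b)] = s1 * s2
    return loops

loops = two_node_loops(edges)
```

Here the V/m_K pair is flagged as a negative-feedback loop (the motif underlying repolarisation) and the V/m_Na pair as a positive-feedback loop (regenerative spike onset).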
Affiliation(s)
- Divyansh Mittal
- Centre for Integrative Genomics, Faculty of Biology and Medicine, University of Lausanne, Lausanne, Switzerland
- Rishikesh Narayanan
- Cellular Neurophysiology Laboratory, Molecular Biophysics Unit, Indian Institute of Science, Bangalore, India.
2. Ramaswamy S. Data-driven multiscale computational models of cortical and subcortical regions. Curr Opin Neurobiol 2024;85:102842. PMID: 38320453. DOI: 10.1016/j.conb.2024.102842.
Abstract
Data-driven computational models of neurons, synapses, microcircuits, and mesocircuits have become essential tools in modern brain research. The goal of these multiscale models is to integrate and synthesize information from different levels of brain organization, from cellular properties, dendritic excitability, and synaptic dynamics to microcircuits, mesocircuits, and ultimately behavior. This article surveys recent advances in the genesis of data-driven computational models of mammalian neural networks in cortical and subcortical areas. I discuss the challenges and opportunities in developing data-driven multiscale models, including the need for interdisciplinary collaborations, the importance of model validation and comparison, and the potential impact on basic and translational neuroscience research. Finally, I highlight future directions and emerging technologies that will enable more comprehensive and predictive data-driven models of brain function and dysfunction.
Affiliation(s)
- Srikanth Ramaswamy
- Neural Circuits Laboratory, Biosciences Institute, Newcastle University, Newcastle Upon Tyne, NE2 4HH, United Kingdom.
3. Bast A, Fruengel R, de Kock CPJ, Oberlaender M. Network-neuron interactions underlying sensory responses of layer 5 pyramidal tract neurons in barrel cortex. PLoS Comput Biol 2024;20:e1011468. PMID: 38626210. PMCID: PMC11051592. DOI: 10.1371/journal.pcbi.1011468.
Abstract
Neurons in the cerebral cortex receive thousands of synaptic inputs per second from thousands of presynaptic neurons. How the dendritic location of inputs, their timing, strength, and presynaptic origin, in conjunction with complex dendritic physiology, impact the transformation of synaptic input into action potential (AP) output remains generally unknown for in vivo conditions. Here, we introduce a computational approach to reveal which properties of the input causally underlie AP output, and how this neuronal input-output computation is influenced by the morphology and biophysical properties of the dendrites. We demonstrate that this approach allows dissecting of how different input populations drive in vivo observed APs. For this purpose, we focus on fast and broadly tuned responses that pyramidal tract neurons in layer 5 (L5PTs) of the rat barrel cortex elicit upon passive single whisker deflections. By reducing a multi-scale model that we reported previously, we show that three features are sufficient to predict with high accuracy the sensory responses and receptive fields of L5PTs under these specific in vivo conditions: the count of active excitatory versus inhibitory synapses preceding the response, their spatial distribution on the dendrites, and the AP history. Based on these three features, we derive an analytically tractable description of the input-output computation of L5PTs, which enabled us to dissect how synaptic input from thalamus and different cell types in barrel cortex contribute to these responses. We show that the input-output computation is preserved across L5PTs despite morphological and biophysical diversity of their dendrites. We found that trial-to-trial variability in L5PT responses, and cell-to-cell variability in their receptive fields, are sufficiently explained by variability in synaptic input from the network, whereas variability in biophysical and morphological properties have minor contributions. 
Our approach to derive analytically tractable models of input-output computations in L5PTs provides a roadmap to dissect network-neuron interactions underlying L5PT responses across different in vivo conditions and for other cell types.
Affiliation(s)
- Arco Bast
- In Silico Brain Sciences Group, Max Planck Institute for Neurobiology of Behavior – caesar, Bonn, Germany
- International Max Planck Research School (IMPRS) for Brain and Behavior, Bonn, Germany
- Rieke Fruengel
- In Silico Brain Sciences Group, Max Planck Institute for Neurobiology of Behavior – caesar, Bonn, Germany
- International Max Planck Research School (IMPRS) for Brain and Behavior, Bonn, Germany
- Christiaan P. J. de Kock
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, the Netherlands
- Marcel Oberlaender
- In Silico Brain Sciences Group, Max Planck Institute for Neurobiology of Behavior – caesar, Bonn, Germany
- Department of Integrative Neurophysiology, Center for Neurogenomics and Cognitive Research, Vrije Universiteit Amsterdam, Amsterdam, the Netherlands
4. Gowers RP, Schreiber S. How neuronal morphology impacts the synchronisation state of neuronal networks. PLoS Comput Biol 2024;20:e1011874. PMID: 38437226. PMCID: PMC10939433. DOI: 10.1371/journal.pcbi.1011874.
Abstract
The biophysical properties of neurons not only affect how information is processed within cells, they can also impact the dynamical states of the network. Specifically, the cellular dynamics of action-potential generation have shown relevance for setting the (de)synchronisation state of the network. The dynamics of tonically spiking neurons typically fall into one of three qualitatively distinct types that arise from distinct mathematical bifurcations of voltage dynamics at the onset of spiking. Accordingly, changes in ion channel composition or even external factors, like temperature, have been demonstrated to switch network behaviour via changes in the spike onset bifurcation and hence its associated dynamical type. A thus far less addressed modulator of neuronal dynamics is cellular morphology. Based on simplified and anatomically realistic mathematical neuron models, we show here that the extent of dendritic arborisation has an influence on the neuronal dynamical spiking type and therefore on the (de)synchronisation state of the network. Specifically, larger dendritic trees prime neuronal dynamics for in-phase-synchronised or splayed-out activity in weakly coupled networks, in contrast to cells with otherwise identical properties yet smaller dendrites. Our biophysical insights hold for generic multicompartmental classes of spiking neuron models (from ball-and-stick-type to anatomically reconstructed models) and establish a connection between neuronal morphology and the susceptibility of neural tissue to synchronisation in health and disease.
Affiliation(s)
- Robert P Gowers
- Institute for Theoretical Biology, Humboldt-University of Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Susanne Schreiber
- Institute for Theoretical Biology, Humboldt-University of Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
5. O'Donnell C. Nonlinear slow-timescale mechanisms in synaptic plasticity. Curr Opin Neurobiol 2023;82:102778. PMID: 37657186. DOI: 10.1016/j.conb.2023.102778.
Abstract
Learning and memory rely on synapses changing their strengths in response to neural activity. However, there is a substantial gap between the timescales of neural electrical dynamics (1-100 ms) and organism behaviour during learning (seconds-minutes). What mechanisms bridge this timescale gap? What are the implications for theories of brain learning? Here I first cover experimental evidence for slow-timescale factors in plasticity induction. Then I review possible underlying cellular and synaptic mechanisms, and insights from recent computational models that incorporate such slow-timescale variables. I conclude that future progress in understanding brain learning across timescales will require both experimental and computational modelling studies that map out the nonlinearities implemented by both fast and slow plasticity mechanisms at synapses, and crucially, their joint interactions.
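A generic two-trace sketch makes the timescale-gap argument concrete (this is not a specific model from the review; all numbers are illustrative): a weight update that requires the coincidence of an activity trace and a behavioural signal arriving 30 s later succeeds only if the trace decays on a slow timescale.

```python
# Two eligibility traces driven by the same brief pre/post pairing, one with
# a fast (seconds) and one with a slow (minutes) decay constant. A delayed
# "behavioural" signal 30 s later gates the weight update, so only the slow
# trace can bridge the gap between electrical and behavioural timescales.

dt = 0.01                         # s
tau_fast, tau_slow = 1.0, 60.0    # trace time constants (s)
eta = 1.0                         # learning rate

e_fast = e_slow = 0.0
w_fast = w_slow = 0.0
for step in range(int(60 / dt)):
    t = step * dt
    pairing = 1.0 if t < 0.1 else 0.0            # brief pre/post activity
    signal = 1.0 if 30.0 <= t < 30.1 else 0.0    # delayed behavioural signal
    e_fast += dt * (-e_fast / tau_fast + pairing)
    e_slow += dt * (-e_slow / tau_slow + pairing)
    w_fast += dt * eta * e_fast * signal
    w_slow += dt * eta * e_slow * signal

# Only the slow trace still "remembers" the pairing when the delayed signal
# arrives, so w_slow ends up orders of magnitude larger than w_fast.
```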
Affiliation(s)
- Cian O'Donnell
- School of Computing, Engineering, and Intelligent Systems, Magee Campus, Ulster University, Derry/Londonderry, UK; School of Computer Science, Electrical and Electronic Engineering, and Engineering Maths, University of Bristol, Bristol, UK.
6. Pagkalos M, Chavlis S, Poirazi P. Introducing the Dendrify framework for incorporating dendrites to spiking neural networks. Nat Commun 2023;14:131. PMID: 36627284. PMCID: PMC9832130. DOI: 10.1038/s41467-022-35747-8.
Abstract
Computational modeling has been indispensable for understanding how subcellular neuronal features influence circuit processing. However, the role of dendritic computations in network-level operations remains largely unexplored. This is partly because existing tools do not allow the development of realistic and efficient network models that account for dendrites. Current spiking neural networks, although efficient, are usually quite simplistic, overlooking essential dendritic properties. Conversely, circuit models with morphologically detailed neuron models are computationally costly, thus impractical for large-network simulations. To bridge the gap between these two extremes and facilitate the adoption of dendritic features in spiking neural networks, we introduce Dendrify, an open-source Python package based on Brian 2. Dendrify, through simple commands, automatically generates reduced compartmental neuron models with simplified yet biologically relevant dendritic and synaptic integrative properties. Such models strike a good balance between flexibility, performance, and biological accuracy, allowing us to explore dendritic contributions to network-level functions while paving the way for developing more powerful neuromorphic systems.
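To make the "reduced compartmental model" idea concrete, here is a minimal two-compartment sketch in plain Python. This is not Dendrify's API (Dendrify generates Brian 2 models); the compartment coupling, the sigmoidal NMDA-like gate, and all parameter values are illustrative.

```python
import numpy as np

# Soma and dendrite as two leaky compartments coupled by an axial
# conductance, with a crude voltage-dependent (NMDA-like) boost on the
# dendritic input current. Forward-Euler integration; arbitrary units.

dt, T = 0.1, 200.0                       # ms
n = int(T / dt)
g_leak, g_axial, E_leak = 0.05, 0.02, -70.0

v_soma = np.full(n, E_leak)
v_dend = np.full(n, E_leak)
for i in range(1, n):
    t = i * dt
    drive = 2.0 if 50.0 <= t < 150.0 else 0.0      # dendritic input pulse
    # Sigmoidal voltage dependence, a stand-in for regenerative NMDA events.
    nmda_gate = 1.0 / (1.0 + np.exp(-(v_dend[i - 1] + 40.0) / 5.0))
    i_dend = drive * (0.3 + 0.7 * nmda_gate)
    dv_d = (g_leak * (E_leak - v_dend[i - 1])
            + g_axial * (v_soma[i - 1] - v_dend[i - 1]) + i_dend)
    dv_s = (g_leak * (E_leak - v_soma[i - 1])
            + g_axial * (v_dend[i - 1] - v_soma[i - 1]))
    v_dend[i] = v_dend[i - 1] + dt * dv_d
    v_soma[i] = v_soma[i - 1] + dt * dv_s

# The soma depolarises only via the axial coupling, i.e. the dendrite
# attenuates and low-pass filters its input before it reaches the soma.
```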
Affiliation(s)
- Michalis Pagkalos
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
- Department of Biology, University of Crete, Heraklion, 70013, Greece
- Spyridon Chavlis
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece
- Panayiota Poirazi
- Institute of Molecular Biology and Biotechnology (IMBB), Foundation for Research and Technology Hellas (FORTH), Heraklion, 70013, Greece.
7. Deng Y, Liu B, Huang Z, Liu X, He S, Li Q, Guo D. Fractional spiking neuron: fractional leaky integrate-and-fire circuit described with dendritic fractal model. IEEE Trans Biomed Circuits Syst 2022;16:1375-1386. PMID: 36315548. DOI: 10.1109/TBCAS.2022.3218294.
Abstract
As dendrites are essential parts of neurons, they are crucial in enabling neuronal activity to follow multiple-timescale dynamics, which ultimately affect information processing and cognition. However, in common spiking neural networks (SNNs), hardware-based leaky integrate-and-fire (LIF) circuits simulate only the single-timescale dynamics of the soma, without accounting for dendritic morphology, which may limit their capacity to simulate how neurons process information. This study proposes a dendritic fractal model for quantifying the morphological effects of dendritic branching and length. To realize this model, we design multiple analog fractional-order circuits (AFCs) whose extended structures and parameters match the dendritic features. Introducing the AFCs into fractional leaky integrate-and-fire (FLIF) neuron circuits reproduces the multiple-timescale spiking dynamics of biological neurons, including spiking adaptation, inter-spike-interval variability with a power-law distribution, first-spike latency, and intrinsic memory. The approach thus increases the fidelity of neuron models and provides a more accurate basis for understanding neural computation and cognition.
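A numerical sketch of the FLIF idea, using a Grünwald-Letnikov discretisation of the fractional derivative (all parameter values are illustrative and unrelated to the paper's circuit): the binomial weights give the voltage a power-law memory of its own history, and setting alpha = 1 recovers the ordinary, memoryless LIF neuron.

```python
# Fractional leaky integrate-and-fire neuron via the Grunwald-Letnikov
# scheme. For alpha < 1 the weighted sum over the voltage history acts as a
# power-law memory kernel; for alpha = 1 only the previous step contributes.

def flif_spike_times(alpha, T=200.0, dt=0.1, I=1.5,
                     v_rest=0.0, v_th=1.0, v_reset=0.0, tau=10.0):
    n_steps = int(T / dt)
    # Grunwald-Letnikov weights c_k = (-1)^k * binom(alpha, k), via the
    # recurrence c_k = c_{k-1} * (1 - (alpha + 1) / k), with c_0 = 1.
    c = [1.0]
    for k in range(1, n_steps + 1):
        c.append(c[-1] * (1 - (alpha + 1) / k))
    v_hist = [v_rest]
    spikes = []
    for n in range(n_steps):
        # Memory term: weighted sum over the entire voltage history.
        memory = sum(c[k] * v_hist[-k] for k in range(1, len(v_hist) + 1))
        v = dt**alpha * (-(v_hist[-1] - v_rest) + I) / tau - memory
        if v >= v_th:
            spikes.append(n * dt)
            v = v_reset
        v_hist.append(v)
    return spikes

spikes_lif = flif_spike_times(alpha=1.0)   # regular, history-free firing
spikes_flif = flif_spike_times(alpha=0.8)  # history-dependent spike timing
```

For alpha = 1 all weights beyond c_1 vanish and the inter-spike intervals are perfectly regular; the fractional case retains the full voltage history at every step, which is what the analog fractional-order circuits implement in hardware.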
8. Virtual intelligence: a systematic review of the development of neural networks in brain simulation units. Brain Sci 2022;12:1552. DOI: 10.3390/brainsci12111552.
Abstract
The functioning of the brain has long been a complex and enigmatic phenomenon. From Descartes' early view of the brain as the vehicle of the mind to contemporary studies that treat it as an organ with emergent activities of primary and higher order, the brain has been the object of continuous exploration. Deeper study of brain function has been made possible by imaging techniques, by digital platforms and simulators implemented in different programming languages, and by the use of multiple processors to emulate the speed at which synaptic processes execute in the brain. The use of various computational architectures raises many questions about the possible scope of disciplines such as computational neuroscience in the study of the brain, and about the depth of knowledge attainable with the support of information technology (IT). One of the main interests of cognitive science is the possibility of realizing human-like intelligence in an artificial system or mechanism. This paper surveys the principal articles in three databases oriented to the computational sciences (EBSCOhost Web, IEEE Xplore, and Compendex Engineering Village) to understand the current objectives of neural-network research in studying the brain, including the development of artificial intelligence (AI) systems that can replicate more complex human brain tasks (such as those involving consciousness). The results show the principal research findings and topics in studies of neural networks in computational neuroscience. One principal development is the use of neural networks as the basis of computational architectures that employ techniques such as neuromorphic chips, MRI images, and brain–computer interfaces (BCIs) to enhance the capacity to simulate brain activity. This article reviews and analyzes studies on the development of computational architectures that address various brain activities through neural networks, aiming to determine the orientation and main lines of research on this topic and to identify routes that allow interdisciplinary collaboration.
9. Oláh VJ, Pedersen NP, Rowan MJM. Ultrafast simulation of large-scale neocortical microcircuitry with biophysically realistic neurons. eLife 2022;11:e79535. PMID: 36341568. PMCID: PMC9640191. DOI: 10.7554/eLife.79535.
Abstract
Understanding the activity of the mammalian brain requires an integrative knowledge of circuits at distinct scales, ranging from ion channel gating to circuit connectomics. Computational models are regularly employed to understand how multiple parameters contribute synergistically to circuit behavior. However, traditional models of anatomically and biophysically realistic neurons are computationally demanding, especially when scaled to model local circuits. To overcome this limitation, we trained several artificial neural network (ANN) architectures to model the activity of realistic multicompartmental cortical neurons. We identified an ANN architecture that accurately predicted subthreshold activity and action potential firing. The ANN could correctly generalize to previously unobserved synaptic input, including in models containing nonlinear dendritic properties. When scaled, processing times were orders of magnitude faster compared with traditional approaches, allowing for rapid parameter-space mapping in a circuit model of Rett syndrome. Thus, we present a novel ANN approach allowing for rapid, detailed network experiments using inexpensive and commonly available computational resources.
Affiliation(s)
- Viktor J Oláh
- Department of Cell Biology, Emory University School of Medicine, Atlanta, United States
- Nigel P Pedersen
- Department of Neurology, Emory University School of Medicine, Atlanta, United States
- Matthew JM Rowan
- Department of Cell Biology, Emory University School of Medicine, Atlanta, United States
10. Hagen E, Magnusson SH, Ness TV, Halnes G, Babu PN, Linssen C, Morrison A, Einevoll GT. Brain signal predictions from multi-scale networks using a linearized framework. PLoS Comput Biol 2022;18:e1010353. PMID: 35960767. PMCID: PMC9401172. DOI: 10.1371/journal.pcbi.1010353.
Abstract
Simulations of neural activity at different levels of detail are ubiquitous in modern neuroscience, aiding the interpretation of experimental data and of the underlying neural mechanisms at the level of cells and circuits. Extracellular measurements of brain signals reflecting transmembrane currents throughout the neural tissue remain commonplace. The lower frequencies (≲300 Hz) of measured signals generally stem from synaptic activity driven by recurrent interactions among neural populations, and computational models should incorporate accurate predictions of such signals. Due to limited computational resources, large-scale neuronal network models (≳10⁶ neurons or so) often require reducing the level of biophysical detail, accounting mainly for the times of action potentials ('spikes') or for spike rates; corresponding extracellular signal predictions have thus poorly accounted for their biophysical origin. Here we propose a computational framework for predicting spatiotemporal filter kernels for such extracellular signals stemming from synaptic activity, accounting for the biophysics of neurons, populations, and recurrent connections. Signals are obtained by convolving population spike rates with appropriate kernels for each connection pathway and summing the contributions. Our main result is that kernels derived via linearized synapse and membrane dynamics, distributions of cells, conduction delays, and a volume conductor model accurately capture the spatiotemporal dynamics of ground-truth extracellular signals from conductance-based multicompartment neuron networks. One particular observation is that changes in the effective membrane time constants caused by persistent synapse activation must be accounted for. The work also constitutes a major advance in the computational efficiency of accurate, biophysics-based signal predictions from large-scale spike- and rate-based neuron network models, drastically reducing signal prediction times compared to biophysically detailed network models. It further provides insight into how experimentally recorded low-frequency extracellular signals of neuronal activity may be approximately linearly dependent on spiking activity. A new software tool, LFPykernels, serves as a reference implementation of the framework.

Understanding the brain's function and activity in healthy and pathological states, across spatial scales and over times spanning entire lives, is one of humanity's great undertakings. In experimental and clinical work probing the brain's activity, a variety of electric and magnetic measurement techniques are routinely applied. However, interpreting the extracellularly measured signals remains arduous due to multiple factors, mainly the large number of neurons contributing to the signals and the complex interactions occurring in recurrently connected neuronal circuits. To understand how neurons give rise to such signals, mechanistic modeling combined with forward models derived from volume conductor theory has proven successful, but this approach currently does not scale to the systems level (encompassing millions of neurons or more), where simplified or abstract neuron representations are typically used. Motivated by experimental findings implying approximately linear relationships between the times of neuronal action potentials and extracellular population signals, we provide a biophysics-based method for computing causal filters relating spikes to extracellular signals, which can be applied with spike times or rates from large-scale neuronal network models to predict population signals without relying on ad hoc approximations.
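The core convolution step can be paraphrased in a few lines (a toy sketch, not the LFPykernels implementation; the kernel shapes, amplitudes, and rates below are arbitrary): convolve each population's spike rate with a causal pathway kernel and sum over pathways.

```python
import numpy as np

# Toy kernel-based signal prediction: one causal exponential kernel per
# presynaptic pathway, convolved with that population's spike rate, with
# contributions summed across pathways. All shapes/values are illustrative.

dt = 1.0                        # ms per rate bin
t = np.arange(0, 50, dt)        # kernel support (causal: lags >= 0 only)
rng = np.random.default_rng(0)

rates = {                       # spike rates of two presynaptic populations
    "exc": rng.poisson(5.0, size=500).astype(float),
    "inh": rng.poisson(3.0, size=500).astype(float),
}
kernels = {                     # causal pathway kernels (amplitude * decay)
    "exc": -0.2 * np.exp(-t / 5.0),
    "inh": 0.1 * np.exp(-t / 10.0),
}

signal = sum(
    np.convolve(rates[p], kernels[p], mode="full")[: len(rates[p])]
    for p in rates
)
# `signal` has one sample per rate bin, and each sample depends only on
# past and present rates because the kernels vanish at negative lags.
```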
Affiliation(s)
- Espen Hagen
- Department of Data Science, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Steinn H. Magnusson
- Department of Physics, Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Torbjørn V. Ness
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Geir Halnes
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Pooja N. Babu
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Charl Linssen
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Institute of Neuroscience and Medicine (INM-6); Computational and Systems Neuroscience & Institute for Advanced Simulation (IAS-6); Theoretical Neuroscience & JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre and JARA, Jülich, Germany
- Abigail Morrison
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Institute of Neuroscience and Medicine (INM-6); Computational and Systems Neuroscience & Institute for Advanced Simulation (IAS-6); Theoretical Neuroscience & JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre and JARA, Jülich, Germany
- Software Engineering, Department of Computer Science 3, RWTH Aachen University, Aachen, Germany
- Gaute T. Einevoll
- Department of Physics, Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
11. D'Angelo E, Jirsa V. The quest for multiscale brain modeling. Trends Neurosci 2022;45:777-790. PMID: 35906100. DOI: 10.1016/j.tins.2022.06.007.
Abstract
Addressing the multiscale organization of the brain, which is fundamental to the organ's dynamic repertoire, remains challenging. In principle, it should be possible to model neurons and synapses in detail and then connect them into large neuronal assemblies to explain the relationship between microscopic phenomena, large-scale brain functions, and behavior. It is more difficult to infer neuronal functions from ensemble measurements such as those currently obtained with brain activity recordings. In this article we consider theories and strategies for combining bottom-up models, generated from principles of neuronal biophysics, with top-down models based on ensemble representations of network activity and on functional principles. It is hoped that such integrative approaches will provide effective multiscale simulations in virtual brains and neurorobots, and pave the way to future applications in medicine and information technology.
Affiliation(s)
- Egidio D'Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, and Brain Connectivity Center, Istituto di Ricovero e Cura a Carattere Scientifico (IRCCS) Mondino Foundation, Pavia, Italy.
- Viktor Jirsa
- Institut National de la Santé et de la Recherche Médicale (INSERM) Unité 1106, Centre National de la Recherche Scientifique (CNRS), and University of Aix-Marseille, Marseille, France
12. A modular workflow for model building, analysis, and parameter estimation in systems biology and neuroscience. Neuroinformatics 2022;20:241-259. PMID: 34709562. PMCID: PMC9537196. DOI: 10.1007/s12021-021-09546-3.
Abstract
Neuroscience incorporates knowledge from a range of scales, from single molecules to brain-wide neural networks. Modeling is a valuable tool for understanding processes at a single scale, or the interactions between two adjacent scales, and researchers use a variety of software tools in the model building and analysis process. Here we focus on the scale of biochemical pathways, one of the main objects of study in systems biology. While systems biology is among the more standardized fields, conversion between different model formats and interoperability between various tools remain somewhat problematic. To address these shortcomings, and keeping in mind the FAIR (findability, accessibility, interoperability, reusability) data principles, we have developed a workflow for building and analyzing biochemical pathway models, using pre-existing tools for the storage and refinement of models in all phases of development. We chose the SBtab format, which allows the storage of biochemical models and associated data in a single file and provides a human-readable set of syntax rules. Next, we implemented custom MATLAB® scripts to perform the parameter estimation and global sensitivity analysis used in model refinement. Additionally, we developed a web-based application for biochemical models that allows simulations with either a network-free solver or stochastic solvers, and that can incorporate geometry. Finally, we illustrate the convertibility and use of a biochemical model in a biophysically detailed single-neuron model by running multiscale simulations in NEURON. Using this workflow, we can simulate the same model in three different simulators, with smooth conversion between the different model formats, enhancing the characterization of different aspects of the model.
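As a stand-in for the parameter-estimation step of such a workflow (this is not the authors' MATLAB scripts; the model and all numbers are invented for illustration), fitting a single first-order rate constant to data by grid search over a squared-error objective looks like:

```python
import math

# Fit the rate constant k of a first-order reaction A -> B, where
# A(t) = A0 * exp(-k * t), to noisy synthetic "measurements" by minimising
# the sum of squared errors over a grid of candidate values.

A0, k_true = 10.0, 0.35
times = [0.5 * i for i in range(20)]
# Synthetic data: the true decay plus a small deterministic perturbation.
data = [A0 * math.exp(-k_true * t) + 0.05 * math.sin(7 * t) for t in times]

def sse(k):
    """Sum of squared errors between model and data for rate constant k."""
    return sum((A0 * math.exp(-k * t) - y) ** 2 for t, y in zip(times, data))

grid = [0.01 * i for i in range(1, 101)]      # candidate k in (0, 1]
k_hat = min(grid, key=sse)
```

Real workflows replace the grid search with gradient-based or global optimisers and add sensitivity analysis, but the objective-function structure is the same.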
13. Beniaguev D, Segev I, London M. Single cortical neurons as deep artificial neural networks. Neuron 2021;109:2727-2739.e3. PMID: 34380016. DOI: 10.1016/j.neuron.2021.07.002.
Abstract
Utilizing recent advances in machine learning, we introduce a systematic approach to characterize neurons' input/output (I/O) mapping complexity. Deep neural networks (DNNs) were trained to faithfully replicate the I/O function of various biophysical models of cortical neurons at millisecond (spiking) resolution. A temporally convolutional DNN with five to eight layers was required to capture the I/O mapping of a realistic model of a layer 5 cortical pyramidal cell (L5PC). This DNN generalized well when presented with inputs widely outside the training distribution. When NMDA receptors were removed, a much simpler network (fully connected neural network with one hidden layer) was sufficient to fit the model. Analysis of the DNNs' weight matrices revealed that synaptic integration in dendritic branches could be conceptualized as pattern matching from a set of spatiotemporal templates. This study provides a unified characterization of the computational complexity of single neurons and suggests that cortical networks therefore have a unique architecture, potentially supporting their computational power.
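The key architectural ingredient, causal temporal convolution over many input spike trains, can be sketched as follows (an untrained, random-filter stand-in for the fitted DNN; the layer sizes and filter values are arbitrary, not those of the study):

```python
import numpy as np

# One causal temporally convolutional layer: each presynaptic spike train is
# filtered with its own 1-D kernel, the filtered drives are summed, and a
# pointwise nonlinearity is applied. Padding the past with zeros ensures the
# output at time t depends only on spikes at times <= t.

rng = np.random.default_rng(1)
n_syn, T, k = 32, 200, 20           # synapses, time steps, kernel length

spikes = (rng.random((n_syn, T)) < 0.05).astype(float)
filters = rng.normal(0.0, 0.1, size=(n_syn, k))   # one causal filter/synapse

padded = np.pad(spikes, ((0, 0), (k - 1, 0)))     # left-pad the past
drive = np.zeros(T)
for s in range(n_syn):
    # Reversing the filter turns np.convolve into causal cross-correlation.
    drive += np.convolve(padded[s], filters[s][::-1], mode="valid")

output = np.tanh(drive)             # pointwise nonlinearity
```

Stacking several such layers (with trained rather than random filters) yields the five-to-eight-layer temporally convolutional networks the study fits to the biophysical model's input/output mapping.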
Collapse
Affiliation(s)
- David Beniaguev
- Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91904, Israel.
| | - Idan Segev
- Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91904, Israel; Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
| | - Michael London
- Edmond and Lily Safra Center for Brain Sciences (ELSC), The Hebrew University of Jerusalem, Jerusalem 91904, Israel; Department of Neurobiology, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
| |
Collapse
|
14
|
Liu W, Liu Q, Crozier RA, Davis RL. Analog Transmission of Action Potential Fine Structure in Spiral Ganglion Axons. J Neurophysiol 2021; 126:888-905. [PMID: 34346782 DOI: 10.1152/jn.00237.2021] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Action potential waveforms generated at the axon initial segment (AIS) are specialized between and within neuronal classes. But is the fine structure of each electrical event retained when transmitted along myelinated axons, or is it rapidly and uniformly transmitted, to be modified again at the axon terminal? To address this issue, action potential axonal transmission was evaluated in a class of primary sensory afferents that possess numerous types of voltage-gated ion channels underlying a complex repertoire of endogenous firing patterns. In addition to their signature intrinsic electrophysiological heterogeneity, spiral ganglion neurons are uniquely constructed: the bipolar, myelinated somata of type I neurons lie within the conduction pathway, requiring that action potentials generated at the first heminode be conducted through their electrically excitable membrane. We utilized this unusual axon-like morphology as a window into action potential transmission, comparing locally evoked action potential profiles to those generated peripherally at their glutamatergic synaptic connections with hair cell receptors. These comparisons showed that the distinctively shaped somatic action potentials were highly correlated with the nodally generated, invading ones for each neuron. This result indicates that the fine structure of the action potential waveform is maintained axonally, supporting the concept that analog signaling is incorporated into each digitally transmitted action potential in these specialized primary auditory afferents.
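The waveform comparison at the heart of this study can be sketched numerically. The traces below are stylized stand-ins (not the authors' data): a nodal action potential is built from two Gaussians, a "somatic" copy is mildly smoothed to stand in for passage through the soma, and the Pearson correlation between the two quantifies how well the fine structure is preserved:

```python
import numpy as np

t = np.linspace(0, 5, 500)                 # time in ms (hypothetical trace)
# Stylized action potential: fast depolarization plus slower repolarization
ap_node = (np.exp(-((t - 1.0) / 0.15) ** 2)
           - 0.3 * np.exp(-((t - 1.4) / 0.5) ** 2))
# "Somatic" copy: mild low-pass filtering stands in for somatic conduction
kernel = np.ones(5) / 5
ap_soma = np.convolve(ap_node, kernel, mode="same")

r = np.corrcoef(ap_node, ap_soma)[0, 1]    # waveform correlation
print(f"r = {r:.3f}")                      # near 1: fine structure preserved
```

A low correlation under this metric would instead indicate that the waveform is reshaped in transit, which is the alternative the study rules out.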
Collapse
Affiliation(s)
- Wenke Liu
- Department of Cell Biology and Neuroscience, Rutgers, The State University of New Jersey, Piscataway, NJ, United States.,Institute for System Genetics, New York University School of Medicine, New York, NY, United States
| | - Qing Liu
- Department of Cell Biology and Neuroscience, Rutgers, The State University of New Jersey, Piscataway, NJ, United States.,Inscopix, Inc., Palo Alto, California, United States
| | - Robert A Crozier
- Department of Cell Biology and Neuroscience, Rutgers, The State University of New Jersey, Piscataway, NJ, United States.,Synergy Pharmaceuticals Inc., New York, NY, United States
| | - Robin L Davis
- Department of Cell Biology and Neuroscience, Rutgers, The State University of New Jersey, Piscataway, NJ, United States
| |
Collapse
|
15
|
A convolutional neural-network framework for modelling auditory sensory cells and synapses. Commun Biol 2021; 4:827. [PMID: 34211095 PMCID: PMC8249591 DOI: 10.1038/s42003-021-02341-5] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2020] [Accepted: 06/09/2021] [Indexed: 12/02/2022] Open
Abstract
In classical computational neuroscience, analytical model descriptions are derived from neuronal recordings to mimic the underlying biological system. These neuronal models are typically slow to compute and cannot be integrated within large-scale neuronal simulation frameworks. We present a hybrid, machine-learning and computational-neuroscience approach that transforms analytical models of sensory neurons and synapses into deep-neural-network (DNN) neuronal units with the same biophysical properties. Our DNN-model architecture comprises parallel and differentiable equations that can be used for backpropagation in neuro-engineering applications, and offers simulation run-time improvement factors of 70 and 280 on CPU and GPU systems, respectively. We focused our development on auditory neurons and synapses, and show that our DNN-model architecture can be extended to a variety of existing analytical models. We describe how our approach for auditory models can be applied to other neuron and synapse types to help accelerate the development of large-scale brain networks and DNN-based treatments of the pathological system. Drakopoulos et al. developed a machine-learning and computational-neuroscience approach that transforms analytical models of sensory neurons and synapses into deep-neural-network (DNN) neuronal units with the same biophysical properties. Focusing on auditory neurons and synapses, they showed that their DNN-model architecture could be extended to a variety of existing analytical models and to other neuron and synapse types, thus potentially assisting the development of large-scale brain networks and DNN-based treatments.
Collapse
|
16
|
Baby D, Van Den Broucke A, Verhulst S. A convolutional neural-network model of human cochlear mechanics and filter tuning for real-time applications. NAT MACH INTELL 2021; 3:134-143. [PMID: 33629031 PMCID: PMC7116797 DOI: 10.1038/s42256-020-00286-8] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
Abstract
Auditory models are commonly used as feature extractors for automatic speech-recognition systems or as front-ends for robotics, machine-hearing and hearing-aid applications. Although auditory models can capture the biophysical and nonlinear properties of human hearing in great detail, these biophysical models are computationally expensive and cannot be used in real-time applications. We present a hybrid approach where convolutional neural networks are combined with computational neuroscience to yield a real-time end-to-end model for human cochlear mechanics, including level-dependent filter tuning (CoNNear). The CoNNear model was trained on acoustic speech material and its performance and applicability were evaluated using (unseen) sound stimuli commonly employed in cochlear mechanics research. The CoNNear model accurately simulates human cochlear frequency selectivity and its dependence on sound intensity, an essential quality for robust speech intelligibility at negative speech-to-background-noise ratios. The CoNNear architecture is based on parallel and differentiable computations and has the power to achieve real-time human performance. These unique CoNNear features will enable the next generation of human-like machine-hearing applications.
Collapse
Affiliation(s)
- Deepak Baby
- Hearing Technology @ WAVES, Dept. of Information Technology, Ghent University, 9000 Ghent, Belgium
| | - Arthur Van Den Broucke
- Hearing Technology @ WAVES, Dept. of Information Technology, Ghent University, 9000 Ghent, Belgium
| | - Sarah Verhulst
- Hearing Technology @ WAVES, Dept. of Information Technology, Ghent University, 9000 Ghent, Belgium
| |
Collapse
|
17
|
Wybo WA, Jordan J, Ellenberger B, Marti Mengual U, Nevian T, Senn W. Data-driven reduction of dendritic morphologies with preserved dendro-somatic responses. eLife 2021; 10:e60936. [PMID: 33494860 PMCID: PMC7837682 DOI: 10.7554/elife.60936] [Citation(s) in RCA: 13] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2020] [Accepted: 01/04/2021] [Indexed: 11/13/2022] Open
Abstract
Dendrites shape information flow in neurons. Yet, there is little consensus on the level of spatial complexity at which they operate. Through carefully chosen parameter fits, solvable in the least-squares sense, we obtain accurate reduced compartmental models at any level of complexity. We show that (back-propagating) action potentials, Ca2+ spikes, and N-methyl-D-aspartate spikes can all be reproduced with few compartments. We also investigate whether afferent spatial connectivity motifs admit simplification by ablating targeted branches and grouping affected synapses onto the next proximal dendrite. We find that voltage in the remaining branches is reproduced if temporal conductance fluctuations stay below a limit that depends on the average difference in input resistance between the ablated branches and the next proximal dendrite. Furthermore, our methodology fits reduced models directly from experimental data, without requiring morphological reconstructions. We provide software that automates the simplification, eliminating a common hurdle toward including dendritic computations in network models.
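The core idea of least-squares model reduction can be sketched in its simplest form: reducing a recorded response to a single passive compartment. This Python example is a hypothetical illustration, not the authors' software; it generates a ground-truth RC trace and then recovers the capacitance and leak conductance linearly, since C·dV/dt + g·V = I is linear in the parameters:

```python
import numpy as np

dt = 0.1                                          # ms
t = np.arange(0, 100, dt)
i_inj = np.where((t > 20) & (t < 70), 0.1, 0.0)   # nA step current

# Ground-truth single compartment: C dV/dt = -g*V + I (forward Euler)
C_true, g_true = 0.2, 0.01                        # nF, uS (illustrative)
v = np.zeros_like(t)
for k in range(1, len(t)):
    v[k] = v[k - 1] + dt * (-g_true * v[k - 1] + i_inj[k - 1]) / C_true

# Least-squares recovery of (C, g) from the "recorded" traces
dvdt = np.gradient(v, dt)
A = np.column_stack([dvdt, v])
params, *_ = np.linalg.lstsq(A, i_inj, rcond=None)
print(params)                                     # close to [0.2, 0.01]
```

The full method generalizes this fit to multi-compartment models with active conductances, choosing the parameters that best reproduce the dendro-somatic responses of the detailed morphology.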
Collapse
Affiliation(s)
- Willem Am Wybo
- Department of Physiology, University of Bern, Bern, Switzerland
| | - Jakob Jordan
- Department of Physiology, University of Bern, Bern, Switzerland
| | | | | | - Thomas Nevian
- Department of Physiology, University of Bern, Bern, Switzerland
| | - Walter Senn
- Department of Physiology, University of Bern, Bern, Switzerland
| |
Collapse
|
18
|
Systematic Integration of Structural and Functional Data into Multi-scale Models of Mouse Primary Visual Cortex. Neuron 2020; 106:388-403.e18. [DOI: 10.1016/j.neuron.2020.01.040] [Citation(s) in RCA: 90] [Impact Index Per Article: 22.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/25/2019] [Revised: 10/17/2019] [Accepted: 01/27/2020] [Indexed: 01/08/2023]
|