1
Fukai T. Computational models of idling brain activity for memory processing. Neurosci Res 2022; 189:75-82. [PMID: 36592825 DOI: 10.1016/j.neures.2022.12.024]
Abstract
Understanding the neural mechanisms underlying the brain's cognitive functions is one of the central questions in modern biology. Moreover, it has significantly impacted the development of novel technologies in artificial intelligence. Spontaneous activity is a unique feature of the brain and is currently lacking in many artificially constructed intelligent machines. Spontaneous activity may represent the brain's idling states, which are internally driven by neuronal networks and possibly participate in offline processing during awake, sleep, and resting states. Evidence is accumulating that the brain's spontaneous activity is not mere noise but part of the mechanisms to process information about previous experiences. A large body of literature, spanning various methods and animal models, has shown how previous sensory and behavioral experiences influence subsequent patterns of brain activity. It seems, however, that the patterns of neural activity and their computational roles differ significantly from area to area and from function to function. In this article, I review the various forms of the brain's spontaneous activity, especially those observed during memory processing, and some attempts to model the generation mechanisms and computational roles of such activities.
Affiliation(s)
- Tomoki Fukai
- Okinawa Institute of Science and Technology, Tancha 1919-1, Onna-son, Okinawa 904-0495, Japan.
2
Jaiton V, Rothomphiwat K, Ebeid E, Manoonpong P. Neural Control and Online Learning for Speed Adaptation of Unmanned Aerial Vehicles. Front Neural Circuits 2022; 16:839361. [PMID: 35547643 PMCID: PMC9082606 DOI: 10.3389/fncir.2022.839361]
Abstract
Unmanned aerial vehicles (UAVs) are involved in critical tasks such as inspection and exploration. Thus, they have to perform several intelligent functions. Various control approaches have been proposed to implement these functions. Most classical UAV control approaches, such as model predictive control, require a dynamic model to determine the optimal control parameters. Other control approaches use machine learning techniques that require multiple learning trials to obtain the proper control parameters. All these approaches are computationally expensive. Our goal is to develop an efficient control system for UAVs that does not require a dynamic model and allows them to learn control parameters online with only a few trials and inexpensive computations. To achieve this, we developed a neural control method with fast online learning. Neural control is based on a three-neuron network, whereas the online learning algorithm is derived from a neural correlation-based learning principle with predictive and reflexive sensory information. This neural control technique is used here for the speed adaptation of the UAV. The control technique relies on a simple input signal from a compact optical distance measurement sensor that can be converted into predictive and reflexive sensory information for the learning algorithm. Such speed adaptation is a fundamental function that can be used as part of other complex control functions, such as obstacle avoidance. The proposed technique was implemented on a real UAV system. Consequently, the UAV can quickly learn within 3–4 trials to proactively adapt its flying speed to brake at a safe distance from the obstacle or target in the horizontal and vertical planes. This speed adaptation is also robust against wind perturbation. We also demonstrated a combination of speed adaptation and obstacle avoidance for UAV navigation, an important intelligent function for inspection and exploration.
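The correlation-based learning principle with predictive and reflexive signals can be sketched in a few lines. This is a minimal 1-D braking illustration, not the paper's implementation: the signal encodings, constants, and function names are assumptions chosen for clarity.

```python
# Correlation-based (ICO-style) online learning sketch: the predictive
# weight grows in proportion to the correlation between the early
# (predictive) sensor signal and the temporal change of the late
# (reflex) signal, so braking shifts to earlier distances over trials.

def ico_update(w, x_pred, reflex_now, reflex_prev, eta=0.1):
    """Update the predictive weight from the reflex signal's derivative."""
    return w + eta * x_pred * (reflex_now - reflex_prev)

def speed_command(w, x_pred, reflex_now, v_max=1.0):
    """Three-neuron-style output: speed drops as learned predictive
    and reflexive terms grow."""
    drive = w * x_pred + reflex_now
    return max(0.0, v_max - drive)

# Toy run: the vehicle approaches a wall; the reflex fires only very
# close to it, and over trials the predictive weight learns to brake
# at the longer (predictive) detection range.
w = 0.0
for trial in range(4):
    reflex_prev = 0.0
    for dist in [5.0, 4.0, 3.0, 2.0, 1.0]:   # decreasing obstacle distance
        x_pred = 1.0 if dist < 4.5 else 0.0   # far-range (predictive) detection
        reflex = 1.0 if dist < 1.5 else 0.0   # close-range (reflex) detection
        w = ico_update(w, x_pred, reflex, reflex_prev)
        v = speed_command(w, x_pred, reflex)
        reflex_prev = reflex
print(round(w, 2))  # the weight has grown, so braking now begins at far range
```

After a few trials the predictive weight alone already reduces the commanded speed at the far detection range, mirroring the few-trial proactive braking reported in the abstract.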
Affiliation(s)
- Vatsanai Jaiton
- Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- Kongkiat Rothomphiwat
- Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- Emad Ebeid
- SDU UAS Centre (Unmanned Aerial Systems), The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense, Denmark
- Poramate Manoonpong
- Bio-Inspired Robotics and Neural Engineering Laboratory, School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology, Rayong, Thailand
- Embodied AI and Neurorobotics Laboratory, SDU Biorobotics, The Mærsk Mc-Kinney Møller Institute, University of Southern Denmark, Odense, Denmark
- *Correspondence: Poramate Manoonpong
3
Li KT, He X, Zhou G, Yang J, Li T, Hu H, Ji D, Zhou C, Ma H. Rational designing of oscillatory rhythmicity for memory rescue in plasticity-impaired learning networks. Cell Rep 2022; 39:110678. [PMID: 35417714 DOI: 10.1016/j.celrep.2022.110678]
Abstract
In the brain, oscillatory strength embedded in network rhythmicity is important for processing experiences, and this process is disrupted in certain psychiatric disorders. The use of rhythmic network stimuli can change these oscillations and has shown promise in terms of improving cognitive function, although the underlying mechanisms are poorly understood. Here, we combine a two-layer learning model with experiments on genetically modified mice that provide precise control of experience-driven oscillations by manipulating long-term potentiation of excitatory synapses onto inhibitory interneurons (LTPE→I). We find that, in the absence of LTPE→I, impaired network dynamics and memory are rescued by activating inhibitory neurons to augment the power in theta and gamma frequencies, which prevents network overexcitation with less inhibitory rebound. In contrast, increasing either theta or gamma power alone was less effective. Thus, inducing network changes at dual frequencies is involved in memory encoding, indicating a potentially feasible strategy for optimizing network-stimulating therapies.
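As an illustration of the dual-frequency idea, the fragment below builds a stimulation waveform carrying power in both the theta and gamma bands and verifies the two spectral peaks. The frequencies and amplitudes are generic band-typical values, not the study's parameters.

```python
import numpy as np

# Dual-band drive: a theta (~6 Hz) plus gamma (~40 Hz) composite, the kind
# of dual-frequency stimulus the study found more effective than either
# band alone. One second at 1 kHz gives 1 Hz spectral resolution.
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
gamma = 0.5 * np.sin(2 * np.pi * 40 * t)
stim = theta + gamma

# Spectral check: energy concentrates at exactly the two target frequencies.
spec = np.abs(np.fft.rfft(stim)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
peaks = freqs[np.argsort(spec)[-2:]]   # the two strongest bins
print(sorted(peaks))  # [6.0, 40.0]
```

Because both sinusoids complete an integer number of cycles in the one-second window, the FFT shows clean peaks at 6 Hz and 40 Hz with no leakage.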
Affiliation(s)
- Kwan Tung Li
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Hong Kong, China
- Xingzhi He
- Department of Neurobiology, Affiliated Mental Health Center and Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China; Liangzhu Laboratory, MOE Frontier Science Center for Brain Research and Brain-Machine Integration, School of Brain Science and Brain Medicine, Zhejiang University, Hangzhou 310058, China
- Guangjun Zhou
- Department of Neurobiology, Affiliated Mental Health Center and Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China; Liangzhu Laboratory, MOE Frontier Science Center for Brain Research and Brain-Machine Integration, School of Brain Science and Brain Medicine, Zhejiang University, Hangzhou 310058, China
- Jing Yang
- Department of Neurobiology, Affiliated Mental Health Center and Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China; Liangzhu Laboratory, MOE Frontier Science Center for Brain Research and Brain-Machine Integration, School of Brain Science and Brain Medicine, Zhejiang University, Hangzhou 310058, China
- Tao Li
- Department of Neurobiology, Affiliated Mental Health Center and Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China; Liangzhu Laboratory, MOE Frontier Science Center for Brain Research and Brain-Machine Integration, School of Brain Science and Brain Medicine, Zhejiang University, Hangzhou 310058, China
- Hailan Hu
- Department of Neurobiology, Affiliated Mental Health Center and Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China; Liangzhu Laboratory, MOE Frontier Science Center for Brain Research and Brain-Machine Integration, School of Brain Science and Brain Medicine, Zhejiang University, Hangzhou 310058, China; Research Units for Emotion and Emotion Disorders, Chinese Academy of Medical Sciences, Beijing 100730, China
- Daoyun Ji
- Department of Neuroscience, Baylor College of Medicine, Houston, TX 77030, USA; Department of Molecular and Cellular Biology, Baylor College of Medicine, Houston, TX 77030, USA
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Hong Kong, China; Department of Physics, Zhejiang University, Hangzhou 310027, China.
- Huan Ma
- Department of Neurobiology, Affiliated Mental Health Center and Hangzhou Seventh People's Hospital, Zhejiang University School of Medicine, Hangzhou 310058, China; Liangzhu Laboratory, MOE Frontier Science Center for Brain Research and Brain-Machine Integration, School of Brain Science and Brain Medicine, Zhejiang University, Hangzhou 310058, China; Research Units for Emotion and Emotion Disorders, Chinese Academy of Medical Sciences, Beijing 100730, China.
4
Abstract
This selective review explores biologically inspired learning as a model for intelligent robot control and sensing technology on the basis of specific examples. Hebbian synaptic learning is discussed as a functionally relevant model for machine learning and intelligence, as explained on the basis of examples from the highly plastic biological neural networks of invertebrates and vertebrates. Its potential for adaptive learning and control without supervision, the generation of functional complexity, and control architectures based on self-organization is highlighted. Learning without prior knowledge based on excitatory and inhibitory neural mechanisms accounts for the process through which survival-relevant or task-relevant representations are either reinforced or suppressed. The basic mechanisms of unsupervised biological learning drive synaptic plasticity and adaptation for behavioral success in living brains with different levels of complexity. The insights collected here point toward the Hebbian model as a compelling candidate for “intelligent” robotics and sensor systems.
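A minimal concrete instance of the unsupervised Hebbian principle discussed above, using Oja's normalized variant so the weights stay bounded; the data, seed, and learning rate are illustrative assumptions, not drawn from the review.

```python
import numpy as np

# Oja's rule: Hebbian growth (eta * y * x) plus a normalizing decay term
# (-eta * y^2 * w) that keeps the weight vector bounded. With no supervision,
# the weights align with the dominant direction of input variance.
rng = np.random.default_rng(0)

# Inputs with one strongly dominant axis of variance (std 3 vs. 0.5).
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                    # postsynaptic activity
    w += eta * y * (x - y * w)   # Hebbian growth + Oja normalization

# The weight vector has self-organized onto the dominant input axis.
print(abs(w[0]) > abs(w[1]))  # True
```

This captures, in two dimensions, the reinforcement of task-relevant structure without any teacher: the synapse strengthens along whichever input direction carries the most correlated activity.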
5
Jordan JT, Tong Y, Pytte CL. Transection of the ventral hippocampal commissure impairs spatial reference but not contextual or spatial working memory. Learn Mem 2022; 29:29-37. [PMID: 34911801 PMCID: PMC8686591 DOI: 10.1101/lm.053483.121]
Abstract
Plasticity is a neural phenomenon in which experience induces long-lasting changes to neuronal circuits and is at the center of most neurobiological theories of learning and memory. However, too much plasticity is maladaptive and must be balanced with substrate stability. Area CA3 of the hippocampus provides such a balance via hemispheric lateralization, with the left hemisphere dominant in providing plasticity and the right specialized for stability. Left and right CA3 project bilaterally to CA1; however, it is not known whether this downstream merging of lateralized plasticity and stability is functional. We hypothesized that interhemispheric convergence of input from these pathways is essential for integrating spatial memory stored in the left CA3 with navigational working memory facilitated by the right CA3. To test this, we severed interhemispheric connections between the left and right hippocampi in mice and assessed learning and memory. Despite damage to this major hippocampal fiber tract, hippocampus-dependent navigational working memory and short- and long-term memory were both spared. However, tasks that required the integration of information retrieved from memory with ongoing navigational working memory and navigation were impaired. We propose that one function of interhemispheric communication in the mouse hippocampus is to integrate lateralized processing of plastic and stable circuits to facilitate memory-guided spatial navigation.
Affiliation(s)
- Jake T. Jordan
- Department of Biology, The Graduate Center, City University of New York (CUNY), New York, New York 11016, USA; CUNY Neuroscience Collaborative, The Graduate Center, City University of New York, New York, New York 11016, USA
- Yi Tong
- Department of Psychology, Queens College, City University of New York, Flushing, New York 11367, USA
- Carolyn L. Pytte
- Department of Biology, The Graduate Center, City University of New York (CUNY), New York, New York 11016, USA; CUNY Neuroscience Collaborative, The Graduate Center, City University of New York, New York, New York 11016, USA; Department of Psychology, Queens College, City University of New York, Flushing, New York 11367, USA; Department of Psychology, The Graduate Center, City University of New York, New York, New York 11016, USA
6
Amorim FE, Chapot RL, Moulin TC, Lee JLC, Amaral OB. Memory destabilization during reconsolidation: a consequence of homeostatic plasticity? Learn Mem 2021; 28:371-389. [PMID: 34526382 DOI: 10.1101/lm.053418.121]
Abstract
Remembering is not a static process: When retrieved, a memory can be destabilized and become prone to modifications. This phenomenon has been demonstrated in a number of brain regions, but the neuronal mechanisms that govern memory destabilization and its boundary conditions remain elusive. Using two distinct computational models that combine Hebbian plasticity and synaptic downscaling, we show that homeostatic plasticity can function as a destabilization mechanism, accounting for behavioral results of protein synthesis inhibition upon reactivation with different re-exposure times. Furthermore, by performing systematic reviews, we identify a series of overlapping molecular mechanisms between memory destabilization and synaptic downscaling, although direct experimental links between both phenomena remain scarce. In light of these results, we propose a theoretical framework where memory destabilization can emerge as an epiphenomenon of homeostatic adaptations prompted by memory retrieval.
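The interplay of the two ingredients named in the abstract can be caricatured at a single synapse. This toy is not either of the authors' models: the constants, the downscaling rate, and the reactivation signal are illustrative assumptions.

```python
# A single synapse: Hebbian potentiation at encoding, then retrieval
# triggers homeostatic downscaling toward a target total strength,
# transiently weakening (destabilizing) the stored trace until
# restabilization re-potentiates it.

W_TARGET = 1.0  # homeostatic set point for total synaptic strength

def hebbian_encode(w, pre, post, eta=0.5):
    """Correlation-driven potentiation."""
    return w + eta * pre * post

def downscale(w, rate=0.3):
    """Homeostatic downscaling pulls the weight back toward the set point."""
    return w + rate * (W_TARGET - w)

w = hebbian_encode(1.0, 1.0, 1.0)                    # encoding: ~1.5
w_after_retrieval = downscale(w)                     # destabilized: ~1.35
w_restabilized = hebbian_encode(w_after_retrieval,
                                1.0, 0.3)            # restabilization: ~1.5
print(w, w_after_retrieval, w_restabilized)
```

The window between `downscale` and the restabilizing update is where, in this picture, blocking protein synthesis would leave the trace in its weakened state, consistent with the reactivation-dependent amnesia results the models address.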
Affiliation(s)
- Felippe E Amorim
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
- Renata L Chapot
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
- Thiago C Moulin
- Functional Pharmacology Unit, Department of Neuroscience, Uppsala University, Uppsala 751 24, Sweden
- Jonathan L C Lee
- University of Birmingham, School of Psychology, Edgbaston, Birmingham B15 2TT, United Kingdom
- Olavo B Amaral
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro 21941-902, Brazil
7
Wu CH, Ramos R, Katz DB, Turrigiano GG. Homeostatic synaptic scaling establishes the specificity of an associative memory. Curr Biol 2021; 31:2274-2285.e5. [PMID: 33798429 PMCID: PMC8187282 DOI: 10.1016/j.cub.2021.03.024]
Abstract
Correlation-based (Hebbian) forms of synaptic plasticity are crucial for the initial encoding of associative memories but likely insufficient to enable the stable storage of multiple specific memories within neural circuits. Theoretical studies have suggested that homeostatic synaptic normalization rules provide an essential countervailing force that can stabilize and expand memory storage capacity. Although such homeostatic mechanisms have been identified and studied for decades, experimental evidence that they play an important role in associative memory is lacking. Here, we show that synaptic scaling, a widely studied form of homeostatic synaptic plasticity that globally renormalizes synaptic strengths, is dispensable for initial associative memory formation but crucial for the establishment of memory specificity. We used conditioned taste aversion (CTA) learning, a form of associative learning that relies on Hebbian mechanisms within gustatory cortex (GC), to show that animals conditioned to avoid saccharin initially generalized this aversion to other novel tastants. Specificity of the aversion to saccharin emerged slowly over a time course of many hours and was associated with synaptic scaling down of excitatory synapses onto conditioning-active neuronal ensembles within the GC. Blocking synaptic scaling down in the GC enhanced the persistence of synaptic strength increases induced by conditioning and prolonged the duration of memory generalization. Taken together, these findings demonstrate that synaptic scaling is crucial for sculpting the specificity of an associative memory and suggest that the relative strengths of Hebbian and homeostatic plasticity can modulate the balance between stable memory formation and memory generalization.
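How multiplicative scaling down can sharpen specificity is easy to see in a two-input caricature. This sketch is illustrative, not the study's model; the weights, threshold, and baseline are assumed values.

```python
import numpy as np

# Conditioning potentiates excitatory input onto a conditioning-active
# ensemble, which at first pushes the response to a novel tastant above the
# aversion threshold too (generalization). Multiplicative scaling down then
# renormalizes total input, leaving only the conditioned taste above it.

THRESH = 1.0
w = np.array([1.3, 1.1])   # post-conditioning weights: [saccharin, novel tastant]

def aversive(weights):
    """Which inputs drive the response above the aversion threshold."""
    return weights > THRESH

print(aversive(w))          # both above threshold: generalized aversion

# Scaling down: multiply all weights so the summed strength returns
# to its pre-conditioning baseline.
baseline_sum = 2.0
w_scaled = w * (baseline_sum / w.sum())
print(aversive(w_scaled))   # only saccharin remains above threshold
```

Because the renormalization is multiplicative, the *relative* advantage given to the conditioned input by Hebbian potentiation survives scaling, while the weaker, generalized response falls back below threshold.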
Affiliation(s)
- Chi-Hong Wu
- Department of Biology, Brandeis University, Waltham, MA 02453, USA
- Raul Ramos
- Department of Biology, Brandeis University, Waltham, MA 02453, USA
- Donald B Katz
- Department of Psychology, Brandeis University, Waltham, MA 02453, USA
8
Weidel P, Duarte R, Morrison A. Unsupervised Learning and Clustered Connectivity Enhance Reinforcement Learning in Spiking Neural Networks. Front Comput Neurosci 2021; 15:543872. [PMID: 33746728 PMCID: PMC7970044 DOI: 10.3389/fncom.2021.543872]
Abstract
Reinforcement learning is a paradigm that can account for how organisms learn to adapt their behavior in complex environments with sparse rewards. To partition an environment into discrete states, implementations in spiking neuronal networks typically rely on input architectures involving place cells or receptive fields specified ad hoc by the researcher. This is problematic as a model for how an organism can learn appropriate behavioral sequences in unknown environments, as it fails to account for the unsupervised and self-organized nature of the required representations. Additionally, this approach presupposes that the researcher knows how the environment should be partitioned and represented, and it scales poorly with the size or complexity of the environment. To address these issues and gain insights into how the brain generates its own task-relevant mappings, we propose a learning architecture that combines unsupervised learning on the input projections with biologically motivated clustered connectivity within the representation layer. This combination allows input features to be mapped to clusters; thus the network self-organizes to produce clearly distinguishable activity patterns that can serve as the basis for reinforcement learning on the output projections. On the basis of the MNIST and Mountain Car tasks, we show that our proposed model performs better than either a comparable unclustered network or a clustered network with static input projections. We conclude that the combination of unsupervised learning and clustered connectivity provides a generic representational substrate suitable for further computation.
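The two-stage architecture can be sketched in rate-based form (the paper works with spiking networks): competitive Hebbian learning shapes the input projections so inputs map onto clusters, and a reward-modulated readout learns on top of that self-organized code. All dimensions, constants, and the toy task here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_clusters, n_actions = 4, 3, 2
W_in = rng.random((n_clusters, n_in))       # input -> cluster (unsupervised)
W_out = np.zeros((n_actions, n_clusters))   # cluster -> action (reinforcement)

def cluster_code(x):
    """Winner-take-all cluster activity: the self-organized state code."""
    r = np.zeros(n_clusters)
    r[np.argmax(W_in @ x)] = 1.0
    return r

def unsupervised_step(x, eta=0.1):
    """Competitive Hebbian update: the winning cluster moves toward the input."""
    k = int(np.argmax(W_in @ x))
    W_in[k] += eta * (x - W_in[k])

def rl_step(x, reward_of, eta=0.5, eps=0.1):
    """Epsilon-greedy action on the cluster code; reward-modulated update."""
    r = cluster_code(x)
    q = W_out @ r
    a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(q))
    W_out[a] += eta * (reward_of(a) - q[a]) * r

# Toy task: action 0 is rewarded when the first input feature dominates.
for _ in range(300):
    x = rng.random(n_in)
    unsupervised_step(x)
    rl_step(x, lambda a: float((a == 0) == (x[0] > 0.5)))
```

The point of the split mirrors the abstract's argument: no state partition is specified by hand; the representation layer organizes itself, and the reward signal only ever touches the output projections.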
Affiliation(s)
- Philipp Weidel
- Institute of Neuroscience and Medicine (INM-6) & Institute for Advanced Simulation (IAS-6) & JARA-Institute Brain Structure-Function Relationship (JBI-1 / INM-10), Research Centre Jülich, Jülich, Germany; Department of Computer Science 3 - Software Engineering, RWTH Aachen University, Aachen, Germany
- Renato Duarte
- Institute of Neuroscience and Medicine (INM-6) & Institute for Advanced Simulation (IAS-6) & JARA-Institute Brain Structure-Function Relationship (JBI-1 / INM-10), Research Centre Jülich, Jülich, Germany
- Abigail Morrison
- Institute of Neuroscience and Medicine (INM-6) & Institute for Advanced Simulation (IAS-6) & JARA-Institute Brain Structure-Function Relationship (JBI-1 / INM-10), Research Centre Jülich, Jülich, Germany; Department of Computer Science 3 - Software Engineering, RWTH Aachen University, Aachen, Germany
9
Wu YK, Hengen KB, Turrigiano GG, Gjorgjieva J. Homeostatic mechanisms regulate distinct aspects of cortical circuit dynamics. Proc Natl Acad Sci U S A 2020; 117:24514-24525. [PMID: 32917810 PMCID: PMC7533694 DOI: 10.1073/pnas.1918368117]
Abstract
Homeostasis is indispensable to counteract the destabilizing effects of Hebbian plasticity. Although it is commonly assumed that homeostasis modulates synaptic strength, membrane excitability, and firing rates, its role at the neural circuit and network level is unknown. Here, we identify changes in higher-order network properties of freely behaving rodents during prolonged visual deprivation. Strikingly, our data reveal that functional pairwise correlations and their structure are subject to homeostatic regulation. Using a computational model, we demonstrate that the interplay of different plasticity and homeostatic mechanisms can capture the initial drop and delayed recovery of firing rates and correlations observed experimentally. Moreover, our model indicates that synaptic scaling is crucial for the recovery of correlations and network structure, while intrinsic plasticity is essential for the rebound of firing rates, suggesting that synaptic scaling and intrinsic plasticity can serve distinct functions in homeostatically regulating network dynamics.
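A minimal rate-model sketch of the two homeostatic knobs discussed above, not the paper's model: after "deprivation" halves external drive, intrinsic plasticity adjusts each neuron's excitability toward a firing-rate set point while synaptic scaling multiplicatively adjusts recurrent weights. Constants and network size are illustrative assumptions.

```python
import numpy as np

TARGET = 1.0                               # firing-rate set point
W = np.array([[0.0, 0.2], [0.2, 0.0]])     # recurrent weights
gain = np.ones(2)                          # intrinsic excitability
drive = np.full(2, 0.5)                    # halved external input (deprivation)

def rates(W, gain, drive):
    """Steady-state linear rates solving r = gain * (W @ r + drive)."""
    G = np.diag(gain)
    return np.linalg.solve(np.eye(2) - G @ W, G @ drive)

for _ in range(200):
    r = rates(W, gain, drive)
    gain += 0.05 * (TARGET - r)            # intrinsic plasticity: rate recovery
    W *= 1 + 0.01 * (TARGET - r.mean())    # synaptic scaling: global renormalization

print(np.round(rates(W, gain, drive), 2))  # rates recover toward the set point
```

In line with the abstract's dissociation, the two mechanisms act on different variables here: the gain term restores each neuron's rate, while the multiplicative weight term preserves the *structure* of the recurrent coupling (and hence pairwise interactions) rather than individual rates.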
Affiliation(s)
- Yue Kris Wu
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- Keith B Hengen
- Department of Biology, Brandeis University, Waltham, MA 02454
- Department of Biology, Washington University in St. Louis, St. Louis, MO 63130
- Julijana Gjorgjieva
- Computation in Neural Circuits Group, Max Planck Institute for Brain Research, 60438 Frankfurt, Germany
- School of Life Sciences, Technical University of Munich, 85354 Freising, Germany