1
Lobov SA, Zharinov AI, Makarov VA, Kazantsev VB. Spatial Memory in a Spiking Neural Network with Robot Embodiment. Sensors 2021; 21(8):2678. [PMID: 33920246] [PMCID: PMC8070389] [DOI: 10.3390/s21082678] [Received: 02/19/2021] [Revised: 04/06/2021] [Accepted: 04/07/2021]
Abstract
Cognitive maps and spatial memory are fundamental paradigms of brain functioning. Here, we present a spiking neural network (SNN) capable of generating an internal representation of the external environment and implementing spatial memory. The SNN initially has a non-specific architecture, which is then shaped by Hebbian-type synaptic plasticity. The network receives stimuli at specific loci, while memory retrieval operates as a functional SNN response in the form of population bursts. The SNN function is explored through its embodiment in a robot moving in an arena with safe and dangerous zones. We propose a measure of global network memory based on the synaptic vector field approach to validate results and calculate information characteristics, including learning curves. We show that after training, the SNN can effectively control the robot’s cognitive behavior, allowing it to avoid dangerous regions in the arena. However, the learning is not perfect: the robot eventually visits dangerous areas. Such behavior, also observed in animals, enables relearning in time-evolving environments. If a dangerous zone moves to another place, the SNN remaps positive and negative areas, allowing it to escape the catastrophic interference phenomenon known in some AI architectures. Thus, the robot adapts to a changing world.
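The Hebbian-type shaping of an initially non-specific network, as summarized above, can be illustrated with a minimal toy sketch. This is an assumption-laden illustration, not the authors' model: the update rule, rates, and soft weight bound are generic choices, and the binary pre/post activity vectors stand in for spiking.

```python
import numpy as np

# Illustrative Hebbian-type update (a generic rule, not the paper's exact one):
# weights grow when pre- and postsynaptic neurons are co-active, with a soft
# saturation term (w_max - w) keeping them bounded.
def hebbian_step(w, pre, post, eta=0.1, w_max=1.0):
    """One plasticity step.

    w    : (n_post, n_pre) weight matrix
    pre  : (n_pre,) presynaptic activity (binary here)
    post : (n_post,) postsynaptic activity (binary here)
    """
    dw = eta * np.outer(post, pre) * (w_max - w)  # Hebb term with soft bound
    return np.clip(w + dw, 0.0, w_max)

# Repeated stimulation at one locus strengthens only the co-active pathway,
# carving specific structure out of initially weak, unspecific connectivity.
rng = np.random.default_rng(0)
w = rng.uniform(0.0, 0.1, size=(4, 4))  # weak non-specific initial weights
pre = np.array([1.0, 0.0, 0.0, 0.0])    # only neuron 0 is stimulated
post = np.array([1.0, 1.0, 0.0, 0.0])   # neurons 0 and 1 respond
for _ in range(50):
    w = hebbian_step(w, pre, post)
```

After training, the synapses from the stimulated neuron onto the responding neurons saturate near `w_max`, while all other weights remain at their small initial values.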
Affiliation(s)
- Sergey A. Lobov
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 23 Gagarin Ave., 603950 Nizhny Novgorod, Russia
- Neuroscience and Cognitive Technology Laboratory, Center for Technologies in Robotics and Mechatronics Components, Innopolis University, 1 Universitetskaya Str., 420500 Innopolis, Russia
- Center For Neurotechnology and Machine Learning, Immanuel Kant Baltic Federal University, 14 Nevsky Str., 236016 Kaliningrad, Russia
- Correspondence:
- Alexey I. Zharinov
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 23 Gagarin Ave., 603950 Nizhny Novgorod, Russia
- Valeri A. Makarov
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 23 Gagarin Ave., 603950 Nizhny Novgorod, Russia
- Instituto de Matemática Interdisciplinar, Facultad de Ciencias Matemáticas, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Victor B. Kazantsev
- Neurotechnology Department, Lobachevsky State University of Nizhny Novgorod, 23 Gagarin Ave., 603950 Nizhny Novgorod, Russia
- Neuroscience and Cognitive Technology Laboratory, Center for Technologies in Robotics and Mechatronics Components, Innopolis University, 1 Universitetskaya Str., 420500 Innopolis, Russia
- Center For Neurotechnology and Machine Learning, Immanuel Kant Baltic Federal University, 14 Nevsky Str., 236016 Kaliningrad, Russia
- Lab of Neurocybernetics, Russian State Scientific Center for Robotics and Technical Cybernetics, 21 Tikhoretsky Ave., 194064 St. Petersburg, Russia
2
Lee JY, Stiber M, Si D. Machine Learning of Spatiotemporal Bursting Behavior in Developing Neural Networks. Annu Int Conf IEEE Eng Med Biol Soc 2018; 2018:348-351. [PMID: 30440408] [DOI: 10.1109/embc.2018.8512358]
Abstract
As with other modern sciences (and their computational counterparts), neuroscience experiments can now produce data that, in terms of both quantity and complexity, challenge our interpretative abilities. It is relatively common to be faced with datasets containing many millions of neural spikes collected from tens of thousands of neurons. Traditional data analysis methods can, in a relatively straightforward manner, identify large-scale features in such data (e.g., on the scale of entire networks). What these approaches often cannot do is connect macroscopic activity to the relevant small-scale behaviors of individual cells, especially in the face of ongoing background activity that is not relevant. This communication presents an application of machine learning techniques to bridge the gap between microscopic and macroscopic behaviors and identify the small-scale activity that leads to large-scale behavior, reducing data complexity to a level amenable to further analysis. A small number of spatiotemporal spikes (among many millions) were found to provide reliable information about whether and where a burst will occur.
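The idea of predicting a large-scale event (a burst) from a handful of small-scale precursor spikes can be sketched with a toy classifier. Everything here is an assumption: the data are synthetic, the features (spike counts of three hypothetical "precursor" neurons in a pre-burst window) are invented, and plain logistic regression stands in for whatever method the paper actually used.

```python
import numpy as np

# Synthetic toy data (not the paper's recordings): spike counts of 3 putative
# precursor neurons in a window preceding a possible population burst.
rng = np.random.default_rng(1)
n = 400
X = rng.poisson(2.0, size=(n, 3)).astype(float)
# Ground truth by construction: bursts tend to follow high precursor activity.
y = (X.sum(axis=1) + rng.normal(0.0, 1.0, n) > 6.0).astype(float)

# Logistic regression trained by plain gradient descent on the cross-entropy.
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted burst probability
    g = p - y                               # per-sample gradient of the loss
    w -= 0.05 * X.T @ g / n
    b -= 0.05 * g.mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = ((p > 0.5) == (y == 1)).mean()        # training accuracy
```

Because the labels were generated from the precursor counts plus noise, the classifier recovers the relationship well above chance, illustrating how a few microscopic features can carry reliable information about a macroscopic event.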
3
Neural electrical activity and neural network growth. Neural Netw 2018; 101:15-24. [PMID: 29475142] [DOI: 10.1016/j.neunet.2018.02.001] [Received: 07/04/2017] [Revised: 01/31/2018] [Accepted: 02/01/2018]
Abstract
The development of the central and peripheral nervous systems depends in part on the emergence of correct functional connectivity in their input and output pathways. It is now generally accepted that molecular factors guide neurons to establish a primary scaffold that undergoes activity-dependent refinement to build a fully functional circuit. However, a number of recent experimental results show that neuronal electrical activity plays an important role in establishing initial interneuronal connections. Nevertheless, these processes are difficult to study experimentally, due to the absence of a theoretical description and of quantitative parameters for estimating the influence of neuronal activity on growth in neural networks. In this work we propose a general framework for a theoretical description of activity-dependent neural network growth. The description incorporates a closed-loop growth model in which neural activity can affect neurite outgrowth, which in turn can affect neural activity. We carried out a detailed quantitative analysis of spatiotemporal activity patterns and studied the relationship between individual cells and the network as a whole to explore the relationship between developing connectivity and activity patterns. The model developed in this work will allow us to develop new experimental techniques for studying and quantifying the influence of neuronal activity on growth processes in neural networks, and may lead to novel techniques for constructing large-scale neural networks by self-organization.
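The closed loop described in this abstract — activity shapes outgrowth, outgrowth shapes connectivity, connectivity shapes activity — can be caricatured in a few lines. This is a deliberately simple homeostatic sketch, not the paper's model: the overlap-based coupling `min(r_i, r_j)`, the activity set point, and all parameters are assumptions chosen only to make the feedback loop visible.

```python
import numpy as np

def simulate(steps=200, eps=0.05, target=0.8):
    """Toy closed-loop growth: cells below an activity set point extend
    their neurite field r_i, over-active cells retract it; coupling between
    two cells is the overlap of their fields."""
    rng = np.random.default_rng(2)
    drive = np.array([0.3, 0.6, 0.9, 1.2, 1.5])  # intrinsic excitability
    r = rng.uniform(0.1, 0.2, drive.size)        # neurite extent per cell
    for _ in range(steps):
        W = np.minimum.outer(r, r)               # field overlap -> coupling
        np.fill_diagonal(W, 0.0)
        a = np.tanh(drive + 0.2 * W.sum(axis=1)) # activity from drive + input
        # Homeostatic outgrowth: grow while under-active, retract if over-active.
        r = np.clip(r + eps * (target - a) * r, 0.0, 2.0)
    return r, a

r, a = simulate()
```

Under these assumptions, weakly driven cells grow large fields (recruiting network input), while strongly driven cells retract — the kind of activity-dependent structural feedback the framework is meant to quantify.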