1
Tavakoli SK, Longtin A. Boosting reservoir computer performance with multiple delays. Phys Rev E 2024; 109:054203. PMID: 38907463. DOI: 10.1103/physreve.109.054203. (Received 06/08/2023; Accepted 04/01/2024)
Abstract
Time delays play a significant role in dynamical systems, as they affect their transient behavior and the dimensionality of their attractors. The number, values, and spacing of these time delays influence the eigenvalues of a nonlinear delay-differential system at its fixed point. Here we explore a multidelay system as the core computational element of a reservoir computer making predictions on its input in the usual regime close to fixed point instability. Variations in the number and separation of time delays are first examined to determine the effect of these delay-distribution parameters on the effectiveness of time-delay reservoirs for nonlinear time series prediction. We demonstrate computationally that an optoelectronic device with multiple different delays can improve the mapping of scalar input into higher-dimensional dynamics, and thus its memory and prediction capabilities for input time series generated by low- and high-dimensional dynamical systems. In particular, this enhances the suitability of such reservoir computers for predicting input data with temporal correlations. Additionally, we highlight a pronounced harmful resonance condition that arises in reservoir computing when an electro-optic oscillator model with multiple delays is used. We illustrate that the resonance point may shift depending on the task at hand, such as cross prediction or multistep-ahead prediction, in both the single-delay and multiple-delay cases.
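The multidelay architecture can be illustrated with a minimal discrete-time sketch (not the authors' optoelectronic model): a single nonlinear node driven by a masked input and by feedback returning from several delay lines. All names, delays, and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def multidelay_reservoir(u, taus, n_virtual=50, gain=0.9, scale=0.5):
    """Single nonlinear node with several feedback delay lines.

    u: 1-D input sequence; taus: feedback delays in time steps.
    Returns a (len(u), n_virtual) matrix of reservoir states.
    """
    max_tau = max(taus)
    mask = rng.uniform(-1, 1, n_virtual)          # fixed random input mask
    x = np.zeros((len(u) + max_tau, n_virtual))   # state history buffer
    for t in range(len(u)):
        i = t + max_tau
        # average the feedback returning from each delay line
        feedback = sum(x[i - tau] for tau in taus) / len(taus)
        x[i] = np.tanh(gain * feedback + scale * mask * u[t])
    return x[max_tau:]

states = multidelay_reservoir(np.sin(0.1 * np.arange(200)), taus=[17, 29])
```

With incommensurate delays (here 17 and 29 steps), the delay lines return feedback at different phases, which is the mechanism the abstract credits for richer, higher-dimensional dynamics.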
Affiliation(s)
- S Kamyar Tavakoli
- Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, Ontario, Canada K1N 6N5
- André Longtin
- Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, Ontario, Canada K1N 6N5
- Centre for Neural Dynamics and AI, University of Ottawa, Ottawa, Ontario, Canada K1N 6N5
2
Harding S, Leishman Q, Lunceford W, Passey DJ, Pool T, Webb B. Global forecasts in reservoir computers. Chaos 2024; 34:023136. PMID: 38407397. DOI: 10.1063/5.0181694. (Received 10/17/2023; Accepted 01/24/2024)
Abstract
A reservoir computer is a machine learning model that can be used to predict the future state(s) of time-dependent processes, e.g., dynamical systems. In practice, data in the form of an input signal are fed into the reservoir, and the trained reservoir is then used to predict the future state of this signal. We develop a new method for predicting not only the future dynamics of the input signal but also the future dynamics starting at an arbitrary initial condition of a system. The systems we consider are the Lorenz, Rössler, and Thomas systems restricted to their attractors. This method, which creates a global forecast, still uses only a single input signal to train the reservoir but breaks the signal into many smaller windowed signals. We examine how well this windowed method is able to forecast the dynamics of a system starting at an arbitrary point on the system's attractor and compare this to the standard method without windows. We find that the standard method has almost no ability to forecast anything but the original input signal, while the windowed method can capture the dynamics starting at most points on an attractor with significant accuracy.
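The training step that both the standard and windowed methods share can be sketched as an echo state network whose linear readout is fit by ridge regression. The network size, spectral radius, and the sine test signal below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def esn_states(u, n=200, rho=0.9, sigma=0.5):
    """Run a standard echo state network over the input and collect its states."""
    W = rng.normal(size=(n, n))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # fix the spectral radius
    W_in = rng.uniform(-sigma, sigma, size=n)
    x, X = np.zeros(n), np.empty((len(u), n))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in * ut)
        X[t] = x
    return X

# fit the linear readout by ridge regression to predict u[t+1] from the state at t
u = np.sin(0.2 * np.arange(600))
X, y = esn_states(u)[:-1], u[1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
rmse = np.sqrt(np.mean((X @ W_out - y) ** 2))         # one-step training error
```

The windowed variant described in the abstract would slice `u` into many short segments and restart the reservoir state on each, so the readout sees many initial conditions rather than one long trajectory.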
Affiliation(s)
- S Harding
- Mathematics Department, Brigham Young University, Provo, Utah 84602, USA
- Q Leishman
- Mathematics Department, Brigham Young University, Provo, Utah 84602, USA
- W Lunceford
- Mathematics Department, Brigham Young University, Provo, Utah 84602, USA
- D J Passey
- Mathematics Department, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599, USA
- T Pool
- The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15289, USA
- B Webb
- Mathematics Department, Brigham Young University, Provo, Utah 84602, USA
3
Mandal S, Shrimali MD. Learning unidirectional coupling using an echo-state network. Phys Rev E 2023; 107:064205. PMID: 37464638. DOI: 10.1103/physreve.107.064205. (Received 11/21/2022; Accepted 05/23/2023)
Abstract
Reservoir computing has found many potential applications in the field of complex dynamics. In this article, we explore the exceptional capability of the echo-state network (ESN) model to learn a unidirectional coupling scheme from only a few time series of the system. We show that, once trained with a few example dynamics of a drive-response system, the machine is able to predict the response system's dynamics for any driver signal with the same coupling. Only a few time series of an A-B type drive-response system are needed in training for the ESN to learn the coupling scheme. After training, even if we replace drive system A with a different system C, the ESN can reproduce the dynamics of response system B using the dynamics of the new drive system C alone.
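The drive-response setup can be imitated in a toy experiment: an ESN is driven by signal A while its readout is trained to reproduce the response B of a simple unidirectional coupling, then tested on an unseen drive C. The coupling law and every parameter below are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100
W = rng.normal(size=(n, n))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state scaling
W_in = rng.uniform(-1, 1, size=n)

def run(drive):
    """Collect ESN states while the network is driven by an external signal."""
    x, X = np.zeros(n), np.empty((len(drive), n))
    for t, d in enumerate(drive):
        x = np.tanh(W @ x + W_in * d)
        X[t] = x
    return X

def respond(drive):
    """Toy unidirectional coupling: the response b is slaved to the drive."""
    b, out = 0.0, []
    for d in drive:
        b = 0.5 * b + 0.5 * np.sin(2.0 * d)
        out.append(b)
    return np.array(out)

a = rng.uniform(-1, 1, size=2000)                  # training drive "A"
X, y = run(a), respond(a)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n), X.T @ y)

c = np.sin(0.3 * np.arange(500))                   # unseen drive "C"
err = np.sqrt(np.mean((run(c) @ W_out - respond(c)) ** 2))
```

Because the readout has learned the coupling functional rather than one trajectory, the same `W_out` tracks the response under the new drive `c`.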
4
Zhong D, Hu Y, Zhao K, Deng W, Hou P, Zhang J. Accurate separation of mixed high-dimension optical-chaotic signals using optical reservoir computing based on optically pumped VCSELs. Opt Express 2022; 30:39561-39581. PMID: 36298905. DOI: 10.1364/oe.470857. (Received 07/20/2022; Accepted 09/28/2022)
Abstract
In this work, we propose schemes and theories for separating two groups of mixed optical chaotic signals, with the mixing fractions either known in advance or unknown, using VCSEL-based reservoir computing (RC) systems. Here, the two groups of mixed optical chaotic signals are linear combinations of several beams of the chaotic x-polarization components (X-PCs) and y-polarization components (Y-PCs) emitted by optically pumped spin-VCSELs operating alone. Two parallel reservoirs are realized using the chaotic X-PC and Y-PC output by an optically pumped spin-VCSEL with both optical feedback and optical injection. Moreover, we further demonstrate the separation performance for mixed chaotic signals that are linear combinations of no more than three beams of chaotic X-PCs or Y-PCs. We find that, when the mixing fractions are known in advance, the two groups of mixed optical chaotic signals can be effectively separated by two reservoirs in a single RC system based on an optically pumped spin-VCSEL, with separation errors (characterized by the training errors) of no more than 0.093. If the mixing fractions are unknown, we utilize two cascaded RC systems based on optically pumped spin-VCSELs to separate each group of mixed optical signals: the mixing fractions are first accurately predicted by two parallel reservoirs in the first RC system, and, based on these predicted mixing fractions, the two groups of mixed optical chaotic signals are then effectively separated by two parallel reservoirs in the second RC system, again with separation errors of no more than 0.093. In the same way, mixed optical chaotic signals formed by the linear superposition of more than three beams of optical chaotic signals can be effectively separated. The method and idea for the separation of complex optical chaotic signals proposed in this paper may contribute to the development of novel principles of multiple access and demultiplexing in multi-channel chaotic cryptography communication.
5
Chen Y, Qian Y, Cui X. Time series reconstructing using calibrated reservoir computing. Sci Rep 2022; 12:16318. PMID: 36175460. PMCID: PMC9522934. DOI: 10.1038/s41598-022-20331-3. (Received 05/25/2022; Accepted 09/12/2022)
Abstract
Reservoir computing, a new method of machine learning, has recently been used to predict the state evolution of various chaotic dynamical systems. It has significant advantages in terms of training cost and the number of adjustable parameters; however, the prediction length is limited. For classic reservoir computing, the prediction length can only reach five to six Lyapunov times. Here, we modified the method of reservoir computing by adding feedback, continuous or discrete, to “calibrate” the input of the reservoir and then reconstruct the entire dynamical system. The reconstruction length increases appreciably and the required training length decreases markedly. The reconstruction of dynamical systems is studied in detail under this method, and the reconstruction can be significantly improved in both length and accuracy. Additionally, we summarize the effect of different kinds of input feedback: the more the fed-back variable interacts with the others in the dynamical equations, the better the reconstruction. Nonlinear terms reveal more information than linear terms when the interaction terms are otherwise equal. The method has proven effective on several classical chaotic systems. It can be superior to traditional reservoir computing in reconstruction, provides new hints for improving the computation, and may be used in some real applications.
Affiliation(s)
- Yeyuge Chen
- School of Systems Science, Beijing Normal University, Beijing, 100875, China
- Yu Qian
- Nonlinear Research Institute, Baoji University of Arts and Sciences, Baoji, 721007, China
- Xiaohua Cui
- School of Systems Science, Beijing Normal University, Beijing, 100875, China
6
Whiteaker B, Gerstoft P. Reducing echo state network size with controllability matrices. Chaos 2022; 32:073116. PMID: 35907714. DOI: 10.1063/5.0071926. (Received 09/17/2021; Accepted 06/21/2022)
Abstract
Echo state networks are a fast-training variant of recurrent neural networks, excelling at approximating nonlinear dynamical systems and time series prediction. These machine learning models act as nonlinear fading memory filters. While these models benefit from quick training and low complexity, the computation demands of a large reservoir matrix are a bottleneck. Using control theory, a reduced-size replacement reservoir matrix is found. Starting from a large, task-effective reservoir matrix, we form a controllability matrix whose rank indicates the active sub-manifold and the candidate replacement reservoir size. The resulting speed-ups and reduced memory usage come with minimal error increase in chaotic climate reconstruction or short-term prediction. Experiments are performed on simple time series signals and on the Lorenz-1963 and Mackey-Glass complex chaotic signals. Observing low-error models shows variation of active rank and memory along a sequence of predictions.
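The core criterion can be sketched directly: build the controllability matrix [B, AB, A²B, …] from a reservoir matrix A and input vector B, and take its numerical rank as the candidate replacement size. The toy diagonal reservoir below is an assumption chosen so the expected rank is easy to verify by hand, not a matrix from the paper.

```python
import numpy as np

def controllability_rank(A, B, tol=1e-8):
    """Numerical rank of the controllability matrix [B, AB, A^2 B, ...]."""
    n = A.shape[0]
    blocks, M = [B], B
    for _ in range(n - 1):
        M = A @ M
        blocks.append(M)
    s = np.linalg.svd(np.hstack(blocks), compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# toy reservoir active in only three directions, plus the input direction itself
A = np.diag([0.9, 0.5, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0])
B = np.ones((8, 1))
size = controllability_rank(A, B)   # candidate replacement reservoir size: 4
```

Here three decaying modes plus the input direction give rank 4, so an 8-node reservoir could in principle be replaced by a 4-node one without losing the active sub-manifold.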
Affiliation(s)
- Brian Whiteaker
- Scripps Institution of Oceanography, University of California at San Diego, La Jolla, California 92093-0238, USA
- Peter Gerstoft
- Scripps Institution of Oceanography, University of California at San Diego, La Jolla, California 92093-0238, USA
7
Meiyazhagan J, Manikandan K, Sudharsan JB, Senthilvelan M. Data driven soliton solution of the nonlinear Schrödinger equation with certain PT-symmetric potentials via deep learning. Chaos 2022; 32:053115. PMID: 35649991. DOI: 10.1063/5.0086038. (Received 01/21/2022; Accepted 04/18/2022)
Abstract
We investigate the physics-informed neural network method, a deep learning approach, to approximate the soliton solution of the nonlinear Schrödinger equation with parity-time-symmetric potentials. We consider three different parity-time-symmetric potentials, namely, Gaussian, periodic, and Rosen-Morse potentials. We use the physics-informed neural network to solve the considered nonlinear partial differential equation with the above three potentials. We compare the predicted result with the actual result and analyze the ability of deep learning in solving the considered partial differential equation. We check the ability of deep learning in approximating the soliton solution by taking the squared error between real and predicted values. Further, we examine the factors that affect the performance of the considered deep learning method with different activation functions, namely, ReLU, sigmoid, and tanh. We also use a new activation function, namely, sech, which is not commonly used in the field of deep learning, and analyze whether it is suitable for predicting the soliton solution of the nonlinear Schrödinger equation for the aforementioned parity-time-symmetric potentials. In addition, we present how the network's structure and the size of the training data influence the performance of the physics-informed neural network. Our results show that the constructed deep learning model successfully approximates the soliton solution of the considered equation with high accuracy.
Affiliation(s)
- J Meiyazhagan
- Department of Nonlinear Dynamics, Bharathidasan University, Tiruchirappalli 620 024, Tamil Nadu, India
- K Manikandan
- Centre for Nonlinear Systems, Chennai Institute of Technology, Chennai 600 069, Tamil Nadu, India
- J B Sudharsan
- Centre for Nonlinear Systems, Chennai Institute of Technology, Chennai 600 069, Tamil Nadu, India
- M Senthilvelan
- Department of Nonlinear Dynamics, Bharathidasan University, Tiruchirappalli 620 024, Tamil Nadu, India
8
Han X, Zhao Y, Small M. A tighter generalization bound for reservoir computing. Chaos 2022; 32:043115. PMID: 35489854. DOI: 10.1063/5.0082258. (Received 12/13/2021; Accepted 03/22/2022)
Abstract
While reservoir computing (RC) has demonstrated astonishing performance in many practical scenarios, the understanding of its capability for generalization on previously unseen data is limited. To address this issue, we propose a novel generalization bound for RC based on the empirical Rademacher complexity under the probably approximately correct learning framework. Note that the generalization bound for RC is derived in terms of the model hyperparameters, which makes it possible to explore the dependence of the bound on those hyperparameters. Compared with the existing generalization bound, our bound for RC is tighter, which is verified by numerical experiments. Furthermore, we study the generalization bound for RC corresponding to different reservoir graphs, including the directed acyclic graph (DAG) and the Erdős–Rényi undirected random graph (ER graph). Specifically, the generalization bound for an RC whose reservoir graph is designated as a DAG can be refined by leveraging the structural property (i.e., the longest path length) of the DAG. Finally, both theoretical and experimental findings confirm that the generalization bound for the RC of a DAG is lower and less sensitive to the model hyperparameters than that for the RC of an ER graph.
Affiliation(s)
- Xinyu Han
- Harbin Institute of Technology, Shenzhen, 518055 Guangdong, China
- Yi Zhao
- Harbin Institute of Technology, Shenzhen, 518055 Guangdong, China
- Michael Small
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
9
Clustered and deep echo state networks for signal noise reduction. Mach Learn 2022. DOI: 10.1007/s10994-022-06135-6.
10
Thorne B, Jüngling T, Small M, Corrêa D, Zaitouny A. Reservoir time series analysis: Using the response of complex dynamical systems as a universal indicator of change. Chaos 2022; 32:033109. PMID: 35364819. DOI: 10.1063/5.0082122. (Received 12/13/2021; Accepted 02/10/2022)
Abstract
We present the idea of reservoir time series analysis (RTSA), a method by which the state space representation generated by a reservoir computing (RC) model can be used for time series analysis. We discuss the motivation for this with reference to the characteristics of RC and present three ad hoc methods for generating representative features from the reservoir state space. We then develop and implement a hypothesis test to assess the capacity of these features to distinguish signals from systems with varying parameters. In comparison to a number of benchmark approaches (statistical, Fourier, phase space, and recurrence analysis), we are able to show significant, generalized accuracy across the proposed RTSA features that surpasses the benchmark methods. Finally, we briefly present an application for bearing fault distinction to motivate the use of RTSA in application.
Affiliation(s)
- Braden Thorne
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Thomas Jüngling
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Michael Small
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Débora Corrêa
- ARC Centre for Transforming Maintenance Through Data Science, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Ayham Zaitouny
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
11
Thorne B, Jüngling T, Small M, Hodkiewicz M. Parameter extraction with reservoir computing: Nonlinear time series analysis and application to industrial maintenance. Chaos 2021; 31:033122. PMID: 33810743. DOI: 10.1063/5.0039193. (Received 12/01/2020; Accepted 02/18/2021)
Abstract
We study the task of determining parameters of dynamical systems from their time series using variations of reservoir computing. Averages of reservoir activations yield a static set of random features that allows us to separate different parameter values. We study such random feature models in the time and frequency domain. For the Lorenz and Rössler systems throughout stable and chaotic regimes, we achieve accurate and robust parameter extraction. For vibration data of centrifugal pumps, we find a significant ability to recover the operating regime. While the time domain models achieve higher performance for the numerical systems, the frequency domain models are superior in the application context.
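The static random-feature idea can be sketched on a toy problem: time-averaged nonlinear features of a series are regressed linearly onto the parameter that generated it. The logistic map, the feature count, and all values below are assumptions for illustration; they are not the Lorenz/Rössler or pump-vibration settings of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.normal(size=(20, 1))          # fixed random feature weights
b = rng.uniform(-1, 1, size=20)       # fixed random feature biases

def features(series):
    """Time-average of random tanh features: a static descriptor of the series."""
    s = np.asarray(series)[:, None]
    return np.tanh(s @ W.T + b).mean(axis=0)

def logistic(r, n=5000, x0=0.3):
    x = x0
    for _ in range(100):               # discard the transient
        x = r * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return out

# fit a linear map from averaged features to the generating parameter r
r_train = np.linspace(3.5, 3.9, 41)
A = np.stack([np.append(features(logistic(r)), 1.0) for r in r_train])
w = np.linalg.lstsq(A, r_train, rcond=None)[0]

est = float(np.append(features(logistic(3.715)), 1.0) @ w)  # held-out parameter
```

Because the features are time averages, the regression sees one static vector per series, which is what makes this a "static set of random features" rather than a full temporal model.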
Affiliation(s)
- Braden Thorne
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Thomas Jüngling
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Michael Small
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Melinda Hodkiewicz
- ARC Centre for Transforming Maintenance Through Data Science, The University of Western Australia, Crawley, Western Australia 6009, Australia
12
Han X, Zhao Y, Small M. Revisiting the memory capacity in reservoir computing of directed acyclic network. Chaos 2021; 31:033106. PMID: 33810761. DOI: 10.1063/5.0040251. (Received 12/11/2020; Accepted 02/05/2021)
Abstract
Reservoir computing (RC) is an attractive area of research by virtue of its potential for hardware implementation and low training cost. An intriguing research direction in this field is to interpret the underlying dynamics of an RC model by analyzing its short-term memory property, which can be quantified by a global index: the memory capacity (MC). In this paper, the global MC of an RC whose reservoir network is specified as a directed acyclic network (DAN) is examined. We first show that its global MC is theoretically bounded by the length of the longest path of the reservoir DAN. Since the global MC is technically influenced by the model hyperparameters, the dependency of the MC on the hyperparameters of this RC is then explored in detail. Further, we employ an improved conventional network embedding method (i.e., struc2vec) to mine the underlying memory community in the reservoir DAN, which can be regarded as the cluster of reservoir nodes with the same memory profile. Experimental results demonstrate that such a memory community structure provides a concrete interpretation of the global MC of this RC. Finally, the clustered RC is proposed by exploiting the detected memory community structure of the DAN; its prediction performance is verified to be enhanced, with lower training cost, compared with other RC models on several chaotic time series benchmarks.
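The memory capacity index itself can be estimated numerically for any reservoir: drive it with i.i.d. input, fit one linear readout per delay k to reconstruct u(t−k), and sum the squared correlation coefficients. The ESN below is a generic stand-in (an assumption), not the DAN reservoir studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def memory_capacity(X, u, k_max=30, beta=1e-6):
    """MC = sum over delays k of the squared correlation between u(t-k)
    and its best linear reconstruction from the reservoir state x(t)."""
    mc = 0.0
    for k in range(1, k_max + 1):
        S, target = X[k:], u[:-k]          # state at t paired with u(t-k)
        w = np.linalg.solve(S.T @ S + beta * np.eye(S.shape[1]), S.T @ target)
        mc += np.corrcoef(S @ w, target)[0, 1] ** 2
    return mc

# small echo state network driven by i.i.d. input
n, T = 50, 3000
W = rng.normal(size=(n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9
W_in = rng.uniform(-0.1, 0.1, size=n)
u = rng.uniform(-1, 1, size=T)
x, X = np.zeros(n), np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x
mc = memory_capacity(X, u)   # bounded above by the number of reservoir nodes
```

For a DAN reservoir, the paper's bound says `mc` cannot exceed the longest path length, which this estimator would make directly visible.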
Affiliation(s)
- Xinyu Han
- Harbin Institute of Technology, Shenzhen, 518055 Guangdong, China
- Yi Zhao
- Harbin Institute of Technology, Shenzhen, 518055 Guangdong, China
- Michael Small
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009, Australia
13
Flynn A, Tsachouridis VA, Amann A. Multifunctionality in a reservoir computer. Chaos 2021; 31:013125. PMID: 33754772. DOI: 10.1063/5.0019974. (Received 06/26/2020; Accepted 12/18/2020)
Abstract
Multifunctionality is a well-observed phenomenological feature of biological neural networks and is considered to be of fundamental importance to the survival of certain species over time. These multifunctional neural networks are capable of performing more than one task without changing any network connections. In this paper, we investigate how this neurological idiosyncrasy can be achieved in an artificial setting with a modern machine learning paradigm known as "reservoir computing." A training technique is designed to enable a reservoir computer to perform tasks of a multifunctional nature. We explore the critical effects that changes in certain parameters can have on the reservoir computer's ability to express multifunctionality. We also expose the existence of several "untrained attractors," that is, attractors that dwell within the prediction state space of the reservoir computer but were not part of the training. We conduct a bifurcation analysis of these untrained attractors and discuss the implications of our results.
Affiliation(s)
- Andrew Flynn
- School of Mathematical Sciences, University College Cork, Cork T12 XF62, Ireland
- Andreas Amann
- School of Mathematical Sciences, University College Cork, Cork T12 XF62, Ireland
14
Chen X, Weng T, Yang H, Gu C, Zhang J, Small M. Mapping topological characteristics of dynamical systems into neural networks: A reservoir computing approach. Phys Rev E 2020; 102:033314. PMID: 33075895. DOI: 10.1103/physreve.102.033314. (Received 03/18/2020; Accepted 09/09/2020)
Abstract
Significant advances have recently been made in modeling chaotic systems with the reservoir computing approach, especially for prediction. We find that although the state prediction of the trained reservoir computer will gradually deviate from the actual trajectory of the original system, the associated geometric features remain invariant. Specifically, we show that typical geometric metrics, including the correlation dimension, the multiscale entropy, and the memory effect, are nearly identical between the trained reservoir computer and the chaotic system it has learned. We further demonstrate this fact on a broad range of chaotic systems, ranging from discrete and continuous chaotic systems to hyperchaotic systems. Our findings suggest that a successfully trained reservoir computer may be topologically conjugate to the observed dynamical system.
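One of the metrics compared above, the correlation dimension, can be estimated with the classic Grassberger–Procaccia procedure. The sketch below applies it to a synthetic planar point set (an assumption, chosen purely so the expected dimension is obvious) rather than to a reservoir trajectory.

```python
import numpy as np

rng = np.random.default_rng(5)

def correlation_dimension(points, radii):
    """Grassberger-Procaccia estimate: slope of log C(r) vs log r,
    where C(r) is the fraction of point pairs closer than r."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    pairs = d[np.triu_indices(len(points), k=1)]
    C = np.array([(pairs < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# uniform points on a unit square embedded in 3-D: the dimension should be near 2
pts = np.zeros((1500, 3))
pts[:, :2] = rng.uniform(size=(1500, 2))
radii = np.logspace(-1.5, -0.5, 10)
dim = correlation_dimension(pts, radii)
```

Running the same estimator on a long free-running prediction of a trained reservoir and on the true attractor, and comparing the two slopes, is the kind of check the abstract describes.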
Affiliation(s)
- Xiaolu Chen
- Business School, University of Shanghai for Science and Technology, Shanghai 200093, People's Republic of China
- Tongfeng Weng
- Alibaba Research Center for Complexity Sciences, Hangzhou Normal University, Hangzhou 311121, People's Republic of China
- Huijie Yang
- Business School, University of Shanghai for Science and Technology, Shanghai 200093, People's Republic of China
- Changgui Gu
- Business School, University of Shanghai for Science and Technology, Shanghai 200093, People's Republic of China
- Jie Zhang
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, People's Republic of China
- Michael Small
- Complex Systems Group, Department of Mathematics and Statistics, The University of Western Australia, Crawley, Western Australia 6009, Australia
- Mineral Resources, CSIRO, Kensington, Western Australia 6151, Australia
15
Lu Z, Bassett DS. Invertible generalized synchronization: A putative mechanism for implicit learning in neural systems. Chaos 2020; 30:063133. PMID: 32611103. DOI: 10.1063/5.0004344. (Received 02/10/2020; Accepted 05/25/2020)
Abstract
Regardless of the marked differences between biological and artificial neural systems, one fundamental similarity is that they are essentially dynamical systems that can learn to imitate other dynamical systems whose governing equations are unknown. The brain is able to learn the dynamic nature of the physical world via experience; analogously, artificial neural systems such as reservoir computing networks (RCNs) can learn the long-term behavior of complex dynamical systems from data. Recent work has shown that the mechanism of such learning in RCNs is invertible generalized synchronization (IGS). Yet, whether IGS is also the mechanism of learning in biological systems remains unclear. To shed light on this question, we draw inspiration from features of the human brain to propose a general and biologically feasible learning framework that utilizes IGS. To evaluate the framework's relevance, we construct several distinct neural network models as instantiations of the proposed framework. Regardless of their particularities, these neural network models can consistently learn to imitate other dynamical processes with a biologically feasible adaptation rule that modulates the strength of synapses. Further, we observe and theoretically explain the spontaneous emergence of four distinct phenomena reminiscent of cognitive functions: (i) learning multiple dynamics; (ii) switching among the imitations of multiple dynamical systems, either spontaneously or driven by external cues; (iii) filling-in missing variables from incomplete observations; and (iv) deciphering superimposed input from different dynamical systems. Collectively, our findings support the notion that biological neural networks can learn the dynamic nature of their environment through the mechanism of IGS.
Affiliation(s)
- Zhixin Lu
- Department of Bioengineering, School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, Pennsylvania 19104, USA
- Danielle S Bassett
- Department of Bioengineering, School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, Pennsylvania 19104, USA
16
Tang Y, Kurths J, Lin W, Ott E, Kocarev L. Introduction to Focus Issue: When machine learning meets complex systems: Networks, chaos, and nonlinear dynamics. Chaos 2020; 30:063151. PMID: 32611112. DOI: 10.1063/5.0016505. (Received 06/04/2020; Accepted 06/05/2020)
Affiliation(s)
- Yang Tang
- Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai, China
- Jürgen Kurths
- Potsdam Institute for Climate Impact Research, Potsdam 14473, Germany
- Wei Lin
- Center for Computational Systems Biology of ISTBI and Research Institute of Intelligent Complex Systems, Fudan University, Shanghai 200433, China
- Edward Ott
- Department of Physics, University of Maryland, College Park, Maryland 20742, USA
- Ljupco Kocarev
- Macedonian Academy of Sciences and Arts, 1000 Skopje, Macedonia