1
Stability of Solutions to Systems of Nonlinear Differential Equations with Discontinuous Right-Hand Sides: Applications to Hopfield Artificial Neural Networks. Mathematics 2022. [DOI: 10.3390/math10091524]
Abstract
In this paper, we study the stability of solutions to systems of differential equations with discontinuous right-hand sides, covering both linear and nonlinear equations. For linear systems, sufficient stability conditions are expressed through the logarithmic norm of the coefficient matrix; for nonlinear systems, through the logarithmic norm of the Jacobian of the right-hand side. These conditions have the following advantages: (1) investigating stability in different metrics from the same standpoint yields a family of sufficient conditions; (2) the conditions are easy to state and verify; (3) robustness regions of the systems with respect to parameter variations are easily determined; (4) in the case of impulse action, no information on the distribution of impulse moments is required; (5) the method extends to other definitions of stability (in particular, to p-moment stability). The sufficient conditions obtained are applied to Hopfield neural networks with discontinuous synapses and discontinuous activation functions.
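The logarithmic-norm criteria summarized above lend themselves to a direct numerical check. The sketch below is an illustration, not code from the paper: it computes the three standard logarithmic norms of a matrix with NumPy, and a linear system x' = Ax is exponentially stable whenever any one of them is negative.

```python
import numpy as np

def log_norm_2(A):
    # mu_2(A): largest eigenvalue of the symmetric part (A + A^T)/2
    return float(np.max(np.linalg.eigvalsh((A + A.T) / 2.0)))

def log_norm_inf(A):
    # mu_inf(A): max over rows i of a_ii + sum_{j != i} |a_ij|
    n = A.shape[0]
    return float(max(A[i, i] + sum(abs(A[i, j]) for j in range(n) if j != i)
                     for i in range(n)))

def log_norm_1(A):
    # mu_1(A): max over columns j of a_jj + sum_{i != j} |a_ij|
    # (the column version is the row version applied to A^T)
    return log_norm_inf(A.T)

# Example matrix: all three logarithmic norms are negative,
# so x' = A x is exponentially stable by any of the criteria.
A = np.array([[-3.0, 1.0],
              [0.5, -2.0]])
print(log_norm_2(A) < 0, log_norm_inf(A) < 0, log_norm_1(A) < 0)
```

Note that the three norms can disagree on a given matrix; the criterion only requires one of them to be negative, which is what makes the family of conditions useful.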
2
Jeon PR, Hong MS, Braatz RD. Compact Neural Network Modeling of Nonlinear Dynamical Systems via the Standard Nonlinear Operator Form. Comput Chem Eng 2022. [DOI: 10.1016/j.compchemeng.2022.107674]
3
Xie X, Lam J, Fan C. Robust time-weighted guaranteed cost control of uncertain periodic piecewise linear systems. Inf Sci (N Y) 2018. [DOI: 10.1016/j.ins.2018.05.052]
4
Kim KKK, Patrón ER, Braatz RD. Standard representation and unified stability analysis for dynamic artificial neural network models. Neural Netw 2017; 98:251-262. [PMID: 29287188] [DOI: 10.1016/j.neunet.2017.11.014]
Abstract
An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, which include rigorous upper bounds on the approximation errors. A unified framework and linear matrix inequality-based stability conditions are described for different classes of dynamic artificial neural network models that take additional information into account such as local slope restrictions and whether the nonlinearities within the DANNs are odd. A theoretical example shows reduced conservatism obtained by the conditions.
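The LMI-based analysis described above can be illustrated in a minimal special case. The sketch below is an assumption-laden illustration, not the paper's framework: it checks the basic Lyapunov inequality AᵀP + PA ≺ 0 for a linear system by solving the Lyapunov equation in vectorized (Kronecker) form with NumPy.

```python
import numpy as np

def lyapunov_P(A):
    """Solve A^T P + P A = -I via the vectorized (Kronecker) form."""
    n = A.shape[0]
    I = np.eye(n)
    # Vectorizing A^T P + P A gives (kron(A^T, I) + kron(I, A^T)) vec(P);
    # the sum is the same under row-major or column-major vectorization.
    K = np.kron(A.T, I) + np.kron(I, A.T)
    P = np.linalg.solve(K, -I.flatten()).reshape(n, n)
    return (P + P.T) / 2.0  # symmetrize against round-off

def is_hurwitz(A):
    """A is Hurwitz iff the Lyapunov solution P is positive definite."""
    P = lyapunov_P(A)
    return bool(np.all(np.linalg.eigvalsh(P) > 0))

# Example: eigenvalues -1 and -3, so the system x' = A x is stable.
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
print(is_hurwitz(A))
```

Full DANN stability conditions of the kind surveyed in the paper involve structured LMIs (slope restrictions, multipliers) and are solved with semidefinite programming tools rather than a closed-form Lyapunov solve; this fragment only shows the underlying feasibility test.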
Affiliation(s)
- Kwang-Ki K Kim
- Department of Electrical Engineering, Inha University, Incheon, Republic of Korea.
- Richard D Braatz
- Massachusetts Institute of Technology, Cambridge, MA, United States.
5
Velmurugan G, Rakkiyappan R, Vembarasan V, Cao J, Alsaedi A. Dissipativity and stability analysis of fractional-order complex-valued neural networks with time delay. Neural Netw 2017. [PMID: 27939066] [DOI: 10.1186/s13662-017-1266-3]
Abstract
The notion of dissipativity is an important dynamical property of neural networks, so the analysis of dissipativity of neural networks with time delay has attracted growing research interest. In this paper, the authors establish a class of fractional-order complex-valued neural networks (FCVNNs) with time delay and study their dissipativity as well as their global asymptotic stability. Based on the fractional Halanay inequality and suitable Lyapunov functions, new sufficient conditions are obtained that guarantee the dissipativity of FCVNNs with time delay, and further sufficient conditions are derived that ensure their global asymptotic stability. Finally, two numerical simulations are presented to demonstrate the validity of the main results.
Affiliation(s)
- G Velmurugan
- Department of Mathematics, Bharathiar University, Coimbatore-641 046, Tamil Nadu, India
- R Rakkiyappan
- Department of Mathematics, Bharathiar University, Coimbatore-641 046, Tamil Nadu, India.
- V Vembarasan
- Department of Mathematics, SSN College of Engineering, Chennai-600 004, Tamil Nadu, India
- Jinde Cao
- Department of Mathematics, and Research Center for Complex Systems and Network Sciences, Southeast University, Nanjing 210096, Jiangsu, China; Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589, Saudi Arabia.
- Ahmed Alsaedi
- Nonlinear Analysis and Applied Mathematics (NAAM) Research Group, Department of Mathematics, King Abdulaziz University, Jeddah 21589, Saudi Arabia
6
Lu W, Zheng R, Chen T. Centralized and decentralized global outer-synchronization of asymmetric recurrent time-varying neural network by data-sampling. Neural Netw 2016; 75:22-31. [DOI: 10.1016/j.neunet.2015.11.006]
7
Abstract
Supervised learning in recurrent neural networks involves two processes: the neuron activity from which gradients are estimated and the process on connection parameters induced by these measurements. A problem such algorithms must address is how to balance the relative rates of these activities so that accurate sensitivity estimates are obtained while still allowing synaptic modification to take place at a rate sufficient for learning. We show how to calculate a sufficient timescale separation between these two processes for a class of contracting neural networks.
Affiliation(s)
- Thomas Flynn
- Graduate Center, City University of New York, New York, NY 10016, U.S.A.
8
Zhao Y, Feng Z, Ding W. Existence and stability of periodic solution of impulsive neural systems with complex deviating arguments. J Biol Dyn 2014; 9 Suppl 1:291-306. [PMID: 25397685] [DOI: 10.1080/17513758.2014.978401]
Abstract
This paper discusses a class of impulsive neural networks with variable delay and complex deviating arguments. Using Mawhin's continuation theorem of coincidence degree and Halanay-type inequalities, several sufficient conditions are established for the existence and global exponential stability of periodic solutions. Furthermore, the obtained results are applied to some typical impulsive neural network systems as special cases, with a real-life example to show the feasibility of the results.
Affiliation(s)
- Yong Zhao
- a School of Mathematics and Information Science , Henan Polytechnic University , Jiaozuo 454000 , People's Republic of China
9
Liu PL. Improved delay-dependent stability of neutral type neural networks with distributed delays. ISA Trans 2013; 52:717-724. [PMID: 23871149] [DOI: 10.1016/j.isatra.2013.06.012]
Abstract
This paper deals with improved delay-dependent stability analysis of neutral-type neural networks with distributed delays. The stability conditions are derived in terms of linear matrix inequalities (LMIs), which are easily checked by recently developed algorithms for solving LMIs. Finally, numerical examples demonstrate the effectiveness of the proposed method.
Affiliation(s)
- Pin-Lin Liu
- Department of Automation Engineering Institute of Mechatronoptic System, Chienkuo Technology University, Changhua 500, Taiwan, ROC.
10
Zhao Y, Xia Y, Lu Q. Stability analysis of a class of general periodic neural networks with delays and impulses. Int J Neural Syst 2011; 19:375-86. [DOI: 10.1142/s012906570900204x]
Abstract
Based on the inequality analysis, matrix theory and spectral theory, a class of general periodic neural networks with delays and impulses is studied. Some sufficient conditions are established for the existence and globally exponential stability of a unique periodic solution. Furthermore, the results are applied to some typical impulsive neural network systems as special cases, with a real-life example to show feasibility of our results.
Affiliation(s)
- Yong Zhao
- Department of Dynamics and Control, Beihang University, Beijing 100191, China
- Yonghui Xia
- Department of Mathematics, Zhejiang Normal University, Jinhua, 210034, China
- Qishao Lu
- Department of Dynamics and Control, Beihang University, Beijing 100191, China
11
12
Fang Y, Cohen MA, Kincaid TG. Dynamic analysis of a general class of winner-take-all competitive neural networks. IEEE Trans Neural Netw 2010; 21:771-83. [DOI: 10.1109/tnn.2010.2041671]
13
14
Feng JE, Xu S, Zou Y. Delay-dependent stability of neutral type neural networks with distributed delays. Neurocomputing 2009. [DOI: 10.1016/j.neucom.2008.10.018]
15
Qiao C, Xu Z. A critical global convergence analysis of recurrent neural networks with general projection mappings. Neurocomputing 2009. [DOI: 10.1016/j.neucom.2008.06.006]
16
Juang JC. Stability analysis of Hopfield-type neural networks. IEEE Trans Neural Netw 2008; 10:1366-74. [PMID: 18252637] [DOI: 10.1109/72.809081]
Abstract
The paper applies several concepts from robust control research, such as linear matrix inequalities, the edge theorem, parameter-dependent Lyapunov functions, and Popov criteria, to investigate the stability of Hopfield-type neural networks. The existence and uniqueness of an equilibrium is formulated as a matrix determinant problem, and an induction scheme is used to find the equilibrium. To verify whether the determinant is nonzero for a class of matrices, a numerical range test is proposed. Several robust control techniques, in particular linear matrix inequalities, are used to characterize the local stability of the neural networks around the equilibrium. The global stability of the Hopfield neural networks is then addressed using a parameter-dependent Lyapunov function technique. These results are shown to generalize existing results on the existence/uniqueness of the equilibrium and the local/global stability of Hopfield-type neural networks.
Affiliation(s)
- J C Juang
- Department of Electrical Engineering, National Cheng Kung University, Tainan, Taiwan
17
Liang XB. Equivalence between local exponential stability of the unique equilibrium point and global stability for Hopfield-type neural networks with two neurons. IEEE Trans Neural Netw 2008; 11:1194-6. [PMID: 18249846] [DOI: 10.1109/72.870051]
Abstract
In a recent paper, Fang and Kincaid proposed an open problem about the relationship between the local stability of the unique equilibrium point and the global stability for a Hopfield-type neural network with continuously differentiable and monotonically increasing activation functions. As a partial answer to the question, in the two-neuron case it is proved that for each given specific interconnection weight matrix, a Hopfield-type neural network has a unique equilibrium point which is also locally exponentially stable for any activation functions and for any other network parameters if and only if the network is globally asymptotically stable for any activation functions and for any other network parameters. If the derivatives of the activation functions of the network are bounded, then the network is globally exponentially stable for any activation functions and for any other network parameters.
18
Hong Q, Peng J, Xu ZB, Zhang B. A reference model approach to stability analysis of neural networks. IEEE Trans Syst Man Cybern B 2008; 33:925-36. [PMID: 18238244] [DOI: 10.1109/tsmcb.2002.804368]
Abstract
In this paper, a novel methodology called a reference model approach to stability analysis of neural networks is proposed. The core of the new approach is to study a neural network model with reference to other related models, so that different modeling approaches can be used in combination and powerfully cross-fertilized. Focusing on two representative neural network modeling approaches (the neuron state modeling approach and the local field modeling approach), we establish a rigorous theoretical basis for the feasibility and efficiency of the reference model approach. The new approach has been used to develop a series of new, generic stability theories for various neural network models. These results have been applied to several typical neural network systems, including Hopfield-type neural networks, recurrent back-propagation neural networks, BSB-type neural networks, bound-constraints optimization neural networks, and cellular neural networks. The results obtained unify, sharpen, or generalize most of the existing stability assertions, and illustrate the feasibility and power of the new method.
Affiliation(s)
- Qiao Hong
- Dept. of Comput., Univ. of Manchester Inst. of Sci. & Technol., UK
19
Xia Y, Cao J, Cheng SS. Global exponential stability of delayed cellular neural networks with impulses. Neurocomputing 2007. [DOI: 10.1016/j.neucom.2006.08.005]
20
Patan K. Stability Analysis and the Stabilization of a Class of Discrete-Time Dynamic Neural Networks. IEEE Trans Neural Netw 2007; 18:660-73. [PMID: 17526334] [DOI: 10.1109/tnn.2007.891199]
Abstract
This paper deals with problems of stability and the stabilization of discrete-time neural networks. Neural structures under consideration belong to the class of the so-called locally recurrent globally feedforward networks. The single processing unit possesses dynamic behavior. It is realized by introducing into the neuron structure a linear dynamic system in the form of an infinite impulse response filter. In this way, a dynamic neural network is obtained. It is well known that the crucial problem with neural networks of the dynamic type is stability as well as stabilization in learning problems. The paper formulates stability conditions for the analyzed class of neural networks. Moreover, a stabilization problem is defined and solved as a constrained optimization task. In order to tackle this problem two methods are proposed. The first one is based on a gradient projection (GP) and the second one on a minimum distance projection (MDP). It is worth noting that these methods can be easily introduced into the existing learning algorithm as an additional step, and suitable convergence conditions can be developed for them. The efficiency and usefulness of the proposed approaches are justified by using a number of experiments including numerical complexity analysis, stabilization effectiveness, and the identification of an industrial process.
Affiliation(s)
- Krzysztof Patan
- Institute of Control and Computation Engineering, University of Zielona Góra, Zielona Góra 65-246, Poland.
21
Liao X, Wang L, Yu P. Stability of Dynamical Systems. Monograph Series on Nonlinear Science and Complexity 2007. [DOI: 10.1016/s1574-6917(07)05001-5]
22
Hu S, Wang J. Global robust stability of a class of discrete-time interval neural networks. IEEE Trans Circuits Syst I 2006. [DOI: 10.1109/tcsi.2005.854288]
23
Vanualailai J, Nakagiri SI. Some generalized sufficient convergence criteria for nonlinear continuous neural networks. Neural Comput 2005; 17:1820-35. [PMID: 15969919] [DOI: 10.1162/0899766054026701]
Abstract
A reason for applying the direct method of Lyapunov to artificial neural networks (ANNs) is to design dynamical neural networks so that they exhibit global asymptotic stability. Lyapunov functions that frequently appear in the ANN literature include the quadratic function, the Persidskii function, and the Lur'e-Postnikov function. This contribution revisits the quadratic function and shows that, via Krasovskii-like stability criteria, a very simple and systematic procedure yields not only new and generalized results but also well-known sufficient conditions for convergence established recently by non-Lyapunov methods, such as the matrix measure and the nonlinear measure.
Affiliation(s)
- Jito Vanualailai
- Department of Mathematics and Computing Science, University of the South Pacific, Suva, Fiji.
24
Pastravanu O, Matcovschi MH. Absolute componentwise stability of interval Hopfield neural networks. IEEE Trans Syst Man Cybern B 2005; 35:136-41. [PMID: 15719942] [DOI: 10.1109/tsmcb.2004.839246]
25
Truccolo WA, Rangarajan G, Chen Y, Ding M. Analyzing stability of equilibrium points in neural networks: a general approach. Neural Netw 2004; 16:1453-60. [PMID: 14622876] [DOI: 10.1016/s0893-6080(03)00136-9]
Abstract
Networks of coupled neural systems represent an important class of models in computational neuroscience. In some applications it is required that equilibrium points in these networks remain stable under parameter variations. Here we present a general methodology to yield explicit constraints on the coupling strengths to ensure the stability of the equilibrium point. Two models of coupled excitatory-inhibitory oscillators are used to illustrate the approach.
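A generic version of such an equilibrium stability check can be sketched as follows, using a hypothetical two-unit network rather than one of the paper's excitatory-inhibitory oscillator models: linearize the vector field numerically at the equilibrium and test whether all Jacobian eigenvalues lie in the open left half-plane.

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f at x."""
    n = x.size
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

def equilibrium_is_stable(f, x_eq):
    """Local asymptotic stability: all Jacobian eigenvalues have negative real part."""
    J = numerical_jacobian(f, x_eq)
    return bool(np.all(np.linalg.eigvals(J).real < 0))

# Hypothetical two-unit network x' = -x + W tanh(x); W and the equilibrium
# at the origin are illustrative choices, not taken from the paper.
W = np.array([[0.5, -1.0],
              [1.0, 0.5]])
f = lambda x: -x + W @ np.tanh(x)
print(equilibrium_is_stable(f, np.zeros(2)))
```

Sweeping the coupling entries of W and repeating the test is one crude way to map out the stability region that the paper's explicit constraints describe analytically.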
Affiliation(s)
- Wilson A Truccolo
- Department of Neuroscience, Brown University, Providence, RI 02912, USA.
26
Abstract
The neuron state modeling approach and the local field modeling approach provide two fundamental approaches to neural network research, under which a neural network system can be formulated either as a static neural network model or as a local field neural network model. These two models are theoretically compared in terms of their trajectory transformation property, equilibrium correspondence property, nontrivial attractive manifold property, global convergence, and stability in many different senses. The comparison reveals an important stability invariance property of the two models: the stability (in any sense) of the static model is equivalent to that of a subsystem deduced from the local field model when restricted to a specific manifold. This stability invariance property lays a sound theoretical foundation for a useful, cross-fertilization-type stability analysis methodology for various neural network models.
Affiliation(s)
- Zong-Ben Xu
- Institute for Information and System Sciences, Xi'an Jiaotong University, Xi'an, China.
27
Feng C, Plamondon R. Stability analysis of bidirectional associative memory networks with time delays. IEEE Trans Neural Netw 2003; 14:1560-5. [DOI: 10.1109/tnn.2003.820829]
28
Liao X, Wang J. Global dissipativity of continuous-time recurrent neural networks with time delay. Phys Rev E 2003; 68:016118. [PMID: 12935211] [DOI: 10.1103/physreve.68.016118]
Abstract
This paper addresses the global dissipativity of a general class of continuous-time recurrent neural networks. First, the concepts of global dissipation and global exponential dissipation are defined and elaborated. Next, the sets of global dissipativity and global exponential dissipativity are characterized using the parameters of recurrent neural network models. In particular, it is shown that the Hopfield network and cellular neural networks, with or without time delays, are dissipative systems.
Affiliation(s)
- Xiaoxin Liao
- Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei, China
29
Abstract
In this letter, we discuss the dynamics of the Cohen-Grossberg neural networks. We provide a new and relaxed set of sufficient conditions for the Cohen-Grossberg networks to be absolutely stable and globally exponentially stable. We also provide an estimate of the rate of convergence.
Affiliation(s)
- Wenlian Lu
- Laboratory of Nonlinear Science, Institute of Mathematics, Fudan University, Shanghai 200433, P.R. China.
30
Chen T, Lu W, Amari SI. Global convergence rate of recurrently connected neural networks. Neural Comput 2002; 14:2947-57. [PMID: 12487799] [DOI: 10.1162/089976602760805359]
Abstract
We discuss recurrently connected neural networks and investigate their global exponential stability (GES). Some sufficient conditions for a class of recurrent neural networks to possess GES are given, together with a sharp convergence rate.
Affiliation(s)
- Tianping Chen
- Laboratory of Nonlinear Mathematics Science, Institute of Mathematics, Fudan University, Shanghai, China.
31
32
Hu S, Wang J. Global stability of a class of discrete-time recurrent neural networks. IEEE Trans Circuits Syst I 2002. [DOI: 10.1109/tcsi.2002.801284]
33
Hu S, Wang J. Global exponential stability of continuous-time interval neural networks. Phys Rev E 2002; 65:036133. [PMID: 11909191] [DOI: 10.1103/physreve.65.036133]
Abstract
This paper addresses the global robust stability of a class of continuous-time interval neural networks that contain time-invariant uncertain parameters whose values are unknown but bounded in given compact sets. We first introduce the concept of diagonally constrained interval neural networks and present a necessary and sufficient condition for global exponential stability of these interval neural networks, regardless of any bounds on the non-diagonal uncertain parameters in the connection weight matrices. We then extend the robust stability result to general interval neural networks by giving a sufficient condition. Simulation results illustrate the characteristics of the main results.
Affiliation(s)
- Sanqing Hu
- Department of Automation and Computer-Aided Engineering, The Chinese University of Hong Kong, Shatin, New Territories, Hong Kong
34
Abstract
The stability of neural networks is a prerequisite for successful applications of the networks as either associative memories or optimization solvers. Because integration and communication delays are ubiquitous, the stability of neural networks with delays has received extensive attention. However, previous investigations are mainly based on Liapunov's direct method; since the construction of a Liapunov function requires considerable skill, there is little compatibility among the existing results. In this paper, we develop a new approach to the stability analysis of Hopfield-type neural networks with time-varying delays by defining two novel quantities of a nonlinear function, similar to the matrix norm and the matrix measure, respectively. With the new approach, we present sufficient conditions for stability that either generalize existing results or are new. The developed approach may also be applied to general systems with time delays rather than only Hopfield-type neural networks.
Affiliation(s)
- Jigen Peng
- Institute for Information and System Science, Faculty of Science, Xi'an Jiaotong University, PR China.
35
Abstract
In this paper, the stability of delayed neural networks is investigated, including networks with discrete and distributed delays. Dropping the Lipschitzian hypotheses on the output functions, some stability criteria are obtained by the Liapunov functional method. We do not assume symmetry of the connection matrix, and we establish that the system admits a unique equilibrium point even when the output functions do not satisfy Lipschitz conditions and are not required to be differentiable or strictly monotonically increasing. These criteria can be used to analyze the dynamics of biological neural systems or to design globally stable artificial neural networks.
Affiliation(s)
- C Feng
- Laboratoire Scribens, Ecole Polytechnique de Montréal, Québec, Canada.
36
Abstract
In this paper, we discuss dynamical behaviors of recurrently asymmetrically connected neural networks. We propose a new approach to study global convergence of the networks. Better test conditions for global convergence are given.
Affiliation(s)
- T Chen
- Department of Mathematics, Fudan University, Shanghai, PR, China.
37
Liang XB, Si J. Global exponential stability of neural networks with globally Lipschitz continuous activations and its application to linear variational inequality problem. IEEE Trans Neural Netw 2001; 12:349-59. [DOI: 10.1109/72.914529]
38
Qiao H, Peng J, Xu ZB. Nonlinear measures: a new approach to exponential stability analysis for Hopfield-type neural networks. IEEE Trans Neural Netw 2001; 12:360-70. [DOI: 10.1109/72.914530]
39
Abstract
We discuss some delayed dynamical systems, investigating their stability and convergence. We prove that, under mild conditions, these delayed systems are globally exponentially convergent.
Affiliation(s)
- T Chen
- Department of Mathematics, Fudan University, Shanghai, China
40
Fernandez de Cañete J, Barreiro A, Garcia-Cerezo A, Garcia-Moral I. An input-output based robust stabilization criterion for neural-network control of nonlinear systems. IEEE Trans Neural Netw 2001; 12:1491-7. [DOI: 10.1109/72.963785]
41
42
Calvert B, Marinov C. Another K-winners-take-all analog neural network. IEEE Trans Neural Netw 2000; 11:829-38. [DOI: 10.1109/72.857764]
43

44
Guan ZH, Chen G, Qin Y. On equilibria, stability, and instability of Hopfield neural networks. ACTA ACUST UNITED AC 2000; 11:534-40. [PMID: 18249783 DOI: 10.1109/72.839023] [Citation(s) in RCA: 91] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Affiliation(s)
- Z H Guan
- Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei, 430074, China.
45
Abstract
Many systems existing in physics, chemistry, biology, engineering, and information science can be characterized by impulsive dynamics caused by abrupt jumps at certain instants during the process. These complex dynamical behaviors can be modeled by impulsive differential systems or impulsive neural networks. This paper formulates and studies a new model of impulsive autoassociative neural networks. Several fundamental issues, such as global exponential stability and existence and uniqueness of equilibria of such neural networks, are established.
Affiliation(s)
- Z H Guan
- Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei, People's Republic of China.
46
Abstract
This paper is concerned with the asymptotic hyperstability of recurrent neural networks. Based on the stability results, we derive necessary and sufficient conditions on the network parameters. The results are more general than those obtained via Lyapunov methods: they impose milder constraints on the connection weights than conventional results and do not assume symmetry of the weights.
47
Abstract
Many evolutionary processes, particularly some biological systems, exhibit impulsive dynamical behaviors, which can be well described by impulsive Hopfield neural networks. This paper formulates and studies a model of delayed impulsive Hopfield neural networks. Several fundamental issues such as global exponential stability, existence and uniqueness of the equilibrium of such networks are established. A numerical example is given for illustration and interpretation of the theoretical results.
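Dynamics of this kind can be sketched numerically: between impulse instants the state follows the usual Hopfield flow, and at an impulse instant it jumps abruptly. The weights, inputs, impulse map, and timing below are hypothetical illustrations, not the model or numerical example of the cited paper (delays are omitted for brevity):

```python
import math

# Euler simulation of a two-neuron Hopfield network
#   x_i' = -x_i + sum_j w_ij * tanh(x_j) + I_i,
# with one impulsive jump x -> 0.5*x at t = 5 (all values assumed).
W = [[0.0, -0.4], [0.3, 0.0]]    # asymmetric connection weights
I = [0.1, -0.2]                  # constant external inputs
dt, T, t_imp = 0.01, 10.0, 5.0

x = [1.0, -1.0]                  # initial state
t = 0.0
while t < T:
    g = [math.tanh(v) for v in x]
    x = [x[i] + dt * (-x[i] + sum(W[i][j] * g[j] for j in range(2)) + I[i])
         for i in range(2)]
    t += dt
    if abs(t - t_imp) < dt / 2:  # impulse instant: abrupt state jump
        x = [0.5 * v for v in x]

print([round(v, 3) for v in x])  # state has re-approached its equilibrium
```

Despite the abrupt jump, the trajectory returns to the same equilibrium, which is the kind of global exponential stability established for the (delayed) impulsive model.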
Affiliation(s)
- Zhi Hong Guan
- Department of Automatic Control Engineering, Huazhong University of Science and Technology, Wuhan, People's Republic of China
48
Zhenjiang M, Baozong Y. Analysis and optimal design of continuous neural networks with applications to associative memory. Neural Netw 1999; 12:259-271. [PMID: 12662702 DOI: 10.1016/s0893-6080(98)00118-x] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
The asymptotic stability of a continuous neural network is analyzed for associative memory. An optimal design method is proposed which ensures the highest associative memory speed and guarantees the storage of each desired memory with attractivity. The network asymptotic stability is analyzed by means of a new energy function, and four theorems are obtained. By comparing these theorems with existing ones, it can be shown that in some cases they are consistent, while in others they are not equivalent but complementary to each other. Further study results in two more generalized conclusions, of which the existing conclusions are special cases. The network optimal design method is proposed in terms of an optimal associative memory theorem. Two application examples are presented to demonstrate the effectiveness of the optimal design method, which can be used to design the network for many applications.
Affiliation(s)
- Miao Zhenjiang
- National Research Council of Canada, Institute for Information Technology, 1500 Montreal Rd., Bldg. M-50, Ottawa, Canada
49
Abstract
We establish robust stability results for a specific type of artificial neural network for associative memories under parameter perturbations, and determine conditions that ensure that the asymptotically stable equilibria of the perturbed neural system are also asymptotically stable equilibria of the original unperturbed network. The proposed stability analysis tool is sliding mode control, which simplifies the analysis by considering only a reduced-order system, rather than the original one with time-dependent external stimuli.
Affiliation(s)
- A Meyer-Bäse
- University of Florida, Department of Electrical and Computer Engineering, Gainesville 32611-6130, USA.