801
Abstract
Feedforward neural networks with a single hidden layer of normalized Gaussian units are studied. It is proved that such networks are capable of universal approximation in a satisfactory sense. A hybrid learning rule in the style of Moody and Darken, combining unsupervised learning of the hidden units with supervised learning of the output units, is then considered. Using the method of ordinary differential equations for adaptive algorithms (the ODE method), it is shown that the asymptotic properties of the learning rule can be studied in terms of an autonomous cascade of dynamical systems. Recent results of Hirsch on cascades are then used to establish the asymptotic stability of the learning rule.
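The two-stage scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's construction: the toy sine-regression target, the k-means centre placement, and the width choice `sigma = 0.5` are all assumptions; only the normalized Gaussian hidden layer (activations summing to one) and the split into an unsupervised stage for the hidden units and a supervised stage for the output weights come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (an assumption): learn y = sin(x) on [0, 2*pi].
X = rng.uniform(0.0, 2 * np.pi, size=(200, 1))
y = np.sin(X[:, 0])

def kmeans(X, k, iters=50):
    """Unsupervised stage: place hidden-unit centres with plain k-means."""
    centres = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centres[j] = members.mean(axis=0)
    return centres

def normalized_gaussian(X, centres, sigma):
    """Normalized Gaussian activations: each row is a partition of unity."""
    d2 = ((X[:, None, :] - centres[None]) ** 2).sum(-1)
    g = np.exp(-d2 / (2 * sigma ** 2))
    return g / g.sum(axis=1, keepdims=True)

k, sigma = 10, 0.5                            # hypothetical choices for this toy problem
centres = kmeans(X, k)                        # unsupervised: hidden layer
Phi = normalized_gaussian(X, centres, sigma)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # supervised: output weights

rmse = np.sqrt(np.mean((Phi @ w - y) ** 2))
```

A batch least-squares fit stands in here for the online supervised update that an adaptive-algorithm analysis would consider; the cascade structure in the abstract arises because the centre dynamics do not depend on the output weights, while the weight dynamics depend on the centres.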
Affiliation(s)
- Michel Benaim
- Department of Mathematics, University of California at Berkeley, Berkeley, CA 94720 USA
802
Abstract
The detection of novel or abnormal input vectors is important in many monitoring tasks, such as fault detection in complex systems and the detection of abnormal patterns in medical diagnostics. We have developed a robust method for novelty detection that aims to minimize the number of heuristically chosen thresholds in the novelty decision process. We achieve this by growing a Gaussian mixture model to form a representation of a training set of “normal” system states. When previously unseen data are screened for novelty, the same threshold that was used during training defines the novelty decision boundary. We show on a sample problem in medical signal processing that this method provides robust novelty decision boundaries, and we apply the technique to the detection of epileptic seizures within a data record.
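The core idea, a Gaussian mixture density fitted to "normal" data with a novelty threshold derived from the training set itself, can be sketched as below. The sketch makes several simplifying assumptions not taken from the abstract: a fixed two-component mixture rather than a grown one, isotropic components fitted by plain EM, 1-D synthetic data, and a 1st-percentile threshold on training log-likelihoods.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Normal" training data (an assumption): two well-separated 1-D clusters.
train = np.concatenate([rng.normal(-2, 0.3, 300), rng.normal(2, 0.3, 300)])[:, None]

def fit_gmm(X, k, iters=100):
    """Fit an isotropic Gaussian mixture by EM (means initialized over sorted data)."""
    n, d = X.shape
    order = np.argsort(X[:, 0])
    mu = X[order[np.linspace(0, n - 1, k).astype(int)]].astype(float).copy()
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities, shifted by the row maximum for stability.
        d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        logp = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights, means, and variances.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * d2).sum(axis=0) / (d * nk) + 1e-6
    return pi, mu, var

def log_likelihood(X, pi, mu, var):
    """Per-sample mixture log-likelihood via log-sum-exp."""
    d = X.shape[1]
    d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
    comp = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
    m = comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(comp - m).sum(axis=1, keepdims=True)))[:, 0]

pi, mu, var = fit_gmm(train, k=2)

# The novelty threshold is fixed from the training data itself (here the 1st
# percentile of training log-likelihoods), not re-tuned at screening time.
threshold = np.percentile(log_likelihood(train, pi, mu, var), 1)

def is_novel(x):
    return log_likelihood(np.atleast_2d(np.asarray(x, dtype=float)), pi, mu, var) < threshold
```

Screening then reduces to a single density comparison: a point far from both clusters, such as `x = 10.0`, falls below the training-derived threshold and is flagged as novel, while a point near a cluster centre is not.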
Affiliation(s)
- Stephen Roberts
- Neural Network Research Group, Department of Engineering Science, University of Oxford, Oxford, UK
- Lionel Tarassenko
- Neural Network Research Group, Department of Engineering Science, University of Oxford, Oxford, UK
803
Xu L, Krzyżak A, Yuille A. On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size. Neural Netw 1994. [DOI: 10.1016/0893-6080(94)90040-x] [Citation(s) in RCA: 105] [Impact Index Per Article: 3.4]
805
Abstract
This paper concerns conditions for the approximation of functions in certain general spaces using radial-basis-function networks. It has been shown in recent papers that certain classes of radial-basis-function networks are broad enough for universal approximation. In this paper these results are considerably extended and sharpened.
Affiliation(s)
- Jooyoung Park
- Department of Electrical and Computer Engineering, The University of Texas at Austin, Austin, TX 78712 USA
- Irwin W. Sandberg
- Department of Electrical and Computer Engineering, The University of Texas at Austin, Austin, TX 78712 USA
807
Leonard J, Kramer M, Ungar L. Using radial basis functions to approximate a function and its error bounds. IEEE Trans Neural Netw 1992; 3:624-7. [DOI: 10.1109/72.143377] [Citation(s) in RCA: 176] [Impact Index Per Article: 5.3]