1. Gao S, Zhou M, Wang Z, Sugiyama D, Cheng J, Wang J, Todo Y. Fully Complex-Valued Dendritic Neuron Model. IEEE Transactions on Neural Networks and Learning Systems 2023;34:2105-2118. PMID: 34487498. DOI: 10.1109/tnnls.2021.3105901.
Abstract
The single dendritic neuron model (DNM), which captures the nonlinear information-processing ability of dendrites, has been widely used for classification and prediction. Complex-valued neural networks built from multiple or deep layers of McCulloch-Pitts neurons have achieved great success since neural computing was first applied to signal processing, yet complex-valued representations have not appeared in single-neuron architectures. This article extends the DNM from the real-valued domain to the complex-valued domain. The performance of the complex-valued DNM (CDNM) is evaluated on a complex XOR problem, a non-minimum-phase equalization problem, and a real-world wind prediction task. A comparative analysis of a set of elementary transcendental functions as activation functions is also carried out, along with preparatory experiments for determining hyperparameters. The experimental results indicate that the proposed CDNM significantly outperforms the real-valued DNM, a complex-valued multilayer perceptron, and other complex-valued neuron models.
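For orientation, the structure described in this abstract (synaptic nonlinearity, multiplicative dendritic branches, membrane summation, soma) can be sketched in a few lines. The sketch below is a minimal NumPy illustration with an assumed fully complex tanh activation and an arbitrary branch count; it is not the authors' implementation, and the function and parameter names (complex_dnm_forward, k, ks, theta_s) are placeholders.

```python
import numpy as np

def complex_dnm_forward(x, w, theta, k=5.0, ks=5.0, theta_s=0.5):
    """Forward pass of a single dendritic neuron with complex weights.

    x     : (I,)   complex input vector
    w     : (M, I) complex synaptic weights, one row per dendritic branch
    theta : (M, I) complex synaptic thresholds
    The branch/soma structure mirrors the real-valued DNM; the fully
    complex tanh activation is an illustrative choice (assumption).
    """
    # Synaptic layer: elementwise fully complex sigmoid-like nonlinearity.
    y = np.tanh(k * (w * x - theta))          # shape (M, I)
    # Dendritic layer: multiplicative interaction along each branch.
    z = np.prod(y, axis=1)                    # shape (M,)
    # Membrane layer: sum the branch responses.
    v = np.sum(z)                             # scalar
    # Soma: final fully complex nonlinearity.
    return np.tanh(ks * (v - theta_s))

rng = np.random.default_rng(0)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
w = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
theta = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
print(complex_dnm_forward(x, w, theta))
```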
2. Chen Y, Liang J, Wu Y, He B, Lin L, Wang Y. Self-Regulating and Self-Perception Particle Swarm Optimization with Mutation Mechanism. J Intell Robot Syst 2022. DOI: 10.1007/s10846-022-01627-y.
3. Xiao J, Jia Y, Jiang X, Wang S. Circular Complex-Valued GMDH-Type Neural Network for Real-Valued Classification Problems. IEEE Transactions on Neural Networks and Learning Systems 2020;31:5285-5299. PMID: 32078563. DOI: 10.1109/tnnls.2020.2966031.
Abstract
Recently, applications of complex-valued neural networks (CVNNs) to real-valued classification problems have attracted significant attention. However, most existing CVNNs are black-box models with poor interpretability. This study extends the real-valued group method of data handling (RGMDH)-type neural network to the complex field and constructs a circular complex-valued group method of data handling (C-CGMDH)-type neural network, which is a white-box model. First, a complex least squares method is proposed for parameter estimation. Second, a new complex-valued symmetric regularity criterion is constructed with a logarithmic function that explicitly represents the magnitude and phase of the actual and predicted complex outputs; it is used to evaluate and select the intermediate candidate models. Furthermore, this new complex-valued external criterion is proven to have properties similar to those of the real-valued external criterion. Before training the model, a circular transformation is used to map the real-valued input features to the complex field. Twenty-five real-valued classification data sets from the UCI Machine Learning Repository are used in the experiments. The results show that both the RGMDH and C-CGMDH models can select the most important features from the complete feature space through a self-organizing modeling process. Compared with RGMDH, the C-CGMDH model converges faster and selects fewer features. Furthermore, its classification performance is statistically significantly better than that of the benchmark complex-valued and real-valued models. Regarding time complexity, the C-CGMDH model is comparable with other models on data sets with few features. Finally, we demonstrate that the GMDH-type neural network can be interpretable.
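Two ingredients mentioned above lend themselves to a short illustration: the circular transformation of real-valued features into the complex field, and complex least squares for parameter estimation. In the sketch below, the mapping exp(i*pi*x) on min-max-normalised features is an assumed form of the circular transformation (the paper's exact mapping may differ), and the solver is the standard complex normal-equations solution rather than the C-CGMDH training procedure.

```python
import numpy as np

def circular_transform(X):
    """Map real features (n_samples, n_features) onto the unit circle.
    Features are min-max normalised to [0, 1] and then phase-encoded.
    The exact mapping used in the C-CGMDH paper may differ (assumption)."""
    span = X.max(axis=0) - X.min(axis=0) + 1e-12
    Xn = (X - X.min(axis=0)) / span
    return np.exp(1j * np.pi * Xn)

def complex_least_squares(H, Y):
    """Solve min_W ||H W - Y||^2 for complex H, Y via the normal equations."""
    return np.linalg.solve(H.conj().T @ H, H.conj().T @ Y)

rng = np.random.default_rng(1)
X = rng.random((100, 5))                 # real-valued inputs
Z = circular_transform(X)                # complex-valued inputs
W_true = rng.standard_normal((5, 1)) + 1j * rng.standard_normal((5, 1))
Y = Z @ W_true                           # noiseless toy targets
print(np.allclose(complex_least_squares(Z, Y), W_true))
```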
4. Online RBM: Growing Restricted Boltzmann Machine on the fly for unsupervised representation. Appl Soft Comput 2020. DOI: 10.1016/j.asoc.2020.106278.
5. Aruna Kumar S, Harish B, Mahanand B, Sundararajan N. An efficient Meta-cognitive Fuzzy C-Means clustering approach. Appl Soft Comput 2019. DOI: 10.1016/j.asoc.2019.105838.
6. Anh N, Suresh S, Pratama M, Srikanth N. Interval prediction of wave energy characteristics using meta-cognitive interval type-2 fuzzy inference system. Knowl Based Syst 2019. DOI: 10.1016/j.knosys.2019.01.025.
7. Wu R, Huang H, Qian X, Huang T. A L-BFGS Based Learning Algorithm for Complex-Valued Feedforward Neural Networks. Neural Process Lett 2018. DOI: 10.1007/s11063-017-9692-5.
9. Short-Term Load Forecasting Using Adaptive Annealing Learning Algorithm Based Reinforcement Neural Network. Energies 2016. DOI: 10.3390/en9120987.
10. Sivachitra M, Savitha R, Suresh S, Vijayachitra S. A Fully Complex-valued Fast Learning Classifier (FC-FLC) for real-valued classification problems. Neurocomputing 2015. DOI: 10.1016/j.neucom.2014.04.075.
12. Scaled Conjugate Gradient Learning for Complex-Valued Neural Networks. Advances in Intelligent Systems and Computing 2015. DOI: 10.1007/978-3-319-19824-8_18.
13. Subramanian K, Savitha R, Suresh S. A complex-valued neuro-fuzzy inference system and its learning mechanism. Neurocomputing 2014. DOI: 10.1016/j.neucom.2013.06.009.
14. A meta-cognitive interval type-2 fuzzy inference system and its projection based learning algorithm. Evolving Systems 2013. DOI: 10.1007/s12530-013-9102-9.
15. Paul TK, Ogunfunmi T. Study of the convergence behavior of the complex kernel least mean square algorithm. IEEE Transactions on Neural Networks and Learning Systems 2013;24:1349-1363. PMID: 24808573. DOI: 10.1109/tnnls.2013.2256367.
Abstract
The complex kernel least mean square (CKLMS) algorithm was recently derived and enables online kernel adaptive learning for complex data. Kernel adaptive methods can be used to find solutions for neural network and machine learning applications. The derivation of CKLMS involved the development of a modified Wirtinger calculus for Hilbert spaces to obtain the gradient of the cost function. We analyze the convergence of CKLMS with different kernel forms for complex data. The expressions obtained enable us to generate theory-predicted mean-square-error curves that account for the circularity of the complex input signals and its effect on nonlinear learning. Simulations are used to verify the analysis results.
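As a rough illustration of the algorithm whose convergence is analysed here, the following is a minimal online kernel LMS loop for complex data, using a real Gaussian kernel of the squared modulus and storing a complex coefficient mu*e per sample. The kernel choice, step size, and conjugation conventions are assumptions; the paper studies several kernel forms and their theoretical mean-square-error behaviour, which this sketch does not reproduce.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    """Real Gaussian kernel on complex vectors via the squared modulus."""
    d = a - b
    return np.exp(-np.real(np.vdot(d, d)) / (2.0 * sigma**2))

def cklms_predict(x, centers, alphas, sigma=1.0):
    return sum(a * gaussian_kernel(x, c, sigma) for a, c in zip(alphas, centers))

def cklms_train(X, d, mu=0.5, sigma=1.0):
    """Online kernel LMS loop for complex-valued data.
    Each sample becomes a kernel centre with complex coefficient mu*e,
    one common form of the CKLMS update (conventions are an assumption)."""
    centers, alphas, errors = [], [], []
    for x, dn in zip(X, d):
        y = cklms_predict(x, centers, alphas, sigma)
        e = dn - y                       # a-priori error
        centers.append(x)
        alphas.append(mu * e)
        errors.append(abs(e) ** 2)
    return centers, alphas, errors

# Toy nonlinear complex mapping: d = x0 * conj(x1) + 0.5j * x0**2.
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 2)) + 1j * rng.standard_normal((200, 2))
d = X[:, 0] * np.conj(X[:, 1]) + 0.5j * X[:, 0] ** 2
_, _, errors = cklms_train(X, d)
print(f"first/last squared error: {errors[0]:.3f} / {errors[-1]:.3f}")
```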
17. Savitha R, Suresh S, Sundararajan N. Projection-based fast learning fully complex-valued relaxation neural network. IEEE Transactions on Neural Networks and Learning Systems 2013;24:529-541. PMID: 24808375. DOI: 10.1109/tnnls.2012.2235460.
Abstract
This paper presents a fully complex-valued relaxation network (FCRN) with its projection-based learning algorithm. The FCRN is a single-hidden-layer network with a Gaussian-like sech activation function in the hidden layer and an exponential activation function in the output layer. For a given number of hidden neurons, the input weights are assigned randomly, and the output weights are estimated by minimizing a nonlinear logarithmic function (called an energy function) that explicitly contains both the magnitude and phase errors. A projection-based learning algorithm determines the optimal output weights corresponding to the minimum of the energy function by converting the nonlinear programming problem into one of solving a set of simultaneous linear algebraic equations. The resulting FCRN approximates the desired output more accurately and with lower computational effort. The classification ability of FCRN is evaluated using a set of real-valued benchmark classification problems from the University of California, Irvine machine learning repository; here, a circular transformation is used to map the real-valued input features to the complex domain. The FCRN is then used to solve three practical problems: quadrature amplitude modulation channel equalization, adaptive beamforming, and mammogram classification. The performance results clearly indicate the superior classification/approximation performance of the FCRN.
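The training scheme summarised above (random input parameters, hidden sech responses, output weights from a linear solve) can be illustrated structurally as follows. An ordinary complex least-squares solve stands in for the paper's energy-function-specific system, and the exponential output layer is omitted, so this is a sketch of the overall pipeline rather than the FCRN algorithm itself.

```python
import numpy as np

def csech(z):
    """Fully complex hyperbolic secant, used as a Gaussian-like activation."""
    return 1.0 / np.cosh(z)

def train_fcrn_like(X, Y, n_hidden=20, seed=0):
    """Random complex input weights, hidden sech responses, and a
    projection (least-squares) solve for the output weights.
    The paper's energy function with explicit magnitude/phase terms
    is not reproduced here (assumption)."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    V = rng.standard_normal((n_in, n_hidden)) + 1j * rng.standard_normal((n_in, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = csech(X @ V + b)                        # hidden responses
    W, *_ = np.linalg.lstsq(H, Y, rcond=None)   # projection step
    return V, b, W

def predict(X, V, b, W):
    return csech(X @ V + b) @ W

rng = np.random.default_rng(3)
X = rng.standard_normal((150, 3)) + 1j * rng.standard_normal((150, 3))
Y = (X ** 2).sum(axis=1, keepdims=True)         # toy complex target
V, b, W = train_fcrn_like(X, Y)
print(np.mean(np.abs(predict(X, V, b, W) - Y) ** 2))
```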
18. Babu GS, Suresh S. Sequential projection-based metacognitive learning in a radial basis function network for classification problems. IEEE Transactions on Neural Networks and Learning Systems 2013;24:194-206. PMID: 24808275. DOI: 10.1109/tnnls.2012.2226748.
Abstract
In this paper, we present a sequential projection-based metacognitive learning algorithm in a radial basis function network (PBL-McRBFN) for classification problems. The algorithm is inspired by human metacognitive learning principles and has two components: a cognitive component and a metacognitive component. The cognitive component is a single-hidden-layer radial basis function network with an evolving architecture. The metacognitive component controls the learning process in the cognitive component by choosing the best learning strategy for the current sample and adapts the learning strategies through self-regulation. In addition, sample-overlapping conditions and past knowledge of the samples, in the form of pseudosamples, are used to properly initialize new hidden neurons and minimize misclassification. The parameter update strategy uses projection-based direct minimization of the hinge loss error. The interaction of the cognitive and metacognitive components efficiently addresses the what-to-learn, when-to-learn, and how-to-learn principles of human learning. The performance of PBL-McRBFN is evaluated using a set of benchmark classification problems from the University of California, Irvine machine learning repository. The statistical performance evaluation on these problems shows the superior performance of the PBL-McRBFN classifier over results reported in the literature. We also evaluate the proposed algorithm on a practical Alzheimer's disease detection problem. The performance results on the Open Access Series of Imaging Studies (OASIS) and Alzheimer's Disease Neuroimaging Initiative (ADNI) datasets, which were obtained from different demographic regions, clearly show that PBL-McRBFN can handle problems with a change in distribution.
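The metacognitive component's per-sample decision (delete, learn, or reserve) can be illustrated independently of the RBF details. In the sketch below the thresholds, the hinge-error measure, and the collapsing of neuron growth and parameter update into a single "learn" strategy are all simplifying assumptions; the paper uses class-specific criteria and self-adaptive thresholds that are not modelled here.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class MetaCognitiveThresholds:
    """Self-regulatory thresholds; the values here are illustrative only."""
    delete: float = 0.1   # error below this: sample adds nothing, discard it
    learn: float = 0.6    # error above this: sample is novel, update the model

def choose_strategy(sample, label,
                    predict: Callable, hinge_error: Callable,
                    th: MetaCognitiveThresholds) -> str:
    """Return one of 'delete', 'learn', 'reserve' for the current sample.

    'learn' stands in for both strategies the paper distinguishes
    (adding a neuron vs. updating parameters); that split depends on
    neuron-overlap conditions not modelled here (assumption)."""
    predicted = predict(sample)
    err = hinge_error(sample, label)
    if predicted == label and err < th.delete:
        return "delete"          # what-not-to-learn: redundant sample
    if predicted != label or err > th.learn:
        return "learn"           # what/how-to-learn: novel or misclassified
    return "reserve"             # when-to-learn: keep for a later epoch

# Toy usage with stub model functions.
th = MetaCognitiveThresholds()
predict = lambda x: 1 if sum(x) > 0 else -1
hinge_error = lambda x, y: max(0.0, 1.0 - y * sum(x))
print(choose_strategy([0.2, 0.9], 1, predict, hinge_error, th))
```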
19. Meta-cognitive RBF Network and its Projection Based Learning algorithm for classification problems. Appl Soft Comput 2013. DOI: 10.1016/j.asoc.2012.08.047.
20. Subramanian K, Suresh S. Human Action Recognition Using Meta-Cognitive Neuro-Fuzzy Inference System. Int J Neural Syst 2012. DOI: 10.1142/s0129065712500281.
Abstract
We propose a sequential meta-cognitive learning algorithm for a neuro-fuzzy inference system (McFIS) to efficiently recognize human actions from video sequences. Optical flow information between two consecutive image planes can represent actions hierarchically, from the local pixel level to the global object level, and is therefore used to describe human actions in the McFIS classifier. The McFIS classifier and its sequential learning algorithm are developed based on the principles of self-regulation observed in human meta-cognition. McFIS decides what to learn, when to learn, and how to learn based on the knowledge stored in the classifier and the information contained in the new training samples. The sequential learning algorithm of McFIS is controlled and monitored by the meta-cognitive components, which use class-specific, knowledge-based criteria along with self-regulatory thresholds to decide on one of the following strategies: (i) sample deletion, (ii) sample learning, and (iii) sample reserve. The performance of the proposed McFIS-based human action recognition system is evaluated using the benchmark Weizmann and KTH video sequences. The simulation results are compared with a well-known SVM classifier and with state-of-the-art action recognition results reported in the literature. The results clearly indicate that the McFIS action recognition system achieves better performance with minimal computational effort.
Affiliation(s)
- K. Subramanian: School of Computer Engineering, Nanyang Technological University, Nanyang Avenue, Singapore 639798, Singapore
- S. Suresh: School of Computer Engineering, Nanyang Technological University, Nanyang Avenue, Singapore 639798, Singapore
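As a companion to this entry, the sketch below shows one simple way to turn optical flow between two consecutive frames into a fixed-length descriptor of the kind a classifier such as McFIS could consume. It uses OpenCV's dense Farneback flow and coarse grid pooling; the grid size and pooling are assumptions and do not reproduce the hierarchical pixel-to-object representation described in the abstract (requires the opencv-python package).

```python
import cv2
import numpy as np

def flow_descriptor(prev_gray, next_gray, grid=(4, 4)):
    """Dense Farneback optical flow between two consecutive frames,
    pooled over a coarse spatial grid into a fixed-length descriptor.
    This is only a simplified stand-in for the hierarchical flow
    representation described in the paper."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    gh, gw = grid
    feats = []
    for i in range(gh):
        for j in range(gw):
            cell = flow[i*h//gh:(i+1)*h//gh, j*w//gw:(j+1)*w//gw]
            feats.extend(cell.reshape(-1, 2).mean(axis=0))  # mean (dx, dy)
    return np.asarray(feats)                                # length 2*gh*gw

# Toy usage with two synthetic frames (a bright square shifted by 3 px).
prev = np.zeros((64, 64), np.uint8); prev[20:40, 20:40] = 255
nxt = np.zeros((64, 64), np.uint8); nxt[20:40, 23:43] = 255
print(flow_descriptor(prev, nxt).shape)   # (32,)
```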
22. Savitha R, Suresh S, Sundararajan N. A meta-cognitive learning algorithm for a Fully Complex-valued Relaxation Network. Neural Netw 2012;32:209-18. DOI: 10.1016/j.neunet.2012.02.015.
23. Venkatesh Babu R, Suresh S, Savitha R. Human action recognition using a fast learning fully complex-valued classifier. Neurocomputing 2012. DOI: 10.1016/j.neucom.2012.03.003.
24. Savitha R, Suresh S, Sundararajan N. Fast learning complex-valued classifiers for real-valued classification problems. Int J Mach Learn Cyb 2012. DOI: 10.1007/s13042-012-0112-x.
25. Savitha R, Suresh S, Sundararajan N. Metacognitive Learning in a Fully Complex-Valued Radial Basis Function Neural Network. Neural Comput 2012;24:1297-328. DOI: 10.1162/neco_a_00254.
Abstract
Recent studies on human learning reveal that self-regulated learning in a metacognitive framework is the best strategy for efficient learning. As machine learning algorithms are inspired by the principles of human learning, one needs to incorporate the concept of metacognition to develop efficient machine learning algorithms. In this letter we present a metacognitive learning framework that controls the learning process of a fully complex-valued radial basis function network, referred to as a metacognitive fully complex-valued radial basis function (Mc-FCRBF) network. Mc-FCRBF has two components: a cognitive component containing the FC-RBF network and a metacognitive component that regulates the learning process of FC-RBF. In every epoch, when a sample is presented to Mc-FCRBF, the metacognitive component decides what to learn, when to learn, and how to learn based on the knowledge already acquired by the FC-RBF network and the new information contained in the sample. The Mc-FCRBF learning algorithm is described in detail, and both its approximation and classification abilities are evaluated using a set of benchmark and practical problems. Performance results indicate the superior approximation and classification performance of Mc-FCRBF compared to existing methods in the literature.
Affiliation(s)
- R. Savitha: School of Computer Engineering, Nanyang Technological University, 639798 Singapore
- S. Suresh: School of Computer Engineering, Nanyang Technological University, 639798 Singapore
- N. Sundararajan: School of Electrical and Electronics Engineering, Nanyang Technological University, 639735 Singapore
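To make the magnitude/phase-based self-regulation concrete, the sketch below shows a per-sample decision rule of the kind the metacognitive component is described as applying in each epoch. The error definitions are straightforward, but the threshold values are illustrative placeholders, and the self-adaptation of thresholds and the FC-RBF parameter update itself are omitted (assumptions).

```python
import numpy as np

def magnitude_phase_errors(y_pred: complex, y_true: complex):
    """Per-sample magnitude and phase errors, the two quantities the
    metacognitive component of Mc-FCRBF is described as monitoring."""
    mag_err = abs(abs(y_pred) - abs(y_true))
    phase_err = abs(np.angle(y_pred * np.conj(y_true)))  # wrapped to [0, pi]
    return mag_err, phase_err

def metacognitive_decision(y_pred, y_true,
                           delete_th=(0.01, 0.05), learn_th=(0.1, 0.3)):
    """Decide what to do with the current sample in one training epoch.
    Threshold values are illustrative placeholders, not the paper's;
    Mc-FCRBF additionally self-adapts them, which is omitted here."""
    mag_err, phase_err = magnitude_phase_errors(y_pred, y_true)
    if mag_err < delete_th[0] and phase_err < delete_th[1]:
        return "delete"     # sample carries no new information
    if mag_err > learn_th[0] or phase_err > learn_th[1]:
        return "learn"      # update FC-RBF parameters on this sample
    return "reserve"        # revisit the sample in a later epoch

print(metacognitive_decision(1.0 + 0.1j, 1.05 + 0.05j))
```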
26. Sateesh Babu G, Suresh S. Meta-cognitive Neural Network for classification problems in a sequential learning framework. Neurocomputing 2012. DOI: 10.1016/j.neucom.2011.12.001.
27. Savitha R, Suresh S, Sundararajan N. Fast learning Circular Complex-valued Extreme Learning Machine (CC-ELM) for real-valued classification problems. Inf Sci (N Y) 2012. DOI: 10.1016/j.ins.2011.11.003.
28. Savitha R, Suresh S, Sundararajan N, Kim H. A fully complex-valued radial basis function classifier for real-valued classification problems. Neurocomputing 2012. DOI: 10.1016/j.neucom.2011.05.036.