1. Han H, Sun C, Wu X, Yang H, Qiao J. Nonsingular Gradient Descent Algorithm for Interval Type-2 Fuzzy Neural Network. IEEE Transactions on Neural Networks and Learning Systems 2024;35:8176-8189. [PMID: 37015616] [DOI: 10.1109/tnnls.2022.3225181]
Abstract
Interval type-2 fuzzy neural networks (IT2FNNs) are widely used to model nonlinear systems. Unfortunately, the gradient descent-based IT2FNN with uncertain variances always suffers from low convergence speed because of its inherent singularity. To cope with this problem, a nonsingular gradient descent algorithm (NSGDA) is developed in this article to update the IT2FNN. First, the widths of the type-2 fuzzy rules are transformed into root inverse variances (RIVs), which always satisfy the sufficient condition of differentiability. Second, singular RIVs are reformulated using the nonsingular Shapley-based matrices associated with the type-2 fuzzy rules. This averts the convergence stagnation caused by the zero derivatives of singular RIVs, thereby sustaining gradient convergence. Third, an integrated-form update strategy (IUS) is designed to obtain the derivatives of the parameters of the IT2FNN, including the RIVs, centers, weight coefficients, deviations, and proportionality coefficient. These parameters are packed into multiple subvariable matrices, which can accelerate gradient convergence through parallel calculation instead of sequential iteration. Finally, experiments show that the proposed NSGDA-based IT2FNN improves convergence speed through the improved learning algorithm.
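The reparameterization step in this abstract (fuzzy-rule widths rewritten as root inverse variances) can be sketched for a scalar Gaussian membership function. This is a minimal illustration of the general idea only, not the paper's NSGDA: the Shapley-based matrices and the IUS are omitted, and all function names here are hypothetical.

```python
import numpy as np

def mf_sigma(x, c, sigma):
    # Gaussian membership function parameterized by width sigma
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def grad_sigma(x, c, sigma):
    # d mu / d sigma = (x - c)^2 / sigma^3 * mu: undefined at sigma = 0,
    # which is the singularity the width parameterization suffers from
    return ((x - c) ** 2 / sigma ** 3) * mf_sigma(x, c, sigma)

def mf_riv(x, c, r):
    # Same membership function with r = 1/sigma (a "root inverse variance")
    return np.exp(-0.5 * (r * (x - c)) ** 2)

def grad_riv(x, c, r):
    # d mu / d r = -r (x - c)^2 * mu: finite for every r, including r = 0,
    # so gradient descent on r never hits a singular derivative
    return -r * (x - c) ** 2 * mf_riv(x, c, r)
```

Both parameterizations describe the same membership function; only the gradient's behaviour near degenerate widths differs.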
2. Han H, Sun C, Wu X, Yang H, Qiao J. Self-Organizing Interval Type-2 Fuzzy Neural Network Using Information Aggregation Method. IEEE Transactions on Neural Networks and Learning Systems 2023;34:6428-6442. [PMID: 34982701] [DOI: 10.1109/tnnls.2021.3136678]
Abstract
Interval type-2 fuzzy neural networks (IT2FNNs) usually stack many fuzzy rules to identify nonlinear systems with high-dimensional inputs, which may result in an explosion of fuzzy rules. To cope with this problem, a self-organizing IT2FNN based on an information aggregation method (IA-SOIT2FNN) is developed in this article to avoid the explosion of fuzzy rules. First, a relation-aware strategy is proposed to construct rotatable type-2 fuzzy rules (RT2FRs). This strategy uses an individual RT2FR, instead of multiple standard fuzzy rules, to interpret the interactive features of high-dimensional inputs. Second, a comprehensive information evaluation mechanism, based on the interval information and rotation information of the RT2FRs, is developed to direct the structural adjustment of the IA-SOIT2FNN. This mechanism achieves a compact structure by growing and pruning RT2FRs. Third, a multicriteria-based optimization algorithm is designed to optimize the parameters of the IA-SOIT2FNN. The algorithm simultaneously updates the rotatable parameters and the conventional parameters of each RT2FR, maintaining the accuracy of the IA-SOIT2FNN. Finally, experiments show that the proposed IA-SOIT2FNN competes with state-of-the-art approaches in terms of identification performance.
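The grow-and-prune behaviour this abstract describes can be illustrated with a toy distance-based rule base: grow a rule when no existing rule fires strongly on a sample, prune rules whose cumulative firing stays negligible. This is a generic self-organizing sketch, not the paper's RT2FR mechanism; the class name and thresholds are assumptions.

```python
import numpy as np

class ToyRuleBase:
    """Toy self-organizing rule base (illustration only)."""
    def __init__(self, tau_grow=0.3, tau_prune=1e-3, sigma=1.0):
        self.centers = []          # one Gaussian center per rule
        self.usage = []            # cumulative firing strength per rule
        self.tau_grow = tau_grow   # grow when max firing falls below this
        self.tau_prune = tau_prune # prune rules with usage below this
        self.sigma = sigma

    def firing(self, x):
        if not self.centers:
            return np.array([])
        d = np.linalg.norm(np.asarray(self.centers) - x, axis=1)
        return np.exp(-0.5 * (d / self.sigma) ** 2)

    def observe(self, x):
        f = self.firing(x)
        if f.size == 0 or f.max() < self.tau_grow:
            # no rule covers this sample well: grow a new rule at x
            self.centers.append(np.asarray(x, dtype=float))
            self.usage.append(0.0)
        else:
            for i, fi in enumerate(f):
                self.usage[i] += fi

    def prune(self):
        # drop rules that have contributed almost nothing so far
        keep = [i for i, u in enumerate(self.usage) if u >= self.tau_prune]
        self.centers = [self.centers[i] for i in keep]
        self.usage = [self.usage[i] for i in keep]
```

A real evolving network would also adapt the rule parameters between structural updates; here only the structural adjustment is shown.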
3. Malialis K, Panayiotou CG, Polycarpou MM. Nonstationary data stream classification with online active learning and siamese neural networks. Neurocomputing 2022. [DOI: 10.1016/j.neucom.2022.09.065]
4. de Campos Souza PV, Lughofer E. Evolving fuzzy neural classifier that integrates uncertainty from human-expert feedback. Evolving Systems 2022;14:319-341. [PMID: 37009465] [PMCID: PMC10061807] [DOI: 10.1007/s12530-022-09455-z]
Abstract
Evolving fuzzy neural networks are models capable of solving complex problems in a wide variety of contexts. In general, the quality of the data evaluated by a model has a direct impact on the quality of its results. Some procedures can generate uncertainty during data collection, which experts can identify in order to choose more suitable forms of model training. This paper proposes integrating expert input on labeling uncertainty into evolving fuzzy neural classifiers (EFNC), in an approach called EFNC-U. Uncertainty is considered in the class labels provided by experts, who may not be entirely confident in their labeling or may have limited experience with the application scenario for which the data is processed. Further, we aimed to create highly interpretable fuzzy classification rules to gain a better understanding of the process and thus enable the user to elicit new knowledge from the model. To evaluate our technique, we performed binary pattern classification tests in two application scenarios: cyber-intrusion detection and fraud detection in auctions. By explicitly considering class label uncertainty in the update process of the EFNC-U, improved accuracy trend lines were achieved compared to fully (and blindly) updating the classifiers with uncertain data. Integrating (simulated) labeling uncertainty smaller than 20% led to similar accuracy trends as using the original streams (unaffected by uncertainty), demonstrating the robustness of our approach up to this uncertainty level. Finally, interpretable rules were elicited for a particular application (auction fraud identification) with reduced (and thus readable) antecedent lengths and with certainty values in the consequent class labels. Additionally, an average expected uncertainty of the rules was elicited based on the uncertainty levels in the samples that formed the corresponding rules.
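The core idea of discounting uncertain labels during classifier updates can be sketched as a confidence-weighted SGD step for a logistic model. This is a generic illustration, assuming confidence values in [0, 1]; it is not the EFNC-U update rule itself, and the function name is hypothetical.

```python
import numpy as np

def confidence_weighted_step(w, x, y, conf, lr=0.1):
    """One SGD step on logistic loss, scaled by the expert's labeling
    confidence conf in [0, 1]; conf = 0 leaves the model untouched."""
    p = 1.0 / (1.0 + np.exp(-x @ w))  # predicted probability of class 1
    grad = (p - y) * x                # gradient of the log-loss w.r.t. w
    return w - lr * conf * grad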
5. Lughofer E. Evolving multi-label fuzzy classifier with advanced robustness respecting human uncertainty. Knowledge-Based Systems 2022. [DOI: 10.1016/j.knosys.2022.109717]
7. Lughofer E. Evolving multi-user fuzzy classifier systems integrating human uncertainty and expert knowledge. Information Sciences 2022. [DOI: 10.1016/j.ins.2022.03.014]
8. Kalibatiene D, Miliauskaitė J. A dynamic fuzzification approach for interval type-2 membership function development: case study for QoS planning. Soft Computing 2021. [DOI: 10.1007/s00500-021-05899-8]
9. Laha M, Konar A, Rakshit P, Nagar AK. Exploration of Subjective Color Perceptual-Ability by EEG-Induced Type-2 Fuzzy Classifiers. IEEE Transactions on Cognitive and Developmental Systems 2020. [DOI: 10.1109/tcds.2019.2959138]
11. Bezerra CG, Costa BSJ, Guedes LA, Angelov PP. An evolving approach to data streams clustering based on typicality and eccentricity data analytics. Information Sciences 2020. [DOI: 10.1016/j.ins.2019.12.022]
12. Rodríguez-Ruiz J, Mata-Sánchez JI, Monroy R, Loyola-González O, López-Cuevas A. A one-class classification approach for bot detection on Twitter. Computers & Security 2020. [DOI: 10.1016/j.cose.2020.101715]
13. Shahparast H, Mansoori EG. Developing an online general type-2 fuzzy classifier using evolving type-1 rules. International Journal of Approximate Reasoning 2019. [DOI: 10.1016/j.ijar.2019.07.011]
14. Pratama M, Wang D. Deep stacked stochastic configuration networks for lifelong learning of non-stationary data streams. Information Sciences 2019. [DOI: 10.1016/j.ins.2019.04.055]
15. Škrjanc I, Iglesias JA, Sanchis A, Leite D, Lughofer E, Gomide F. Evolving fuzzy and neuro-fuzzy approaches in clustering, regression, identification, and classification: a survey. Information Sciences 2019. [DOI: 10.1016/j.ins.2019.03.060]
16. Ghosh L, Konar A, Rakshit P, Nagar AK. Hemodynamic Analysis for Cognitive Load Assessment and Classification in Motor Learning Tasks Using Type-2 Fuzzy Sets. IEEE Transactions on Emerging Topics in Computational Intelligence 2019. [DOI: 10.1109/tetci.2018.2868323]
18. Lobo JL, Del Ser J, Bilbao MN, Perfecto C, Salcedo-Sanz S. DRED: an evolutionary diversity generation method for concept drift adaptation in online learning environments. Applied Soft Computing 2018. [DOI: 10.1016/j.asoc.2017.10.004]
19. Learning of operator hand movements via least angle regression to be teached in a manipulator. Evolving Systems 2018. [DOI: 10.1007/s12530-018-9224-1]
20. SAR Target Recognition via Incremental Nonnegative Matrix Factorization. Remote Sensing 2018. [DOI: 10.3390/rs10030374]
Abstract
In synthetic aperture radar (SAR) target recognition, the amount of target data increases continuously, so SAR automatic target recognition (ATR) systems must provide updated feature models in real time. Most recent SAR feature extraction methods have to use both existing and new samples to retrain a new model every time new data is acquired. However, this repeated computation over existing samples increases computing cost. In this paper, a dynamic feature learning method called incremental nonnegative matrix factorization with Lp sparse constraints (Lp-INMF) is proposed as a solution to this problem. In contrast to conventional nonnegative matrix factorization (NMF), in which existing and new samples are recomputed to retrain a new model, incremental NMF (INMF) computes only the new samples to update the trained model incrementally, which improves computing efficiency. Considering the sparse characteristics of scattering centers in SAR images, we place the update process under a generic sparse constraint (Lp) for the matrix decomposition of INMF, so that Lp-INMF can extract sparse characteristics of SAR images. Experimental results on the Moving and Stationary Target Acquisition and Recognition (MSTAR) benchmark data illustrate that the proposed Lp-INMF method not only updates models with new samples more efficiently than conventional NMF, but also achieves a higher recognition rate than NMF and INMF.
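The incremental idea in this abstract — fit new samples against an already-learned basis instead of retraining on all data — can be sketched with plain multiplicative-update NMF. This illustrates vanilla INMF only: the paper's Lp sparsity constraint is omitted, and the function names are assumptions.

```python
import numpy as np

def nmf_multiplicative(V, W, H, iters=100, eps=1e-9):
    # Standard Lee-Seung multiplicative updates minimizing ||V - W H||_F^2;
    # eps guards the divisions against zero denominators
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def encode_new_samples(V_new, W, iters=100, eps=1e-9):
    # Incremental step: keep the learned basis W fixed and fit coefficients
    # for the new samples only, instead of retraining on the full data set
    rng = np.random.default_rng(0)
    H_new = rng.random((W.shape[1], V_new.shape[1]))
    for _ in range(iters):
        H_new *= (W.T @ V_new) / (W.T @ W @ H_new + eps)
    return H_new
```

Full INMF variants also fold the new samples back into the basis update; only the coefficient step is shown here because it is what saves the repeated computation over existing samples.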
21. Saha A, Konar A, Nagar AK. EEG Analysis for Cognitive Failure Detection in Driving Using Type-2 Fuzzy Classifiers. IEEE Transactions on Emerging Topics in Computational Intelligence 2017. [DOI: 10.1109/tetci.2017.2750761]
22. Bouillon M, Anquetil E. Online active supervision of an evolving classifier for customized-gesture-command learning. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2016.12.094]
25. Pratama M, Lughofer E, Er MJ, Anavatti S, Lim CP. Data driven modelling based on Recurrent Interval-Valued Metacognitive Scaffolding Fuzzy Neural Network. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2016.10.093]
29. Páramo-Carranza LA, Meda-Campaña JA, de Jesús Rubio J, Tapia-Herrera R, Curtidor-López AV, Grande-Meza A, Cázares-Ramírez I. Discrete-time Kalman filter for Takagi–Sugeno fuzzy models. Evolving Systems 2017. [DOI: 10.1007/s12530-017-9181-0]