1. Yang B, Zhang X, Nie F, Chen B, Wang F, Nan Z, Zheng N. ECCA: Efficient Correntropy-Based Clustering Algorithm With Orthogonal Concept Factorization. IEEE Transactions on Neural Networks and Learning Systems 2023;34:7377-7390. [PMID: 35100124] [DOI: 10.1109/tnnls.2022.3142806]
Abstract
One of the hottest topics in unsupervised learning is how to cluster large amounts of unlabeled data both efficiently and effectively. To address this issue, we propose an orthogonal concept factorization (OCF) model that improves clustering effectiveness by restricting the degrees of freedom of the matrix factorization. In addition, for the OCF model, a fast optimization algorithm involving only a few low-dimensional matrix operations is given to improve clustering efficiency, in contrast to the traditional CF optimization algorithm, which requires dense matrix multiplications. To further improve clustering efficiency while suppressing the influence of the noise and outliers found in real-world data, an efficient correntropy-based clustering algorithm (ECCA) is proposed in this article. Instead of performing OCF directly on the original data, ECCA constructs an anchor graph and performs OCF on it, which not only further improves clustering efficiency but also inherits the high performance of spectral clustering. In particular, the anchor graph makes ECCA less sensitive to changes in data dimensionality, so it remains efficient at higher data dimensions. Meanwhile, to handle the various complex noises and outliers in real-world data, correntropy is introduced into ECCA to measure the similarity between the matrix before and after decomposition, which greatly improves clustering effectiveness and robustness. A novel and efficient half-quadratic optimization algorithm is then proposed to optimize the ECCA model quickly. Finally, extensive experiments on real-world and noisy datasets show that ECCA achieves promising effectiveness and robustness while running tens to thousands of times faster than other state-of-the-art baselines.
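Correntropy of this kind is typically computed with a Gaussian kernel over the entrywise residuals, which is why it is robust to outliers: a huge residual saturates toward zero contribution instead of dominating the loss. A minimal sketch (assuming a Gaussian kernel of width `sigma`, not the authors' exact formulation) comparing a matrix with its reconstruction:

```python
import numpy as np

def correntropy(X, Y, sigma=1.0):
    """Average Gaussian-kernel similarity between corresponding
    entries of X and Y; large (outlier) residuals contribute
    almost nothing instead of dominating, unlike squared error."""
    diff2 = (X - Y) ** 2
    return np.mean(np.exp(-diff2 / (2 * sigma ** 2)))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
clean = correntropy(X, X + 0.01 * rng.normal(size=X.shape))
noisy = X.copy()
noisy[0, 0] += 100.0  # one gross outlier
outlier = correntropy(X, noisy)
# identical matrices give correntropy exactly 1.0; a single huge
# outlier lowers the score only slightly
assert correntropy(X, X) == 1.0
```

Under a squared Frobenius loss the single corrupted entry would dominate the objective; under correntropy it costs at most one entry's worth of similarity.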
2. Qian G, Yu X, Mei J, Liu J, Wang S. A Class of Adaptive Filtering Algorithms Based on Improper Complex Correntropy. Inf Sci (N Y) 2023. [DOI: 10.1016/j.ins.2023.03.076]
3. Qu H, Zheng Y, Li L, Guo F. An Unsupervised Feature Extraction Approach Based on Self-Expression. Big Data 2023;11:18-34. [PMID: 35537483] [DOI: 10.1089/big.2021.0420]
Abstract
Feature extraction algorithms lack good interpretability during projection learning. To solve this problem, an unsupervised feature extraction algorithm based on self-expression, block diagonal projection (BDP), is proposed. Specifically, when the original data are projected into a low-dimensional subspace by a feature extraction algorithm, the data may become more compact, but the new features may not be as interpretable as the original ones. Therefore, by imposing an L2,1 norm constraint on the projection matrix, the projection matrix is made row-sparse. On one hand, discriminative features can be selected, making the projection matrix more interpretable; on the other hand, irrelevant or redundant features can be suppressed. The proposed model thus integrates feature extraction and selection into one framework. In addition, since self-expression captures the correlation between samples or sample features well, this property can better guide the unsupervised feature extraction task. At the same time, a block diagonal representation regularization term is introduced to pursue the block diagonal representation directly, improving the accuracy of pattern recognition tasks such as clustering and classification. Finally, the effectiveness of BDP in linear dimensionality reduction and classification is demonstrated on various benchmark datasets. The experimental results show that the algorithm is superior to previous feature extraction counterparts.
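The row-sparsity effect of the L2,1 norm mentioned above can be illustrated directly: the norm sums the Euclidean lengths of the rows of the projection matrix, so penalizing it drives entire rows to zero, deselecting the corresponding input features. A small sketch (variable names illustrative, not from the BDP paper):

```python
import numpy as np

def l21_norm(W):
    """L2,1 norm: sum of the Euclidean norms of the rows of W.
    Minimizing it zeroes out whole rows, i.e. whole input
    features, which is what makes the projection interpretable."""
    return np.sum(np.linalg.norm(W, axis=1))

W = np.array([[3.0, 4.0],   # row norm 5 -> feature kept
              [0.0, 0.0],   # row norm 0 -> feature suppressed
              [1.0, 0.0]])  # row norm 1 -> feature kept
assert abs(l21_norm(W) - 6.0) < 1e-9
```

Contrast with the plain L1 norm, which zeroes individual entries of W but not whole rows, so a feature can remain partially selected.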
Affiliation(s)
- Hongchun Qu: College of Computer Science and College of Automation, Chongqing University of Posts and Telecommunications, Chongqing, China
- Yangqi Zheng: College of Automation, Chongqing University of Posts and Telecommunications, Chongqing, China
- Lin Li: College of Computer Science, Chongqing University of Posts and Telecommunications, Chongqing, China
- Fei Guo: College of Automation, Chongqing University of Posts and Telecommunications, Chongqing, China
4. One Step Multi-view Spectral Clustering via Joint Adaptive Graph Learning and Matrix Factorization. Neurocomputing 2022. [DOI: 10.1016/j.neucom.2022.12.023]
6. Zhang Q, Kang Z, Xu Z, Huang S, Fu H. Spaks: Self-paced multiple kernel subspace clustering with feature smoothing regularization. Knowl Based Syst 2022. [DOI: 10.1016/j.knosys.2022.109500]
7. Salient and consensus representation learning based incomplete multiview clustering. Appl Intell 2022. [DOI: 10.1007/s10489-022-03530-3]
8. El Hajjar S, Dornaika F, Abdallah F. One-step multi-view spectral clustering with cluster label correlation graph. Inf Sci (N Y) 2022. [DOI: 10.1016/j.ins.2022.01.017]
9. Joint learning affinity matrix and representation matrix for robust low-rank multi-kernel clustering. Appl Intell 2022. [DOI: 10.1007/s10489-021-02974-3]
10. Zhang X, Wang J, Xue X, Sun H, Zhang J. Confidence level auto-weighting robust multi-view subspace clustering. Neurocomputing 2022. [DOI: 10.1016/j.neucom.2021.12.029]
11. Guo L, Zhang X, Liu Z, Xue X, Wang Q, Zheng S. Robust subspace clustering based on automatic weighted multiple kernel learning. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2021.05.070]
12. Zhang X, Xue X, Sun H, Liu Z, Guo L, Guo X. Robust multiple kernel subspace clustering with block diagonal representation and low-rank consensus kernel. Knowl Based Syst 2021. [DOI: 10.1016/j.knosys.2021.107243]
13. Ren Z, Li X, Mukherjee M, Huang Y, Sun Q, Huang Z. Robust multi-view graph clustering in latent energy-preserving embedding space. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2021.05.025]
14. Li Y, Zhao Q, Luo K. Multi-objective soft subspace clustering in the composite kernel space. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2021.02.008]
16. Ren Z, Yang SX, Sun Q, Wang T. Consensus Affinity Graph Learning for Multiple Kernel Clustering. IEEE Transactions on Cybernetics 2021;51:3273-3284. [PMID: 32584777] [DOI: 10.1109/tcyb.2020.3000947]
Abstract
Multiple kernel graph-based clustering (MKGC) has attracted significant attention in recent years, primarily due to the superiority of multiple kernel learning (MKL) and the outstanding performance of graph-based clustering. However, many existing MKGC methods adopt a heavyweight model that hurts both computational cost and clustering performance, as they cumbersomely learn an affinity graph and an extra consensus kernel. To tackle this problem, this article proposes a new MKGC method that learns a consensus affinity graph directly. First, using self-expressive graph learning and an adaptive local structure learning term, the local manifold structure of the data in kernel space is preserved while multiple candidate affinity graphs are learned from a kernel pool. These candidate affinity graphs are then synthesized into a consensus affinity graph via a lightweight autoweighted fusion model, in which a self-tuned Laplacian rank constraint and a top-k neighbors sparsification strategy are introduced to improve the quality of the consensus affinity graph for accurate clustering. Experimental results on ten benchmark datasets and two synthetic datasets show that the proposed method consistently and significantly outperforms state-of-the-art methods.
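The autoweighted fusion idea above can be sketched with a generic scheme (not the paper's exact model, and omitting its Laplacian rank constraint and top-k sparsification): each candidate graph's weight is re-derived every iteration as the inverse of its distance to the current consensus, so graphs that agree with the consensus count more.

```python
import numpy as np

def autoweighted_fusion(graphs, iters=20, eps=1e-8):
    """Fuse candidate affinity graphs into one consensus graph.
    Weights are inversely proportional to each candidate's
    Frobenius distance from the current consensus, so outlier
    graphs are progressively down-weighted."""
    S = np.mean(graphs, axis=0)  # start from the plain average
    for _ in range(iters):
        w = np.array([1.0 / (2 * np.linalg.norm(S - G) + eps)
                      for G in graphs])
        w /= w.sum()             # normalize weights to sum to 1
        S = sum(wi * G for wi, G in zip(w, graphs))
    return S, w

A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.array([[0.0, 0.9], [0.9, 0.0]])
C = np.array([[0.0, 0.1], [0.1, 0.0]])  # outlier graph
S, w = autoweighted_fusion([A, B, C])
# the outlier graph ends up with the smallest weight
assert w[2] < w[0] and w[2] < w[1]
```

The fixed-point update alternates between re-estimating the consensus with the current weights and re-estimating the weights from the current consensus, a common pattern in autoweighted multi-view models.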
17. Zhang X, Ren Z, Sun H, Bai K, Feng X, Liu Z. Multiple kernel low-rank representation-based robust multi-view subspace clustering. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2020.10.059]
18. Ren Z, Lei H, Sun Q, Yang C. Simultaneous learning coefficient matrix and affinity graph for multiple kernel clustering. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2020.08.056]
19. Li X, Ren Z, Lei H, Huang Y, Sun Q. Multiple kernel clustering with pure graph learning scheme. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.10.052]
20. Image Segmentation Based on Non-convex Low Rank Multiple Kernel Clustering. Artif Intell 2021. [DOI: 10.1007/978-3-030-93046-2_36]
22. Simultaneously learning feature-wise weights and local structures for multi-view subspace clustering. Knowl Based Syst 2020. [DOI: 10.1016/j.knosys.2020.106280]
23. Xue X, Zhang X, Feng X, Sun H, Chen W, Liu Z. Robust subspace clustering based on non-convex low-rank approximation and adaptive kernel. Inf Sci (N Y) 2020. [DOI: 10.1016/j.ins.2019.10.058]
24. Ren Z, Li H, Yang C, Sun Q. Multiple kernel subspace clustering with local structural graph and low-rank consensus kernel learning. Knowl Based Syst 2020. [DOI: 10.1016/j.knosys.2019.105040]
25. Deng T, Ye D, Ma R, Fujita H, Xiong L. Low-rank local tangent space embedding for subspace clustering. Inf Sci (N Y) 2020. [DOI: 10.1016/j.ins.2019.08.060]