1. Li Z, Yang T, Li J. Semi-supervised attribute reduction for partially labelled multiset-valued data via a prediction label strategy. Inf Sci (N Y) 2023. DOI: 10.1016/j.ins.2023.03.127
2. Yu B, Hu Y, Kang Y, Cai M. A novel variable precision rough set attribute reduction algorithm based on local attribute significance. Int J Approx Reason 2023. DOI: 10.1016/j.ijar.2023.03.002
3. Li R, Chen H, Liu S, Li X, Li Y, Wang B. Incomplete mixed data-driven outlier detection based on local-global neighborhood information. Inf Sci (N Y) 2023. DOI: 10.1016/j.ins.2023.03.037
4. Feature selection using Information Gain and decision information in neighborhood decision system. Appl Soft Comput 2023. DOI: 10.1016/j.asoc.2023.110100
5. Qu K, Xu J, Han Z, Xu S. Maximum relevance minimum redundancy-based feature selection using rough mutual information in adaptive neighborhood rough sets. Appl Intell 2023. DOI: 10.1007/s10489-022-04398-z
6. Yang L, Qin K, Sang B, Fu C. A novel incremental attribute reduction by using quantitative dominance-based neighborhood self-information. Knowl Based Syst 2022. DOI: 10.1016/j.knosys.2022.110200
7. Huang J, Lin Y, Li J. Rule reductions of decision formal context based on mixed information. Appl Intell 2022. DOI: 10.1007/s10489-022-04194-9
8. Hu Q, Qin K, Yang H, Xue B. A novel approach to attribute reduction and rule acquisition of formal decision context. Appl Intell 2022. DOI: 10.1007/s10489-022-04139-2
9. Sparse multi-label feature selection via dynamic graph manifold regularization. Int J Mach Learn Cyb 2022. DOI: 10.1007/s13042-022-01679-4
10. Ju H, Ding W, Shi Z, Huang J, Yang J, Yang X. Attribute reduction with personalized information granularity of nearest mutual neighbors. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2022.09.006
11. Chen Y, Wang P, Yang X, Yu H. Bee: towards a robust attribute reduction. Int J Mach Learn Cyb 2022. DOI: 10.1007/s13042-022-01633-4
12. Attribute Reduction Based on Lift and Random Sampling. Symmetry (Basel) 2022. DOI: 10.3390/sym14091828
Abstract: As one of the key topics in the development of neighborhood rough sets, attribute reduction has attracted extensive attention because of its practicality and interpretability for dimension reduction and feature selection. Although random sampling strategies have been introduced into attribute reduction to avoid overfitting, uncontrolled sampling may still reduce the efficiency of searching for a reduct. By exploiting the inherent characteristics of each label, the Multi-label learning with Label specIfic FeaTures (Lift) algorithm can improve the performance of mathematical modelling; it is therefore used here to guide the sampling and reduce its uncontrollability. This paper proposes an attribute reduction algorithm based on Lift and random sampling, called ARLRS, which aims to improve the efficiency of searching for a reduct. First, the Lift algorithm chooses samples from the dataset as members of the first group, and the reduct of that group is calculated. Second, a random sampling strategy divides the remaining samples into groups with a symmetric structure. Finally, the reducts are calculated group by group, guided by maintaining the reducts' classification performance. Compared with five other rough-set-based attribute reduction strategies over 17 University of California Irvine (UCI) datasets, the experimental results show that: (1) ARLRS significantly reduces the time consumed in searching for a reduct; (2) the reduct derived by ARLRS provides satisfactory performance in classification tasks.
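The group-by-group search this abstract describes can be sketched roughly as follows. This is an illustrative sketch only: the `dependency` function below is a toy discernibility measure standing in for the rough-set dependency degree, and ARLRS's Lift-guided choice of the first group is replaced by a plain seeded shuffle.

```python
import random

def dependency(attrs, rows, labels):
    # Toy discernibility measure: the fraction of sample pairs that are
    # either told apart by the chosen attributes or share a label
    # (a stand-in for the rough-set dependency degree used in ARLRS).
    pairs = consistent = 0
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            pairs += 1
            same = all(rows[i][a] == rows[j][a] for a in attrs)
            if not same or labels[i] == labels[j]:
                consistent += 1
    return consistent / pairs if pairs else 1.0

def groupwise_reduct(rows, labels, n_groups=3, seed=0):
    # Split the samples into groups (a plain random shuffle here, where
    # ARLRS would pick the first group via Lift), then grow one shared
    # reduct group by group until each group's dependency matches that
    # of the full attribute set.
    rng = random.Random(seed)
    idx = list(range(len(rows)))
    rng.shuffle(idx)
    all_attrs = set(range(len(rows[0])))
    reduct = set()
    for k in range(n_groups):
        g = idx[k::n_groups]
        g_rows = [rows[i] for i in g]
        g_labels = [labels[i] for i in g]
        target = dependency(all_attrs, g_rows, g_labels)
        while dependency(reduct, g_rows, g_labels) < target:
            best = max(all_attrs - reduct,
                       key=lambda a: dependency(reduct | {a}, g_rows, g_labels))
            reduct.add(best)
    return reduct
```

Because each group only needs to *maintain* the dependency already reached, later groups typically trigger few or no additions, which is where the claimed time savings come from.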
13. Ding W, Qin T, Shen X, Ju H, Wang H, Huang J, Li M. Parallel incremental efficient attribute reduction algorithm based on attribute tree. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2022.08.044
14. Zhang X, Jiang Z, Xu W. Feature selection using a weighted method in interval-valued decision information systems. Appl Intell 2022. DOI: 10.1007/s10489-022-03987-2
15. Unsupervised attribute reduction: improving effectiveness and efficiency. Int J Mach Learn Cyb 2022. DOI: 10.1007/s13042-022-01618-3
16. Feature selection for set-valued data based on D–S evidence theory. Artif Intell Rev 2022. DOI: 10.1007/s10462-022-10241-1
17. Yang X, Yang Y, Luo J, Liu D, Li T. A unified incremental updating framework of attribute reduction for two-dimensionally time-evolving data. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2022.04.026
18. Liu K, Li T, Yang X, Ju H, Yang X, Liu D. Hierarchical neighborhood entropy based multi-granularity attribute reduction with application to gene prioritization. Int J Approx Reason 2022. DOI: 10.1016/j.ijar.2022.05.011
20. A Distributed Attribute Reduction Algorithm for High-Dimensional Data under the Spark Framework. Int J Comput Int Sys 2022. DOI: 10.1007/s44196-022-00076-7
Abstract: Attribute reduction is an important issue in rough set theory, but rough-set-based attribute reduction algorithms need improvement to deal with high-dimensional data: a distributed version of the algorithm is necessary for it to handle big data effectively, and the partition of the attribute space is an important research direction. In this paper, a distributed attribute reduction algorithm based on cosine similarity (DARCS) for high-dimensional data pre-processing under the Spark framework is proposed. First, to avoid repeated calculation over similar attributes, the algorithm gathers similar attributes into clusters according to a similarity measure; one attribute is then selected at random from each cluster as its representative, and these representatives form the candidate attribute subset for the subsequent reduction operation. At the same time, to improve computing efficiency, an improved method is introduced to calculate attribute dependency in the divided sub-attribute space. Experiments on eight datasets show that, without losing critical information, DARCS improves reduction ability by 0.32% to 39.61% and computing efficiency by 31.32% to 93.79% compared to a distributed attribute reduction algorithm based on random partitioning of the attribute space.
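The attribute-clustering step this abstract describes can be sketched as below. The greedy one-pass clustering, the `threshold` value, and the function names are illustrative assumptions, not the paper's implementation (which runs distributed under Spark); the sketch only shows how cosine similarity prunes near-duplicate attribute columns before reduction.

```python
import math

def cosine(u, v):
    # Cosine similarity between two attribute columns.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def candidate_attributes(columns, threshold=0.95):
    # Greedy one-pass clustering: an attribute joins the first cluster
    # whose representative it resembles closely enough; otherwise it
    # opens a new cluster.  One representative per cluster forms the
    # candidate subset handed on to the reduction step.
    reps, candidates = [], []
    for j, col in enumerate(columns):
        for rep in reps:
            if cosine(col, rep) >= threshold:
                break  # near-duplicate of an existing representative
        else:
            reps.append(col)
            candidates.append(j)
    return candidates
```

For example, a column that is a scalar multiple of an earlier one has cosine similarity 1.0 and is dropped, so only one member of each correlated family reaches the (expensive) dependency computation.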
22. Fan X, Chen X, Wang C, Wang Y, Zhang Y. Margin attribute reductions for multi-label classification. Appl Intell 2022. DOI: 10.1007/s10489-021-02740-5
23. Incremental feature selection by sample selection and feature-based accelerator. Appl Soft Comput 2022. DOI: 10.1016/j.asoc.2022.108800
27. Lu Y, Song J, Wang P, Xu T. Label-specific guidance for efficiently searching reduct. Journal of Intelligent & Fuzzy Systems 2022. DOI: 10.3233/jifs-213112
Abstract: In the era of big data, two fundamental perspectives of Granular Computing can be used to design efficient strategies for deriving reducts in attribute reduction/rough-set-based feature selection and thereby reduce the dimensions of data: breaking the whole into pieces and gathering parts into a whole. From this point of view, a novel strategy named label-specific guidance is introduced into the process of searching for a reduct. Given a formal description of attribute reduction, the corresponding constraint is divided into several label-specific constraints. A sequence of these label-specific constraints is thus obtained, so the reduct related to one label-specific constraint can guide the computation of the reduct related to the next. This label-specific guidance runs through the whole search until the reduct over the whole universe is derived. Compared with five state-of-the-art algorithms over 20 datasets, the experimental results demonstrate that the proposed acceleration strategy not only significantly accelerates the search for a reduct but also offers justifiable performance in the task of classification. This study suggests a new direction for quickly deriving reducts.
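The seeding idea in this abstract can be reduced to a very small sketch. Everything here is hypothetical scaffolding: `satisfies` is an abstract stand-in for a label-specific constraint check, and the greedy attribute order is arbitrary; the point is only that the reduct satisfying one label's constraint is the starting point for the next label's search.

```python
def label_guided_reduct(attrs, label_constraints, satisfies):
    # attrs: candidate attributes, in evaluation order.
    # label_constraints: one constraint per label, processed in sequence.
    # satisfies(subset, constraint) -> bool.
    # The reduct meeting one label's constraint seeds the search for the
    # next label's constraint, so later searches never start from
    # scratch -- this is the "guidance" between labels.  Assumes the
    # full attribute set satisfies every constraint.
    reduct = set()
    for c in label_constraints:
        for a in attrs:
            if satisfies(reduct, c):
                break
            if a not in reduct:
                reduct.add(a)
    return reduct
```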
Affiliations: Yu Lu, Jingjing Song, Pingxin Wang, and Taihua Xu, School of Computer, Jiangsu University of Science and Technology, Zhenjiang, Jiangsu, China.
28. Abstract: Attribute reduction is a critical topic in rough set theory. Currently, to further enhance the stability of the derived reduct, various attribute selectors are designed within the framework of ensemble selectors. Nevertheless, these selectors have two hidden limitations: (1) they rely heavily on the distribution of samples; (2) they rely heavily on the optimal attribute. To generate reducts with higher stability, a novel beam-influenced selector (BIS) is designed based on the strategies of random partition and beam search. Its scientific novelty lies in two aspects: (1) samples are partitioned randomly, without regard to their distribution; (2) beam-based selection of features frees the selector from dependence on the optimal attribute. Comprehensive experiments on 16 UCI datasets show the following: (1) the stability of the derived reducts can be significantly enhanced by the proposed selector; (2) the reducts it generates provide competent performance in classification tasks.
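The beam idea in this abstract, as opposed to a plain greedy search, can be sketched as follows. The subset scoring, the stopping rule, and the function names are assumptions for illustration; only the beam mechanics (keep the `beam_width` best partial subsets each round instead of the single best) reflect the strategy the abstract names.

```python
def beam_reduct(n_attrs, evaluate, target, beam_width=2):
    # Beam search over attribute subsets: instead of keeping only the
    # single best candidate per round (which makes the result hinge on
    # one "optimal" attribute), keep the beam_width best subsets, expand
    # each by one attribute, and stop at the first subset whose score
    # reaches the target.
    beam = [frozenset()]
    while True:
        expanded = {s | {a} for s in beam
                    for a in range(n_attrs) if a not in s}
        ranked = sorted(expanded, key=evaluate, reverse=True)[:beam_width]
        if not ranked:
            # No subset can be expanded further: target was unreachable;
            # return the best subset found so far.
            return max(beam, key=evaluate)
        for subset in ranked:
            if evaluate(subset) >= target:
                return subset
        beam = ranked
```

With `beam_width=1` this degenerates to the usual greedy forward search; widening the beam is what buys the stability the abstract claims, at the cost of evaluating more candidates per round.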
29. Multigranulation double-quantitative decision-theoretic rough sets based on logical operations. Int J Mach Learn Cyb 2022. DOI: 10.1007/s13042-021-01476-5
30. Xu J, Yang J, Ma Y, Qu K, Kang Y. Feature selection method for color image steganalysis based on fuzzy neighborhood conditional entropy. Appl Intell 2022. DOI: 10.1007/s10489-021-02923-0
31. Attribute reduction based on overlap degree and k-nearest-neighbor rough sets in decision information systems. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2021.10.063
32. Liu K, Li T, Yang X, Yang X, Liu D, Zhang P, Wang J. Granular cabin: An efficient solution to neighborhood learning in big data. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2021.11.034
33. Chen Z, Liu K, Yang X, Fujita H. Random sampling accelerator for attribute reduction. Int J Approx Reason 2022. DOI: 10.1016/j.ijar.2021.09.016
35. Wang J, Ma X, Dai J, Zhan J. A novel three-way decision approach under hesitant fuzzy information. Inf Sci (N Y) 2021. DOI: 10.1016/j.ins.2021.07.054
36. Chen Y, Wang P, Yang X, Mi J, Liu D. Granular ball guided selector for attribute reduction. Knowl Based Syst 2021. DOI: 10.1016/j.knosys.2021.107326
37. Ju H, Ding W, Yang X, Fujita H, Xu S. Robust supervised rough granular description model with the principle of justifiable granularity. Appl Soft Comput 2021. DOI: 10.1016/j.asoc.2021.107612
38. Wang J, Liang J, Cui J, Liang J. Semi-supervised learning with mixed-order graph convolutional networks. Inf Sci (N Y) 2021. DOI: 10.1016/j.ins.2021.05.057
40. Deng J, Zhan J, Wu WZ. A three-way decision methodology to multi-attribute decision-making in multi-scale decision information systems. Inf Sci (N Y) 2021. DOI: 10.1016/j.ins.2021.03.058
41. Cost-sensitive feature selection on multi-label data via neighborhood granularity and label enhancement. Appl Intell 2021. DOI: 10.1007/s10489-020-01993-w
42. Abstract: Attribute reduction is commonly regarded as a key topic in rough set research. Concerning strategies for searching for a reduct, although various heuristic forward greedy searches have been developed, most of them pursue one and only one characteristic closely related to the performance of the reduct. Nevertheless, a justifiable search should explicitly involve three main characteristics: (1) obtaining the reduct with low time consumption; (2) generating a reduct with high stability; (3) acquiring a reduct with competent classification ability. To fill this gap, a hybrid searching mechanism is designed that takes all of the above characteristics into account. The mechanism not only adopts multiple fitness functions to evaluate candidate attributes but also queries the distance between attributes to determine whether two or more attributes can be added to the reduct simultaneously. The former helps derive reducts with higher stability and competent classification ability, and the latter contributes to lower time consumption. Compared with five state-of-the-art algorithms for searching for a reduct, the experimental results over 20 UCI datasets demonstrate the effectiveness of the new mechanism. This study suggests a new trend of attribute reduction that balances multiple characteristics.
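One round of the hybrid mechanism this abstract outlines might look like the sketch below. The rank-sum aggregation of the fitness functions and the distance gate are my own simplifications for illustration; the abstract does not specify how fitness values are combined or how the distance threshold is chosen.

```python
def rank_by_fitness(candidates, fitness_fns):
    # Aggregate several fitness functions by summed rank
    # (lower total rank = better overall candidate).
    total = {a: 0 for a in candidates}
    for fit in fitness_fns:
        for r, a in enumerate(sorted(candidates, key=fit, reverse=True)):
            total[a] += r
    return sorted(candidates, key=lambda a: total[a])

def batch_add(candidates, fitness_fns, distance, min_dist, batch=2):
    # One round of the hybrid mechanism: walk the aggregated ranking and
    # accept up to `batch` attributes at once, taking a newcomer only if
    # it is at least min_dist away from every attribute already accepted
    # in this round (distant attributes are treated as non-redundant).
    chosen = []
    for a in rank_by_fitness(candidates, fitness_fns):
        if all(distance(a, b) >= min_dist for b in chosen):
            chosen.append(a)
            if len(chosen) == batch:
                break
    return chosen
```

Accepting several mutually distant attributes per round is what cuts the number of evaluation rounds, while the multi-fitness ranking is what the abstract credits for stability and classification ability.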
44. Jiang Z, Liu K, Song J, Yang X, Li J, Qian Y. Accelerator for crosswise computing reduct. Appl Soft Comput 2021. DOI: 10.1016/j.asoc.2020.106740
45. Jiang Z, Dou H, Song J, Wang P, Yang X, Qian Y. Data-guided multi-granularity selector for attribute reduction. Appl Intell 2020. DOI: 10.1007/s10489-020-01846-6