1. Yu B, Hu Y, Kang Y, Cai M. A novel variable precision rough set attribute reduction algorithm based on local attribute significance. Int J Approx Reason 2023. DOI: 10.1016/j.ijar.2023.03.002
2. Chen Y, Wang P, Yang X, Yu H. Bee: towards a robust attribute reduction. Int J Mach Learn Cyb 2022. DOI: 10.1007/s13042-022-01633-4
3. Topological reduction algorithm for relation systems. Soft Comput 2022. DOI: 10.1007/s00500-022-07431-y
4. Yang X, Yang Y, Luo J, Liu D, Li T. A unified incremental updating framework of attribute reduction for two-dimensionally time-evolving data. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2022.04.026
5. Lu Y, Song J, Wang P, Xu T. Label-specific guidance for efficiently searching reduct. J Intell Fuzzy Syst 2022. DOI: 10.3233/jifs-213112
Abstract
In the era of big data, when exploring attribute reduction (rough-set-based feature selection) problems and designing efficient strategies for deriving reducts to reduce the dimensionality of data, two fundamental perspectives of Granular Computing may be taken into account: breaking the whole into pieces and gathering parts into a whole. From this point of view, a novel strategy named label-specific guidance is introduced into the process of searching for a reduct. Given a formal description of attribute reduction, we divide the corresponding constraint into several label-specific constraints. A sequence of these label-specific constraints is thereby obtained, and the reduct related to a previous label-specific constraint can guide the computation of the reduct related to the subsequent one. This label-specific guidance runs through the whole search until the reduct over the whole universe is derived. Compared with five state-of-the-art algorithms over 20 data sets, the experimental results demonstrate that our proposed acceleration strategy not only significantly accelerates the search for a reduct but also offers justifiable performance in classification tasks. This study suggests a new direction for quickly deriving reducts.
Affiliations: Yu Lu, Jingjing Song, Pingxin Wang, Taihua Xu - School of Computer, Jiangsu University of Science and Technology, Zhenjiang, Jiangsu, China
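As a rough illustration of the guidance idea described in the abstract of entry 5, the sketch below runs a greedy, positive-region-based reduct search one decision label at a time, letting the attribute set accumulated for earlier labels seed the search for later ones. It assumes a simple equivalence-class rough set model over categorical data; all function names are illustrative and do not come from the cited paper.

```python
# A rough sketch of label-by-label guided reduct search, assuming a simple
# equivalence-class (indiscernibility) rough set model over categorical data.
# All names here are illustrative and do not come from the cited paper.
from collections import defaultdict

def partition(data, attrs):
    """Group object indices by their values on the selected attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(data):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def label_positive_region(data, labels, attrs, label):
    """Objects whose equivalence class lies entirely inside the given label."""
    pos = set()
    for block in partition(data, attrs):
        if all(labels[i] == label for i in block):
            pos.update(block)
    return pos

def label_guided_reduct(data, labels):
    """Satisfy one label-specific constraint at a time; the attribute set that
    satisfies earlier labels seeds the greedy search for later labels."""
    n_attr = len(data[0])
    reduct = set()
    for label in sorted(set(labels)):
        target = label_positive_region(data, labels, range(n_attr), label)
        while label_positive_region(data, labels, reduct, label) != target:
            best = max(
                (a for a in range(n_attr) if a not in reduct),
                key=lambda a: len(label_positive_region(data, labels, reduct | {a}, label)),
            )
            reduct.add(best)
    return reduct

if __name__ == "__main__":
    X = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1), (0, 0, 0)]
    y = [0, 0, 1, 1, 0]
    print(sorted(label_guided_reduct(X, y)))   # e.g. [0]
```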
6. Xie J, Hu BQ, Jiang H. A novel method to attribute reduction based on weighted neighborhood probabilistic rough sets. Int J Approx Reason 2022. DOI: 10.1016/j.ijar.2022.01.010
7. Three-way decision model under a large-scale group decision-making environment with detecting and managing non-cooperative behaviors in consensus reaching process. Artif Intell Rev 2022. DOI: 10.1007/s10462-021-10133-w
8. Chen Y, Wang P, Yang X, Mi J, Liu D. Granular ball guided selector for attribute reduction. Knowl Based Syst 2021. DOI: 10.1016/j.knosys.2021.107326
11. Xie X, Gu X, Li Y, Ji Z. K-size partial reduct: Positive region optimization for attribute reduction. Knowl Based Syst 2021. DOI: 10.1016/j.knosys.2021.107253
14. Jiang C, Guo D, Sun L. Effectiveness measure for TAO model of three-way decisions with interval set. J Intell Fuzzy Syst 2021. DOI: 10.3233/jifs-202207
Abstract
The basic idea of three-way decisions (3WD) is 'thinking in threes.' The TAO (trisecting-acting-outcome) model of 3WD has three components: trisecting a whole into three reasonable regions, devising a corresponding strategy for the trisection, and measuring the effectiveness of the outcome. Reviewing existing studies, we found that only a few papers touch upon the third component, i.e., measuring the effect. This paper's principal aim is to present an effectiveness-measure framework consisting of three parts: a specific TAO model (the change-based TAO model), interval sets, and utility functions with distinct characteristics. The change-based TAO model measures effectiveness from the difference observed before and after applying a strategy or an action. First, we use interval sets to represent these changes, which correspond to three different intervals. Second, we attach a utility measure to each change interval: a concave utility metric, a direct (linear) utility metric, and a convex utility metric, respectively. Third, the total utility is aggregated by combining the three utilities above, with the weights among them adjusted by a dual expected utility function that conveys the decision-makers' preferences. We give an example and an experiment highlighting the validity and practicability of the utility measure in the change-based TAO model of three-way decisions.
Affiliations: Chunmao Jiang, Doudou Guo, Lijuan Sun - College of Computer Science and Information Engineering, Harbin Normal University, Harbin, Heilongjiang, China
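The sketch below illustrates, under strong simplifying assumptions, how per-region changes could be mapped through concave, direct and convex utilities and then aggregated with preference weights, as described in the abstract of entry 14. It uses point values rather than the interval sets mentioned there, and the weights and utility shapes are placeholders, not the dual expected utility construction of the paper.

```python
# An illustrative aggregation of per-region changes through concave, direct
# (linear) and convex utilities. Point values stand in for the interval sets
# used in the paper; weights and utility shapes are placeholder assumptions.
import math

def concave_utility(x):
    return math.sqrt(x)        # diminishing returns on improvement

def direct_utility(x):
    return x                   # linear, risk-neutral valuation

def convex_utility(x):
    return x ** 2              # accelerating valuation of improvement

def effectiveness(change_pos, change_bnd, change_neg, weights=(0.4, 0.3, 0.3)):
    """Weighted aggregate of the changes observed in the positive, boundary
    and negative regions after a strategy is applied (all inputs in [0, 1])."""
    utilities = (
        concave_utility(change_pos),
        direct_utility(change_bnd),
        convex_utility(change_neg),
    )
    return sum(w * u for w, u in zip(weights, utilities))

if __name__ == "__main__":
    print(round(effectiveness(0.49, 0.30, 0.10), 4))   # 0.373
```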
15. Cost-sensitive selection of variables by ensemble of model sequences. Knowl Inf Syst 2021. DOI: 10.1007/s10115-021-01551-x
16. Jafari-Marandi R. Supervised or unsupervised learning? Investigating the role of pattern recognition assumptions in the success of binary predictive prescriptions. Neurocomputing 2021. DOI: 10.1016/j.neucom.2020.12.063
17. Cost-sensitive feature selection on multi-label data via neighborhood granularity and label enhancement. Appl Intell 2021. DOI: 10.1007/s10489-020-01993-w
18. Cost-sensitive hierarchical classification via multi-scale information entropy for data with an imbalanced distribution. Appl Intell 2021. DOI: 10.1007/s10489-020-02089-1
20. Jiang Z, Liu K, Song J, Yang X, Li J, Qian Y. Accelerator for crosswise computing reduct. Appl Soft Comput 2021. DOI: 10.1016/j.asoc.2020.106740
21. A two-stage density clustering algorithm. Soft Comput 2020. DOI: 10.1007/s00500-020-05028-x
22. Mustafa H, Tantawy O. A new approach of attribute reduction of rough sets based on soft metric. J Intell Fuzzy Syst 2020. DOI: 10.3233/jifs-200457
Abstract
Attribute reduction is an important processing step in pattern recognition, machine learning and data mining. In this paper, we combine soft sets and rough sets for use in applications. We generalize the rough set model and introduce a soft metric rough set model to deal with the problem of heterogeneous numerical feature subset selection. We construct a soft metric on the family of knowledge structures based on the soft distance between attributes; the proposed model degenerates to the classical one when a zero soft real number is specified. We also provide a systematic study of attribute reduction of rough sets based on the soft metric. Using the constructed metric, we define co-information systems and consistent co-decision systems and provide a new method of attribute reduction for each system. Furthermore, we present a judgement theorem and a discernibility matrix associated with the attributes of each type of system. As an application, we present a case study on the Zoo data set to verify our theoretical results.
Affiliations: H.I. Mustafa, O.A. Tantawy - Department of Mathematics, Faculty of Science, Zagazig University, Egypt
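The abstract of entry 22 notes that the soft-metric model degenerates to the classical rough set model when the soft distance is zero. The sketch below shows that classical baseline only: a discernibility-matrix reduction on a small decision table. The brute-force search and all names are illustrative assumptions, not the paper's notation.

```python
# A minimal sketch of classical discernibility-matrix reduction, i.e. the
# crisp baseline the soft-metric model reduces to when the soft distance is
# zero. Brute-force search is used only because the example table is tiny.
from itertools import combinations

def discernibility_matrix(data, labels):
    """For each pair of objects with different decisions, record the set of
    attributes on which the two objects differ."""
    entries = []
    for i, j in combinations(range(len(data)), 2):
        if labels[i] != labels[j]:
            diff = {a for a in range(len(data[0])) if data[i][a] != data[j][a]}
            if diff:
                entries.append(diff)
    return entries

def hits_all(attrs, entries):
    """An attribute subset is sufficient if it intersects every matrix entry."""
    return all(attrs & e for e in entries)

def minimal_reduct(data, labels):
    entries = discernibility_matrix(data, labels)
    n_attr = len(data[0])
    for size in range(1, n_attr + 1):
        for subset in combinations(range(n_attr), size):
            if hits_all(set(subset), entries):
                return set(subset)
    return set(range(n_attr))

if __name__ == "__main__":
    X = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1)]
    y = [0, 0, 1, 1]
    print(sorted(minimal_reduct(X, y)))   # [0]
```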
24. Jiang Z, Dou H, Song J, Wang P, Yang X, Qian Y. Data-guided multi-granularity selector for attribute reduction. Appl Intell 2020. DOI: 10.1007/s10489-020-01846-6
25. Rao X, Yang X, Yang X, Chen X, Liu D, Qian Y. Quickly calculating reduct: An attribute relationship based approach. Knowl Based Syst 2020. DOI: 10.1016/j.knosys.2020.106014
26. Jiang Z, Liu K, Yang X, Yu H, Fujita H, Qian Y. Accelerator for supervised neighborhood based attribute reduction. Int J Approx Reason 2020. DOI: 10.1016/j.ijar.2019.12.013
27. Zhang X, Zhang Q, Cheng Y, Wang G. Optimal scale selection by integrating uncertainty and cost-sensitive learning in multi-scale decision tables. Int J Mach Learn Cyb 2020. DOI: 10.1007/s13042-020-01101-x
29. A three-way decision method in a hybrid decision information system and its application in medical diagnosis. Artif Intell Rev 2020. DOI: 10.1007/s10462-020-09805-w
31. Sowkuntla P, Sai Prasad P. MapReduce based improved quick reduct algorithm with granular refinement using vertical partitioning scheme. Knowl Based Syst 2020. DOI: 10.1016/j.knosys.2019.105104
32. Li H, Zhang L, Huang B, Zhou X. Cost-sensitive dual-bidirectional linear discriminant analysis. Inf Sci (N Y) 2020. DOI: 10.1016/j.ins.2019.09.032
33. Inclusion measure-based multi-granulation decision-theoretic rough sets in multi-scale intuitionistic fuzzy information tables. Inf Sci (N Y) 2020. DOI: 10.1016/j.ins.2018.08.061
34. Sang B, Yang L, Chen H, Xu W, Guo Y, Yuan Z. Generalized multi-granulation double-quantitative decision-theoretic rough set of multi-source information system. Int J Approx Reason 2019. DOI: 10.1016/j.ijar.2019.09.009
35. Xie X, Qin X, Zhou Q, Zhou Y, Zhang T, Janicki R, Zhao W. A novel test-cost-sensitive attribute reduction approach using the binary bat algorithm. Knowl Based Syst 2019. DOI: 10.1016/j.knosys.2019.104938
37. Zhang C, Dai J. An incremental attribute reduction approach based on knowledge granularity for incomplete decision systems. Granular Computing 2019. DOI: 10.1007/s41066-019-00173-7
42. Li W, Jia X, Wang L, Zhou B. Multi-objective attribute reduction in three-way decision-theoretic rough set model. Int J Approx Reason 2019. DOI: 10.1016/j.ijar.2018.12.008
43. Yang J, Wang G, Zhang Q, Chen Y, Xu T. Optimal granularity selection based on cost-sensitive sequential three-way decisions with rough fuzzy sets. Knowl Based Syst 2019. DOI: 10.1016/j.knosys.2018.08.019
44. Fang Y, Min F. Cost-sensitive approximate attribute reduction with three-way decisions. Int J Approx Reason 2019. DOI: 10.1016/j.ijar.2018.11.003
46. Liao S, Zhu Q, Qian Y, Lin G. Multi-granularity feature selection on cost-sensitive data with measurement errors and variable costs. Knowl Based Syst 2018. DOI: 10.1016/j.knosys.2018.05.020
48. Chang Y, Zou X, Wang F, Zhao L, Zheng W. Multi-mode plant-wide process operating performance assessment based on a novel two-level multi-block hybrid model. Chem Eng Res Des 2018. DOI: 10.1016/j.cherd.2018.05.023
49. Fan X, Zhao W, Wang C, Huang Y. Attribute reduction based on max-decision neighborhood rough set model. Knowl Based Syst 2018. DOI: 10.1016/j.knosys.2018.03.015
50
|
Yu S, Zhao H. Rough sets and Laplacian score based cost-sensitive feature selection. PLoS One 2018; 13:e0197564. [PMID: 29912884 PMCID: PMC6005488 DOI: 10.1371/journal.pone.0197564] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2017] [Accepted: 12/10/2017] [Indexed: 12/02/2022] Open
Abstract
Cost-sensitive feature selection learning is an important preprocessing step in machine learning and data mining. Recently, most existing cost-sensitive feature selection algorithms are heuristic algorithms, which evaluate the importance of each feature individually and select features one by one. Obviously, these algorithms do not consider the relationship among features. In this paper, we propose a new algorithm for minimal cost feature selection called the rough sets and Laplacian score based cost-sensitive feature selection. The importance of each feature is evaluated by both rough sets and Laplacian score. Compared with heuristic algorithms, the proposed algorithm takes into consideration the relationship among features with locality preservation of Laplacian score. We select a feature subset with maximal feature importance and minimal cost when cost is undertaken in parallel, where the cost is given by three different distributions to simulate different applications. Different from existing cost-sensitive feature selection algorithms, our algorithm simultaneously selects out a predetermined number of “good” features. Extensive experimental results show that the approach is efficient and able to effectively obtain the minimum cost subset. In addition, the results of our method are more promising than the results of other cost-sensitive feature selection algorithms.
Collapse
Affiliation(s)
- Shenglong Yu
- Fujian Key Laboratory of Granular Computing and Application (Minnan Normal University), Zhangzhou, Fujian, China
- Key Laboratory of Data Science and Intelligence Application, Fujian Province University, Zhangzhou, Fujian, China
| | - Hong Zhao
- Fujian Key Laboratory of Granular Computing and Application (Minnan Normal University), Zhangzhou, Fujian, China
- Key Laboratory of Data Science and Intelligence Application, Fujian Province University, Zhangzhou, Fujian, China
- * E-mail:
| |
Collapse
|
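The sketch below illustrates one way the combination described in the abstract of entry 50 could look: a classical Laplacian score on a fully connected RBF graph, a crude per-feature rough-set dependency, and a cost penalty, keeping the k best features. The weighting (lambda_cost), the discretisation, and the graph construction are assumptions for illustration, not the paper's exact formulation.

```python
# A hedged sketch combining a Laplacian score, a crude rough-set dependency
# term and a cost penalty to pick k features. The RBF graph, discretisation
# and lambda_cost weighting are illustrative assumptions, not the paper's
# exact formulation.
import numpy as np

def laplacian_scores(X, t=1.0):
    """Classical Laplacian score per feature on a fully connected RBF graph
    (lower score = better locality preservation)."""
    n = X.shape[0]
    dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.exp(-dist2 / t)
    D = np.diag(S.sum(1))
    L = D - S
    ones = np.ones(n)
    scores = np.empty(X.shape[1])
    for r in range(X.shape[1]):
        f = X[:, r]
        f_tilde = f - (f @ D @ ones) / (ones @ D @ ones)
        denom = f_tilde @ D @ f_tilde
        scores[r] = (f_tilde @ L @ f_tilde) / denom if denom > 0 else np.inf
    return scores

def rough_dependency(X, y, attr, n_bins=3):
    """Fraction of objects whose discretised value of one feature determines
    the class unambiguously (a crude positive-region dependency)."""
    edges = np.quantile(X[:, attr], np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(X[:, attr], edges)
    pos = 0
    for b in np.unique(bins):
        block = y[bins == b]
        if len(np.unique(block)) == 1:
            pos += len(block)
    return pos / len(y)

def select_features(X, y, cost, k, lambda_cost=0.1):
    """Keep the k features with the best combined importance-minus-cost score."""
    ls = laplacian_scores(X)
    ls_norm = (ls - ls.min()) / (ls.max() - ls.min() + 1e-12)
    importance = np.array([rough_dependency(X, y, a) for a in range(X.shape[1])])
    score = importance + (1.0 - ls_norm) - lambda_cost * np.asarray(cost)
    return np.argsort(-score)[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 5))
    y = (X[:, 0] > 0).astype(int)             # feature 0 carries the class signal
    cost = [1.0, 5.0, 1.0, 1.0, 1.0]          # feature 1 is expensive to measure
    print(select_features(X, y, cost, k=2))
```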