101. Dalwinder S, Birmohan S, Manpreet K. Simultaneous feature weighting and parameter determination of Neural Networks using Ant Lion Optimization for the classification of breast cancer. Biocybern Biomed Eng 2020. doi:10.1016/j.bbe.2019.12.004
102. Heidari AA, Faris H, Mirjalili S, Aljarah I, Mafarja M. Ant Lion Optimizer: Theory, Literature Review, and Application in Multi-layer Perceptron Neural Networks. In: Nature-Inspired Optimizers 2020. doi:10.1007/978-3-030-12127-3_3
103. Tang H, Xu Y, Lin A, Heidari AA, Wang M, Chen H, Luo Y, Li C. Predicting Green Consumption Behaviors of Students Using Efficient Firefly Grey Wolf-Assisted K-Nearest Neighbor Classifiers. IEEE Access 2020. doi:10.1109/access.2020.2973763
104.
Abstract
Harris hawk optimization (HHO) is one of the recently proposed metaheuristic algorithms and has proven to work effectively on several challenging optimization tasks. However, the original HHO was developed for continuous optimization problems, not for problems with binary variables. This paper proposes a binary version of HHO (BHHO) to solve the feature selection problem in classification tasks. The proposed BHHO is equipped with an S-shaped or V-shaped transfer function to convert each continuous variable into a binary one. Moreover, another variant of HHO, namely quadratic binary Harris hawk optimization (QBHHO), is proposed to enhance the performance of BHHO. In this study, twenty-two datasets collected from the UCI machine learning repository are used to validate the performance of the proposed algorithms. A comparative study is conducted to compare the effectiveness of QBHHO with other feature selection algorithms such as binary differential evolution (BDE), the genetic algorithm (GA), the binary multi-verse optimizer (BMVO), the binary flower pollination algorithm (BFPA), and the binary salp swarm algorithm (BSSA). The experimental results show the superiority of the proposed QBHHO in terms of classification performance, feature subset size, and fitness values compared to the other algorithms.
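The abstract does not spell out the transfer functions, but S-shaped and V-shaped families are conventionally taken to be the sigmoid and the absolute hyperbolic tangent of the continuous position. The sketch below illustrates that binarization step only, under those assumptions; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def s_shaped(x):
    """S-shaped (sigmoid) transfer function."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """V-shaped (absolute tanh) transfer function."""
    return np.abs(np.tanh(x))

def binarize(position, transfer, rng):
    """Map a continuous hawk position to a 0/1 feature mask by comparing
    the transfer-function output against uniform random numbers."""
    prob = transfer(position)
    return (rng.random(position.shape) < prob).astype(int)

rng = np.random.default_rng(0)
x = rng.normal(size=8)                 # continuous position vector
print(binarize(x, s_shaped, rng))      # candidate feature subset (1 = keep feature)
print(binarize(x, v_shaped, rng))
```

With an S-shaped rule the output is usually read as the probability of a bit being 1, whereas a V-shaped rule is usually read as the probability of flipping the current bit; the sketch uses the simpler direct-threshold reading for both.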
105. A novel parameter estimation in dynamic model via fuzzy swarm intelligence and chaos theory for faults in wastewater treatment plant. Soft Comput 2019. doi:10.1007/s00500-019-04225-7
106. Feature selection using binary grey wolf optimizer with elite-based crossover for Arabic text classification. Neural Comput Appl 2019. doi:10.1007/s00521-019-04368-6
107.
108. Lee J, Park J, Kim HC, Kim DW. Competitive Particle Swarm Optimization for Multi-Category Text Feature Selection. Entropy 2019;21:e21060602. PMID: 33267316; PMCID: PMC7515086. doi:10.3390/e21060602
Abstract
Multi-label feature selection is an important task for text categorization, because it enables learning algorithms to focus on essential features that foreshadow relevant categories, thereby improving the accuracy of text categorization. Recent studies have considered hybridizing evolutionary feature wrappers and filters to enhance the evolutionary search process. However, the relative effectiveness of the evolutionary and filter operators in searching feature subsets has not been considered, which can degrade the final feature subsets. In this paper, we propose a novel hybridization approach based on competition between the operators. Unlike conventional methods, this enables the proposed algorithm to apply each operator selectively and to modify the feature subset according to the operator's relative effectiveness. Experimental results on 16 text datasets verify that the proposed method is superior to conventional methods.
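The abstract states only that the wrapper and filter operators compete and are applied according to their relative effectiveness; the exact credit-assignment rule is not given. The sketch below is one plausible reading of that idea, and everything in it (the explore rate, the credit decay, the toy operators and fitness) is an illustrative assumption rather than the paper's method.

```python
import random

def competitive_step(subset, operators, fitness, credit, explore=0.1, decay=0.9):
    """Apply the operator with the higher running credit (occasionally a random
    one for exploration), and update its credit with the fitness gain it produced."""
    name = (random.choice(list(operators)) if random.random() < explore
            else max(credit, key=credit.get))
    candidate = operators[name](subset)
    gain = fitness(candidate) - fitness(subset)
    credit[name] = decay * credit[name] + (1 - decay) * gain
    return (candidate if gain > 0 else subset), credit

# Toy usage: reward subsets of even-indexed features (stand-in for a real wrapper score).
fitness = lambda s: sum(1 for f in s if f % 2 == 0) - 0.01 * len(s)
operators = {
    "wrapper": lambda s: s | {random.randrange(20)},                    # add a random feature
    "filter":  lambda s: s - {random.choice(sorted(s))} if s else s,    # drop one feature
}
subset, credit = {1, 2, 3}, {"wrapper": 0.0, "filter": 0.0}
for _ in range(50):
    subset, credit = competitive_step(subset, operators, fitness, credit)
print(sorted(subset), credit)
```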
109.
Abstract
Feature selection is known to be an NP-hard combinatorial problem in which the number of possible feature subsets grows exponentially with the number of features, so exhaustive search quickly becomes impractical. In addition, a feature set normally includes irrelevant, redundant, and relevant information. Therefore, in this paper, binary variants of a competitive swarm optimizer are proposed for wrapper feature selection. The proposed approaches are used to select a subset of significant features for classification purposes. The binary versions introduced here employ S-shaped and V-shaped transfer functions, which allow the search agents to move in a binary search space. The proposed approaches are tested on 15 benchmark datasets collected from the UCI machine learning repository, and the results are compared with other conventional feature selection methods. Our results demonstrate the capability of the proposed binary competitive swarm optimizer not only in terms of high classification performance but also in terms of low computational cost.
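The abstract pairs a competitive swarm optimizer with transfer functions but gives no update equations. The sketch below follows the commonly cited CSO scheme (random pairwise competitions in which the loser learns from the winner and from the swarm mean) combined with a V-shaped bit-flip rule, so every formula here is an assumption rather than the paper's exact variant.

```python
import numpy as np

def binary_cso_step(swarm, velocity, fitness, rng):
    """One generation of a simplified binary competitive swarm optimizer.

    Particles are paired at random; in each pair the loser updates its velocity
    toward the winner and the swarm mean, and a V-shaped transfer of that
    velocity gives per-bit flip probabilities. Winners survive unchanged.
    `fitness` is maximized and `swarm` holds 0/1 vectors.
    """
    n, d = swarm.shape
    mean = swarm.mean(axis=0)
    scores = np.array([fitness(p) for p in swarm])
    order = rng.permutation(n)
    for i, j in zip(order[0::2], order[1::2]):
        win, lose = (i, j) if scores[i] >= scores[j] else (j, i)
        r1, r2, r3 = rng.random((3, d))
        velocity[lose] = (r1 * velocity[lose]
                          + r2 * (swarm[win] - swarm[lose])
                          + r3 * (mean - swarm[lose]))
        flip = np.abs(np.tanh(velocity[lose]))          # V-shaped transfer
        swarm[lose] = np.where(rng.random(d) < flip, 1 - swarm[lose], swarm[lose])
    return swarm, velocity

# Toy run: prefer subsets that keep the first half of the features.
rng = np.random.default_rng(1)
swarm = rng.integers(0, 2, size=(10, 12))
velocity = np.zeros((10, 12))
fit = lambda p: p[:6].sum() - 0.2 * p[6:].sum()
for _ in range(30):
    swarm, velocity = binary_cso_step(swarm, velocity, fit, rng)
print(swarm[np.argmax([fit(p) for p in swarm])])
```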
110. A New Co-Evolution Binary Particle Swarm Optimization with Multiple Inertia Weight Strategy for Feature Selection. Informatics 2019. doi:10.3390/informatics6020021
Abstract
Feature selection is the task of choosing the combination of features that best describes the target concept in a classification process. However, selecting such relevant features becomes difficult when a large number of features is involved. Therefore, this study aims to solve the feature selection problem using binary particle swarm optimization (BPSO). Nevertheless, BPSO suffers from premature convergence and is sensitive to the setting of the inertia weight. Hence, a new co-evolution binary particle swarm optimization with a multiple inertia weight strategy (CBPSO-MIWS) is proposed in this work. The proposed method is validated on ten benchmark datasets from the UCI machine learning repository. To examine its effectiveness, four recent and popular feature selection methods, namely BPSO, the genetic algorithm (GA), the binary gravitational search algorithm (BGSA), and the competitive binary grey wolf optimizer (CBGWO), are used in a performance comparison. Our results show that CBPSO-MIWS achieves competitive performance in feature selection, which makes it suitable for application in engineering, rehabilitation, and clinical areas.
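The abstract names a multiple inertia weight strategy without listing the individual schedules. A minimal sketch of the idea, using three commonly used inertia-weight rules as stand-ins (the actual set used by CBPSO-MIWS may differ), is:

```python
import numpy as np

# Three illustrative inertia-weight schedules; the paper's exact set is an assumption.
def w_linear(t, T, _state):    # linearly decreasing from 0.9 to 0.4
    return 0.9 - 0.5 * t / T
def w_random(t, T, _state):    # random inertia weight in [0.5, 1.0)
    return 0.5 + 0.5 * np.random.rand()
def w_chaotic(t, T, state):    # logistic-map chaotic inertia weight
    state["z"] = 4.0 * state["z"] * (1.0 - state["z"])
    return 0.4 + 0.5 * state["z"]

SCHEDULES = [w_linear, w_random, w_chaotic]

def velocity_update(v, x, pbest, gbest, t, T, group, state, c1=2.0, c2=2.0):
    """BPSO velocity update in which each co-evolving sub-swarm (group index)
    draws its inertia weight from a different schedule."""
    w = SCHEDULES[group % len(SCHEDULES)](t, T, state)
    r1, r2 = np.random.rand(2)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

# Example: a particle in sub-swarm 2 uses the chaotic schedule.
state = {"z": 0.7}
v = velocity_update(np.zeros(5), np.ones(5), np.ones(5), np.zeros(5),
                    t=10, T=100, group=2, state=state)
print(v)
```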
111. Li LL, Zhang XB, Tseng ML, Zhou YT. Optimal scale Gaussian process regression model in Insulated Gate Bipolar Transistor remaining life prediction. Appl Soft Comput 2019. doi:10.1016/j.asoc.2019.02.035
112. Anand P, Arora S. A novel chaotic selfish herd optimizer for global optimization and feature selection. Artif Intell Rev 2019. doi:10.1007/s10462-019-09707-6
113. Pourpanah F, Lim CP, Wang X, Tan CJ, Seera M, Shi Y. A hybrid model of fuzzy min–max and brain storm optimization for feature selection and data classification. Neurocomputing 2019. doi:10.1016/j.neucom.2019.01.011
114. Tu Q, Chen X, Liu X. Multi-strategy ensemble grey wolf optimizer and its application to feature selection. Appl Soft Comput 2019. doi:10.1016/j.asoc.2018.11.047
115. Emary E, Zawbaa HM, Sharawi M. Impact of Lèvy flight on modern meta-heuristic optimizers. Appl Soft Comput 2019. doi:10.1016/j.asoc.2018.11.033
116. Hussien AG, Hassanien AE, Houssein EH, Bhattacharyya S, Amin M. S-shaped Binary Whale Optimization Algorithm for Feature Selection. In: Recent Trends in Signal and Image Processing 2019. doi:10.1007/978-981-10-8863-6_9
117. Feature Selection Using Chaotic Salp Swarm Algorithm for Data Classification. Arab J Sci Eng 2018. doi:10.1007/s13369-018-3680-6
118. Zawbaa HM, Schiano S, Perez-Gandarillas L, Grosan C, Michrafy A, Wu CY. Computational intelligence modelling of pharmaceutical tabletting processes using bio-inspired optimization algorithms. Adv Powder Technol 2018. doi:10.1016/j.apt.2018.11.008
119. Aljarah I, Mafarja M, Heidari AA, Faris H, Zhang Y, Mirjalili S. Asynchronous accelerating multi-leader salp chains for feature selection. Appl Soft Comput 2018. doi:10.1016/j.asoc.2018.07.040
120. An efficient binary Salp Swarm Algorithm with crossover scheme for feature selection problems. Knowl Based Syst 2018. doi:10.1016/j.knosys.2018.05.009
121. Dinkar SK, Deep K. Accelerated Opposition-Based Antlion Optimizer with Application to Order Reduction of Linear Time-Invariant Systems. Arab J Sci Eng 2018. doi:10.1007/s13369-018-3370-4
122. Yu S, Zhao H. Rough sets and Laplacian score based cost-sensitive feature selection. PLoS One 2018;13:e0197564. PMID: 29912884; PMCID: PMC6005488. doi:10.1371/journal.pone.0197564
Abstract
Cost-sensitive feature selection is an important preprocessing step in machine learning and data mining. Most existing cost-sensitive feature selection algorithms are heuristic: they evaluate the importance of each feature individually and select features one by one, and therefore do not consider the relationships among features. In this paper, we propose a new algorithm for minimal-cost feature selection, called rough sets and Laplacian score based cost-sensitive feature selection. The importance of each feature is evaluated by both rough sets and the Laplacian score. Compared with heuristic algorithms, the proposed algorithm takes the relationships among features into consideration through the locality preservation of the Laplacian score. We select a feature subset with maximal feature importance and minimal cost, where the costs are drawn from three different distributions to simulate different applications. Unlike existing cost-sensitive feature selection algorithms, our algorithm selects a predetermined number of "good" features simultaneously. Extensive experimental results show that the approach is efficient and effectively obtains a minimum-cost subset. In addition, the results of our method are more promising than those of other cost-sensitive feature selection algorithms. A sketch of the Laplacian score computation is given after this entry's affiliations.
Affiliations
- Shenglong Yu: Fujian Key Laboratory of Granular Computing and Application (Minnan Normal University), Zhangzhou, Fujian, China; Key Laboratory of Data Science and Intelligence Application, Fujian Province University, Zhangzhou, Fujian, China
- Hong Zhao: Fujian Key Laboratory of Granular Computing and Application (Minnan Normal University), Zhangzhou, Fujian, China; Key Laboratory of Data Science and Intelligence Application, Fujian Province University, Zhangzhou, Fujian, China
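Entry 122 above ranks features by combining rough sets with the Laplacian score; the rough-set measure and the cost model are not detailed in the abstract, so the sketch below shows only the standard Laplacian score computation (k-NN graph with heat-kernel weights), which may differ in detail from the variant used in the paper.

```python
import numpy as np

def laplacian_scores(X, k=5, sigma=1.0):
    """Standard unsupervised Laplacian score per feature (lower = the feature
    better preserves the local structure of a k-NN similarity graph)."""
    n, m = X.shape
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)   # squared distances
    W = np.exp(-d2 / sigma)                                    # heat-kernel weights
    neighbors = np.argsort(d2, axis=1)[:, 1:k + 1]             # k nearest (excluding self)
    mask = np.zeros((n, n), dtype=bool)
    mask[np.repeat(np.arange(n), k), neighbors.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)                        # sparsify, symmetrize
    D = W.sum(axis=1)
    L = np.diag(D) - W                                         # graph Laplacian
    scores = np.empty(m)
    for r in range(m):
        f = X[:, r] - (X[:, r] @ D) / D.sum()                  # D-weighted centering
        scores[r] = (f @ L @ f) / max((f * f) @ D, 1e-12)
    return scores

# Toy usage: the first feature follows the cluster structure, the second is noise.
rng = np.random.default_rng(0)
X = np.column_stack([np.r_[rng.normal(0, .1, 20), rng.normal(3, .1, 20)],
                     rng.normal(size=40)])
print(laplacian_scores(X))   # expect a lower score for the structured feature
```

In the paper this score is further combined with a rough-set-based importance measure and per-feature costs before the subset is chosen; that combination step is omitted here.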
123. Hybrid binary ant lion optimizer with rough set and approximate entropy reducts for feature selection. Soft Comput 2018. doi:10.1007/s00500-018-3282-y
124. Evolutionary Population Dynamics and Grasshopper Optimization approaches for feature selection problems. Knowl Based Syst 2018. doi:10.1016/j.knosys.2017.12.037
125.
126. Ensemble of Filter-Based Rankers to Guide an Epsilon-Greedy Swarm Optimizer for High-Dimensional Feature Subset Selection. Information 2017. doi:10.3390/info8040152
127. Zhao XS, Bao LL, Ning Q, Ji JC, Zhao XW. An Improved Binary Differential Evolution Algorithm for Feature Selection in Molecular Signatures. Mol Inform 2017;37:e1700081. PMID: 29106044. doi:10.1002/minf.201700081
Abstract
The discovery of biomarkers from high-dimensional data is a very challenging task in cancer diagnosis. On the one hand, biomarker discovery is a so-called high-dimensional, small-sample problem; on the other hand, these data are redundant and noisy. In recent years, biomarker discovery from high-throughput biological data has become an increasingly important topic in bioinformatics. In this study, we propose a binary differential evolution algorithm for feature selection. Firstly, we suggest a two-stage approach in which three filter methods (the Fisher score, the T-statistic, and information gain) are used to generate the feature pool that is input to differential evolution (DE). Secondly, to improve the performance of differential evolution for feature selection, a new variant of binary DE, called BDE, is proposed. Three optimization strategies are incorporated into BDE: the first is a heuristic initialization method, the second is self-adaptive parameter control, and the third is a minimum change value that improves the exploration behaviour and thus enhances diversity. Finally, a support vector machine (SVM) is used as the classifier with 10-fold cross-validation. Experimental results on several benchmark datasets demonstrate the effectiveness of the algorithm. In addition, the BDE developed in this study shows great potential for feature selection problems. A sketch of the first-stage filter ranking is given after this entry's affiliations.
Affiliations
- X S Zhao, L L Bao, Q Ning, J C Ji: School of Computer Science and Information Technology, Northeast Normal University, Changchun 130000, P.R. China
- X W Zhao: School of Computer Science and Information Technology, Northeast Normal University, Changchun 130000, P.R. China; Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
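Entry 127 above uses a two-stage approach whose first stage ranks features with filter methods before the binary DE search. The sketch below shows one of the named filters, the Fisher score, and how a top-ranked pool might be handed to stage two; the pool size and function names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher score per feature: between-class scatter divided by
    within-class scatter (larger = more discriminative)."""
    mu = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / np.maximum(within, 1e-12)

def filter_pool(X, y, pool_size=30):
    """Stage one: indices of the top-ranked features, handed to the
    binary DE wrapper as its reduced search space (stage two)."""
    return np.argsort(fisher_scores(X, y))[::-1][:pool_size]

# Toy usage: only feature 0 separates the two classes.
rng = np.random.default_rng(0)
y = np.array([0] * 25 + [1] * 25)
X = rng.normal(size=(50, 10))
X[:, 0] += 3 * y
print(filter_pool(X, y, pool_size=3))   # feature 0 should rank first
```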
128. Mafarja MM, Mirjalili S. Hybrid Whale Optimization Algorithm with simulated annealing for feature selection. Neurocomputing 2017. doi:10.1016/j.neucom.2017.04.053
129. Ma B, Xia Y. A tribe competition-based genetic algorithm for feature selection in pattern classification. Appl Soft Comput 2017. doi:10.1016/j.asoc.2017.04.042
130.
131.
132. Abd Elaziz ME, Ewees AA, Oliva D, Duan P, Xiong S. A Hybrid Method of Sine Cosine Algorithm and Differential Evolution for Feature Selection. In: Neural Information Processing 2017:145-155. doi:10.1007/978-3-319-70139-4_15