1. Quadir A, Tanveer M. Multiview learning with twin parametric margin SVM. Neural Netw 2024;180:106598. doi: 10.1016/j.neunet.2024.106598. PMID: 39173204.
Abstract
Multiview learning (MVL) seeks to leverage the benefits of diverse perspectives to complement each other, effectively extracting and utilizing the latent information within the dataset. Several twin support vector machine-based MVL (MvTSVM) models have been introduced and have demonstrated outstanding performance in various learning tasks. However, MvTSVM-based models face significant challenges: computational complexity due to four matrix inversions, the need to reformulate optimization problems in order to employ kernel-generated surfaces for non-linear cases, and the restrictive assumption of uniform noise in the training data. These challenges become even more pronounced when the data possesses a heteroscedastic error structure. In view of these challenges, we propose the multiview twin parametric margin support vector machine (MvTPMSVM). MvTPMSVM constructs parametric margin hyperplanes corresponding to both classes, aiming to regulate and manage the impact of the heteroscedastic noise structure within the data. The proposed MvTPMSVM model avoids the explicit computation of matrix inversions in the dual formulation, leading to enhanced computational efficiency. We perform an extensive assessment of the MvTPMSVM model using UCI, KEEL, and Animals with Attributes (AwA) benchmark datasets, as well as synthetic datasets. Our experimental results, coupled with rigorous statistical analyses, confirm the superior generalization capabilities of the proposed MvTPMSVM model compared to the baseline models. The source code of the proposed MvTPMSVM model is available at https://github.com/mtanveer1/MvTPMSVM.
Affiliation(s)
- A Quadir: Department of Mathematics, Indian Institute of Technology Indore, Simrol, Indore, 453552, Madhya Pradesh, India
- M Tanveer: Department of Mathematics, Indian Institute of Technology Indore, Simrol, Indore, 453552, Madhya Pradesh, India
2. Cui Z, Ding Z, Xu J, Zhang S, Wu J, Lian W. Probabilistic sunspot predictions with a gated recurrent units-based combined model guided by pinball loss. Sci Rep 2024;14:13601. doi: 10.1038/s41598-024-63878-z. PMID: 38867068; PMCID: PMC11169250.
Abstract
Sunspots play a crucial role in both weather forecasting and the monitoring of solar storms. In this work, we propose a novel combined model for sunspot prediction using improved gated recurrent units (GRU) guided by pinball loss for probabilistic forecasts. Specifically, we optimize the GRU parameters using the slime mould algorithm and employ a seasonal-trend decomposition procedure based on loess to tackle challenges related to sequence prediction, such as self-correlations and non-stationarity. To address prediction uncertainty, we replace the traditional ℓ2-norm loss with the pinball loss. This modification extends the conventional GRU-based point forecasting to a probabilistic framework expressed as quantiles. We apply our proposed model to analyze a well-established historical sunspot dataset for both single- and multi-step ahead forecasting. The results demonstrate the effectiveness of our combined model in predicting sunspot values, surpassing the performance of other existing methods.
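The pinball loss that drives the probabilistic forecasts above is the standard quantile loss; a minimal NumPy sketch (not the paper's GRU model) shows why minimizing it yields quantiles rather than means:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Pinball loss: penalizes under-prediction with weight tau and
    over-prediction with weight (1 - tau); its minimizer is the tau-th
    conditional quantile rather than the conditional mean."""
    u = y_true - y_pred
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

# The best constant prediction under pinball loss is the empirical
# tau-quantile: with tau = 0.5 it tracks the median, ignoring the outlier.
y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
candidates = np.linspace(0.0, 100.0, 10001)
losses = [pinball_loss(y, c, tau=0.5) for c in candidates]
best = candidates[int(np.argmin(losses))]
# best is near the median (3.0), not the mean (22.0)
```

Training the same model under several values of tau gives a set of quantile forecasts, which is how a point forecaster is extended to a probabilistic one.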
Affiliation(s)
- Zhesen Cui: Department of Computer Science, Changzhi University, Changzhi, 046011, People's Republic of China
- Zhe Ding: School of Computer Science, Queensland University of Technology, Brisbane, QLD, 4001, Australia
- Jing Xu: College of Hydraulic Science and Engineering, Yangzhou University, Yangzhou, 225009, People's Republic of China
- Shaotong Zhang: Frontiers Science Center for Deep Ocean Multispheres and Earth System, Key Lab of Submarine Geosciences and Prospecting Techniques, MOE and College of Marine Geosciences, Ocean University of China, Qingdao, 266100, People's Republic of China
- Jinran Wu: Institute for Positive Psychology and Education, Australian Catholic University, Banyo, QLD, 4014, Australia
- Wei Lian: Department of Computer Science, Changzhi University, Changzhi, 046011, People's Republic of China
3. Wang H, Zhu J, Zhang S. Safe screening rules for multi-view support vector machines. Neural Netw 2023;166:326-343. doi: 10.1016/j.neunet.2023.07.021. PMID: 37541164.
Abstract
Multi-view learning aims to make use of the advantages of different views, which complement each other, and to fully mine the potential information in the data. However, the complexity of multi-view learning algorithms is much higher than that of single-view learning algorithms. Based on the optimality conditions of two classical multi-view models, SVM-2K and the multi-view twin support vector machine (MvTwSVM), this paper analyzes the correspondence between dual variables and samples and derives their safe screening rules for the first time, termed SSR-SVM-2K and SSR-MvTwSVM. These rules can assign or delete four groups of dual variables in advance, before solving the optimization problem, greatly reducing the scale of the optimization problem and improving the solution speed. More importantly, the screening criterion is "safe": the solution of the reduced optimization problem is identical to that of the original problem before screening. In addition, we give a sequential screening rule to speed up the parameter optimization process and analyze its properties, including the similarities and differences between safe screening rules for multi-view SVMs and single-view SVMs, the computational complexity, and the relationship between the parameter interval and the screening rate. Numerical experiments verify the effectiveness of the proposed methods.
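The paper's rules are derived for SVM-2K and MvTwSVM specifically; the general flavor of safe screening can be sketched for a plain bias-free hinge-loss SVM (the function name, ball radius `r`, and reference solution here are illustrative assumptions, not the paper's criteria). If the optimal weight vector is known to lie in a ball around a reference solution, any sample whose margin exceeds 1 for every `w` in that ball can have its dual variable fixed to 0 before solving:

```python
import numpy as np

def screen_non_support(X, y, w_ref, r):
    """Sketch of a safe screening test for a bias-free hinge-loss SVM.
    Assume the (unknown) optimal w satisfies ||w - w_ref|| <= r.
    By Cauchy-Schwarz, y_i <w, x_i> >= y_i <w_ref, x_i> - r ||x_i|| for
    every such w, so if this lower bound exceeds 1, the hinge is inactive
    at the optimum and the dual variable alpha_i is safely fixed to 0."""
    margin_lb = y * (X @ w_ref) - r * np.linalg.norm(X, axis=1)
    return margin_lb > 1.0  # True -> sample removable before solving

# Toy data: two points deep inside the margin region are screened out;
# the borderline third point must be kept in the optimization problem.
X = np.array([[4.0, 0.0], [0.0, 4.0], [0.8, 0.2]])
y = np.array([1.0, 1.0, 1.0])
mask = screen_non_support(X, y, w_ref=np.array([1.0, 1.0]), r=0.1)
# mask -> [True, True, False]
```

The "safe" guarantee is exactly that the mask only removes samples whose dual variables are provably at their bounds, so the reduced problem has the same solution as the original.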
Affiliation(s)
- Huiru Wang: Department of Mathematics, College of Science, Beijing Forestry University, No. 35 Qinghua East Road, 100083 Haidian, Beijing, China
- Jiayi Zhu: School of Computer Science and Engineering and Guangdong Province Key Laboratory of Computational Science, Sun Yat-Sen University, Guangzhou, Guangdong 510006, China
- Siyuan Zhang: College of Information and Electrical Engineering, China Agricultural University, No. 17 Qinghua East Road, 100083 Haidian, Beijing, China
4. Mohan NJ, Murugan R, Goel T, Tanveer M, Roy P. An efficient microaneurysms detection approach in retinal fundus images. Int J Mach Learn Cybern 2023. doi: 10.1007/s13042-022-01696-3.
5. Thanigaivelu PS, Sridhar SS, Sulthana SF. OISVM: Optimal incremental support vector machine-based EEG classification for brain-computer interface model. Cognit Comput 2023. doi: 10.1007/s12559-023-10120-z.
6. Capped asymmetric elastic net support vector machine for robust binary classification. Int J Intell Syst 2023. doi: 10.1155/2023/2201330.
Abstract
Recently, a large body of literature has improved the robustness of SVMs by constructing nonconvex loss functions, but these works seldom study the robustness of the constructed functions theoretically. In this paper, building on our recent work, we present a novel capped asymmetric elastic net (CaEN) loss and equip the SVM with it, yielding CaENSVM. We derive the influence function of the CaENSVM estimators and show that it is bounded, which theoretically explains the robustness of the proposed method; these results can be easily extended to other similar nonconvex loss functions. Further theoretical analysis demonstrates that the CaENSVM satisfies the Bayes rule, and the corresponding generalization error bound based on Rademacher complexity guarantees its good generalization capability. Since the CaEN loss is nonconvex, we implement an efficient DC (difference of convex functions) procedure based on the stochastic gradient descent algorithm (Pegasos) to solve the optimization problem. A host of experiments are conducted to verify the effectiveness of our proposed CaENSVM model.
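The abstract does not reproduce the CaEN formula, but the bounded-influence mechanism behind capped losses can be illustrated with a generic capped hinge loss (the cap value and function names here are hypothetical, not the paper's definition): once the loss is capped, its gradient vanishes for badly misclassified points, so gross outliers stop pulling on the estimator.

```python
import numpy as np

def hinge(u):
    """Standard hinge loss on the margin u = y * f(x)."""
    return np.maximum(0.0, 1.0 - u)

def capped(u, cap=2.0):
    """Capped loss: identical to the hinge near the decision boundary,
    constant for badly misclassified points. A constant loss contributes
    zero gradient, which is what makes the influence function bounded."""
    return np.minimum(hinge(u), cap)

def grad_capped(u, cap=2.0, h=1e-6):
    """Central-difference gradient of the capped loss w.r.t. the margin."""
    return (capped(u + h, cap) - capped(u - h, cap)) / (2.0 * h)

# A gross outlier (margin -100) contributes zero gradient under the cap,
# while a point near the boundary keeps the full hinge gradient of -1.
```

The DC decomposition the paper refers to follows the same pattern: a capped loss can be written as a difference of two convex functions, e.g. `min(hinge, cap) = hinge - max(hinge - cap, 0)`.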
7. Incremental learning for Lagrangian ε-twin support vector regression. Soft Comput 2023. doi: 10.1007/s00500-022-07755-9.
8. Non-parallel bounded support matrix machine and its application in roller bearing fault diagnosis. Inf Sci 2023. doi: 10.1016/j.ins.2022.12.090.
9. Online learning approach based on recursive formulation for twin support vector machine and sparse pinball twin support vector machine. Neural Process Lett 2022. doi: 10.1007/s11063-022-11084-1.
10. EEG signal classification using improved intuitionistic fuzzy twin support vector machines. Neural Comput Appl 2022. doi: 10.1007/s00521-022-07655-x.
11. A modified Stein variational inference algorithm with Bayesian and gradient descent techniques. Symmetry 2022. doi: 10.3390/sym14061188.
Abstract
This paper introduces a novel variational inference (VI) method combining Bayesian and gradient descent techniques. To facilitate the approximation of the posterior distributions of model parameters, the Stein method has been used in Bayesian variational inference algorithms in recent years. Unfortunately, previous methods fail to explicitly describe the influence of the particles' history (denoted Q(x) in this paper) on the approximation, even though this is important information in the search for particles. In our method, Q(x) is considered in the design of the operator Bp, which may also increase the chance of escaping local optima, especially for complex distributions. To address these issues, a modified Stein variational inference algorithm is proposed, which makes the gradient descent on the Kullback–Leibler (KL) divergence more stochastic. In our method, a group of particles approximates the target distribution by minimizing the KL divergence, which changes according to a newly defined kernelized Stein discrepancy. The usefulness of the suggested technique is demonstrated on four datasets, with Bayesian logistic regression considered for classification. Statistical measures such as parameter estimates, classification accuracy, F1, and NRMSE are used to validate the algorithm's performance.
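For context, the baseline this paper modifies is Stein variational gradient descent (SVGD), which moves a set of particles along a kernelized gradient of the KL divergence. A minimal 1D sketch of the standard SVGD update (not the modified operator Bp proposed here), targeting a standard normal:

```python
import numpy as np

def svgd_step(x, grad_logp, h=1.0, eps=0.1):
    """One SVGD update: each particle follows the kernel-weighted average
    of the score at all particles (attraction toward high density) plus
    the kernel gradient (repulsion that keeps particles spread out)."""
    diff = x[:, None] - x[None, :]      # diff[j, i] = x_j - x_i
    k = np.exp(-diff**2 / h)            # RBF kernel k(x_j, x_i)
    dk = -2.0 * diff / h * k            # gradient of k w.r.t. x_j
    phi = (k * grad_logp(x)[:, None] + dk).mean(axis=0)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=50)   # start far from the target
for _ in range(500):
    x = svgd_step(x, lambda t: -t)            # target N(0, 1): score is -t
# particles drift toward 0 while the repulsive term keeps them dispersed
```

With a single particle the repulsion term vanishes and the update reduces to plain gradient ascent on log p, which is why a population of particles is essential for capturing posterior uncertainty.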
12. Ramp loss KNN-weighted multi-class twin support vector machine. Soft Comput 2022. doi: 10.1007/s00500-022-07040-9.
13. Multiple birth support vector machine based on dynamic quantum particle swarm optimization algorithm. Neurocomputing 2022. doi: 10.1016/j.neucom.2022.01.012.
14. A novel twin support vector machine with generalized pinball loss function for pattern classification. Symmetry 2022. doi: 10.3390/sym14020289.
Abstract
We introduce a novel twin support vector machine with a generalized pinball loss function (GPin-TSVM) for data classification problems; the resulting classifier is less sensitive to noise and preserves the sparsity of the solution. In addition, we use a symmetric kernel trick to extend GPin-TSVM to nonlinear classification problems. The developed approach is tested on numerous UCI benchmark datasets as well as synthetic datasets. The comparisons demonstrate that our proposed algorithm outperforms existing classifiers in terms of accuracy. Furthermore, the approach is examined in handwritten digit recognition applications, with a convolutional neural network employed as the automatic feature extractor.
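One generalized pinball loss from the literature combines asymmetric pinball slopes with an ε-insensitive zone; the exact parameterization used by GPin-TSVM may differ, so the following is only an illustrative sketch. The flat zone around zero is what restores sparsity, while the asymmetric slopes give noise insensitivity:

```python
import numpy as np

def generalized_pinball(u, tau1, tau2, eps1, eps2):
    """Generalized pinball loss on u (e.g. a margin violation):
    zero inside the insensitive zone [-eps2/tau2, eps1/tau1], linear with
    slope tau1 to the right of it and slope tau2 to the left. Expressed
    as a max of affine pieces since the loss is convex."""
    return np.maximum.reduce([
        tau1 * u - eps1,        # active when u > eps1 / tau1
        -tau2 * u - eps2,       # active when u < -eps2 / tau2
        np.zeros_like(u),       # insensitive (sparsity-inducing) zone
    ])

u = np.array([-1.0, 0.0, 0.05, 1.0])
vals = generalized_pinball(u, tau1=0.7, tau2=0.3, eps1=0.07, eps2=0.03)
# vals ≈ [0.27, 0.0, 0.0, 0.63]
```

Setting `eps1 = eps2 = 0` recovers the ordinary asymmetric pinball loss, and `tau1 = tau2 = 1` with `eps1 = eps2 = ε` recovers the ε-insensitive loss, so this single family interpolates between the two behaviors.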
15. Sharma R, Goel T, Tanveer M, Murugan R. FDN-ADNet: Fuzzy LS-TWSVM based deep learning network for prognosis of the Alzheimer's disease using the sagittal plane of MRI scans. Appl Soft Comput 2022. doi: 10.1016/j.asoc.2021.108099.
16. Liang Z, Zhang L. Intuitionistic fuzzy twin support vector machines with the insensitive pinball loss. Appl Soft Comput 2022. doi: 10.1016/j.asoc.2021.108231.
17. Reductive and effective discriminative information-based nonparallel support vector machine. Appl Intell 2021. doi: 10.1007/s10489-021-02874-6.
20. Ye Y, Shao Y, Li C, Hua X, Guo Y. Online support vector quantile regression for the dynamic time series with heavy-tailed noise. Appl Soft Comput 2021. doi: 10.1016/j.asoc.2021.107560.
22.
Abstract
The twin support vector machine improves the classification performance of the support vector machine by solving two small quadratic programming problems. However, this method has the following defects: (1) the twin support vector machine and some of its variants use a hinge loss function, which is sensitive to noise and unstable under resampling; (2) the models need to be converted from the original space to the dual space, and their time complexity is high. To further enhance the performance of the twin support vector machine, the pinball loss function is introduced into the twin bounded support vector machine, and the non-differentiability of the pinball loss function at zero is handled by constructing a smooth approximation function. Based on this, a smooth twin bounded support vector machine model with pinball loss is obtained. The model is solved iteratively in the original space using the Newton-Armijo method. A smooth twin bounded support vector machine algorithm with pinball loss is proposed, and the convergence of the iterative algorithm is proven theoretically. In the experiments, the proposed algorithm is validated on UCI datasets and artificial datasets, and its performance is compared with those of other representative algorithms, demonstrating the effectiveness of the proposed algorithm.
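The smoothing idea can be sketched on the plain pinball loss with a softplus approximation (an illustrative choice; the paper's specific smooth approximation function may differ): the smoothed loss is differentiable everywhere, upper-bounds the pinball loss, and the gap shrinks to zero as the smoothing parameter mu goes to 0, which is what makes Newton-type methods applicable in the original space.

```python
import numpy as np

def pinball(u, tau):
    """Pinball loss max(tau*u, (tau-1)*u): convex but kinked at u = 0."""
    return np.maximum(tau * u, (tau - 1.0) * u)

def smooth_pinball(u, tau, mu=0.05):
    """Softplus smoothing: (tau-1)*u + mu*log(1 + exp(u/mu)).
    Everywhere differentiable; upper-bounds the pinball loss with a gap
    of at most mu*log(2), attained at u = 0, so it converges to the
    pinball loss as mu -> 0."""
    return (tau - 1.0) * u + mu * np.log1p(np.exp(u / mu))

u = np.linspace(-2.0, 2.0, 401)
gap = smooth_pinball(u, tau=0.3, mu=0.05) - pinball(u, tau=0.3)
# gap is non-negative everywhere and peaks at mu*log(2) ≈ 0.0347 at u = 0
```

The same construction applied inside the twin bounded SVM objective yields a twice-differentiable problem that a Newton-Armijo iteration can solve directly, avoiding the dual conversion mentioned in defect (2).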
23. Tanveer M, Ganaie M, Suganthan P. Ensemble of classification models with weighted functional link network. Appl Soft Comput 2021. doi: 10.1016/j.asoc.2021.107322.
24. Ma J, Yang L, Sun Q. Adaptive robust learning framework for twin support vector machine classification. Knowl Based Syst 2021. doi: 10.1016/j.knosys.2020.106536.
25. Liu MZ, Shao YH, Li CN, Chen WJ. Smooth pinball loss nonparallel support vector machine for robust classification. Appl Soft Comput 2021. doi: 10.1016/j.asoc.2020.106840.
26. Ganaie M, Tanveer M. LSTSVM classifier with enhanced features from pre-trained functional link network. Appl Soft Comput 2020. doi: 10.1016/j.asoc.2020.106305.
27. Chen WJ, Shao YH, Li CN, Wang YQ, Liu MZ, Wang Z. NPrSVM: Nonparallel sparse projection support vector machine with efficient algorithm. Appl Soft Comput 2020. doi: 10.1016/j.asoc.2020.106142.