1. Discrete space reinforcement learning algorithm based on twin support vector machine classification. Pattern Recognit Lett 2022. DOI: 10.1016/j.patrec.2022.11.017
2. A Novel Twin Support Vector Machine with Generalized Pinball Loss Function for Pattern Classification. Symmetry (Basel) 2022. DOI: 10.3390/sym14020289
Abstract: We introduce a novel twin support vector machine with a generalized pinball loss function (GPin-TSVM) for solving data classification problems; the model is less sensitive to noise and preserves the sparsity of the solution. In addition, we use a symmetric kernel trick to extend GPin-TSVM to nonlinear classification problems. The developed approach is tested on numerous UCI benchmark datasets as well as synthetic datasets. The comparisons demonstrate that the proposed algorithm outperforms existing classifiers in terms of accuracy. Furthermore, the approach is examined in handwritten digit recognition applications, using a convolutional neural network as an automatic feature extractor.
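The generalized pinball loss named in the abstract above is a piecewise-linear loss with an insensitive zone. A minimal sketch, assuming one common parameterization from the pinball-loss literature (the names tau1, tau2, eps1, eps2 are illustrative and may differ from the paper's exact definition):

```python
import numpy as np

def generalized_pinball(u, tau1=0.5, tau2=0.5, eps1=0.1, eps2=0.1):
    """Piecewise-linear loss on the margin variable u.

    Slope tau1 above the insensitive zone, slope tau2 below it, and
    exactly zero inside [-eps2/tau2, eps1/tau1] -- the zero zone is
    what preserves sparsity of the solution.
    """
    u = np.asarray(u, dtype=float)
    return np.maximum.reduce([
        tau1 * (u - eps1 / tau1),   # penalize large positive u
        np.zeros_like(u),           # insensitive zone
        -tau2 * (u + eps2 / tau2),  # penalize large negative u
    ])
```

With eps1 = eps2 = 0 this reduces to the ordinary two-sided pinball loss, and with tau2 = 0 and eps1 = 0 it recovers a hinge-type loss up to scaling.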
4. Abstract: The twin support vector machine improves the classification performance of the support vector machine by solving two small quadratic programming problems. However, this method has the following defects: (1) for the twin support vector machine and some of its variants, the constructed models use a hinge loss function, which is sensitive to noise and unstable under resampling; (2) the models must be converted from the original space to the dual space, and their time complexity is high. To further enhance the performance of the twin support vector machine, the pinball loss function is introduced into the twin bounded support vector machine, and the non-differentiability of the pinball loss function at zero is resolved by constructing a smooth approximation function. On this basis, a smooth twin bounded support vector machine model with pinball loss is obtained. The model is solved iteratively in the original space using the Newton-Armijo method, a smooth twin bounded support vector machine algorithm with pinball loss is proposed, and the convergence of the iterative algorithm is proven theoretically. In the experiments, the proposed algorithm is validated on UCI datasets and artificial datasets, and its performance is compared with that of other representative algorithms, demonstrating its effectiveness.
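The smoothing idea in the abstract above — replacing the non-differentiable pinball loss with a smooth surrogate so a Newton-type method can be applied in the primal — can be sketched as follows. This sketch uses the standard Lee–Mangasarian smooth approximation of the plus function; the paper's actual smoothing function may differ:

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    # Smooth approximation of max(x, 0): differentiable everywhere and
    # converging to the plus function as alpha grows.
    return x + np.log1p(np.exp(-alpha * x)) / alpha

def smooth_pinball(u, tau=0.5, alpha=5.0):
    # Pinball loss max(u, -tau*u) rewritten as -tau*u + (1+tau)*max(u, 0),
    # then smoothed via smooth_plus so it is differentiable at u = 0.
    return -tau * u + (1.0 + tau) * smooth_plus(u, alpha)
```

The surrogate slightly overestimates the true loss near zero; increasing alpha tightens the approximation at the cost of worse conditioning in the Newton step.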
6. Xu W, Huang D, Zhou S. Universal consistency of twin support vector machines. Int J Mach Learn Cyb 2021. DOI: 10.1007/s13042-021-01281-0
Abstract: A classification problem aims at constructing a best classifier with the smallest risk. When the sample size approaches infinity, learning algorithms for a classification problem are characterized by an asymptotic property, universal consistency, which plays a crucial role in assessing the construction of classification rules. A universally consistent algorithm ensures that the larger the sample size, the more accurately the distribution of the samples can be reconstructed. Support vector machines (SVMs) are regarded as among the most important models for binary classification problems. How to effectively extend SVMs to twin support vector machines (TWSVMs) so as to improve classification performance has recently gained increasing interest in many research areas, and many variants of TWSVMs have been proposed and used in practice. In this paper, we therefore focus on the universal consistency of TWSVMs in a binary classification setting. We first give a general framework for TWSVM classifiers that unifies most variants of TWSVMs for binary classification problems. Based on it, we then investigate the universal consistency of TWSVMs, giving definitions of risk, Bayes risk, and universal consistency for TWSVMs. Theoretical results indicate that universal consistency holds for various TWSVM classifiers under certain conditions, including covering number, localized covering number, and stability. As applications of the general framework, several variants of TWSVMs are considered.
9. Wen Y, Ma J, Yuan C, Yang L. Projection multi-birth support vector machine for multi-classification. Appl Intell 2020. DOI: 10.1007/s10489-020-01699-z
10. Adaptively weighted learning for twin support vector machines via Bregman divergences. Neural Comput Appl 2020. DOI: 10.1007/s00521-018-3843-0
20. Improvements on twin-hypersphere support vector machine using local density information. Progress in Artificial Intelligence 2018. DOI: 10.1007/s13748-018-0141-0
23. Peng X, Shen J. A twin-hyperspheres support vector machine with automatic variable weights for data classification. Inf Sci (N Y) 2017. DOI: 10.1016/j.ins.2017.07.007
27. Cevikalp H. Best Fitting Hyperplanes for Classification. IEEE Trans Pattern Anal Mach Intell 2017; 39:1076-1088. PMID: 27392344. DOI: 10.1109/tpami.2016.2587647
Abstract: In this paper, we propose novel methods that are more suitable than classical large-margin classifiers for open set recognition and object detection tasks. The proposed methods use the best-fitting-hyperplanes approach: the main idea is to find hyperplanes such that each is close to the samples of one of the classes and as far as possible from the samples of the other class. To this end, we propose two different classifiers: the first solves a convex quadratic optimization problem, but negative samples can lie on only one side of the best fitting hyperplane; the second allows the negative samples to lie on both sides of the fitting hyperplane by using the concave-convex procedure. Both methods are extended to the nonlinear case by using the kernel trick. In contrast to existing hyperplane-fitting classifiers in the literature, the proposed methods are suitable for large-scale problems, and they return sparse solutions. Experiments on several databases show that the proposed methods typically outperform other hyperplane-fitting classifiers and perform as well as the SVM classifier in classical recognition tasks, while significantly outperforming SVM in open set recognition and object detection tasks.
28. Gupta D. Training primal K-nearest neighbor based weighted twin support vector regression via unconstrained convex minimization. Appl Intell 2017. DOI: 10.1007/s10489-017-0913-4
31. Zhao X, Bai Q, Bai S. Simple nonparallel Laplacian SVM for semi-supervised learning on binary classification problem. Intell Data Anal 2016. DOI: 10.3233/ida-150236
Affiliations: Xi Zhao - Post-Doctoral Research Station of China Construction Bank, Beijing; Post-Doctoral Research Station of Tsinghua University, Beijing; University of Chinese Academy of Sciences, Beijing, China. Qiyu Bai - Peking University, Beijing, China. Shiguo Bai - Langfang Teachers University, Langfang, Hebei, China.
32. Zhu GY, Yang CG, Zhang P. Linear programming ν-nonparallel support vector machine and its application in vehicle recognition. Neurocomputing 2016. DOI: 10.1016/j.neucom.2015.07.159
34. Balasundaram S, Gupta D, Prasad SC. A new approach for training Lagrangian twin support vector machine via unconstrained convex minimization. Appl Intell 2016. DOI: 10.1007/s10489-016-0809-8
36. Ding S, Zhang N, Zhang X, Wu F. Twin support vector machine: theory, algorithm and applications. Neural Comput Appl 2016. DOI: 10.1007/s00521-016-2245-4
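For orientation, the linear TWSVM surveyed in the entry above determines two nonparallel hyperplanes by solving a pair of small QPPs. In the standard formulation (notation follows common usage, not necessarily this survey's: A and B are the matrices of positive- and negative-class samples, e_1 and e_2 are vectors of ones, and c_1 > 0 is a trade-off parameter), the first problem is

```latex
\min_{w_1, b_1, \xi} \; \tfrac{1}{2}\,\lVert A w_1 + e_1 b_1 \rVert^2 + c_1\, e_2^{\top} \xi
\quad \text{s.t.} \quad -(B w_1 + e_2 b_1) + \xi \ge e_2, \; \xi \ge 0,
```

with a symmetric problem for (w_2, b_2) in which the roles of A and B are exchanged; a new point is assigned to the class whose hyperplane it lies nearer.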
37. Chen S, Wu X, Zhang R. A Novel Twin Support Vector Machine for Binary Classification Problems. Neural Process Lett 2016. DOI: 10.1007/s11063-016-9495-0
38. Abstract: The traditional vector-based classifiers, such as the support vector machine (SVM) and the twin support vector machine (TSVM), cannot handle tensor data directly and may not use the data information effectively. In this paper, we propose a novel classifier for tensor data, called the twin bounded support tensor machine (TBSTM), which is an extension of the twin bounded support vector machine (TBSVM). Similar to TBSVM, TBSTM obtains two hyperplanes by solving two quadratic programming problems (QPPs), and the computational complexity of each QPP is smaller than that of the support tensor machine (STM). TBSTM not only retains the advantages of TBSVM but also has unique characteristics: (1) it makes full use of the structural information of the data; (2) it has acceptable or better classification accuracy compared to STM, TBSVM and SVM; (3) its computational cost is generally less than that of STM; (4) it can deal with large data that TBSVM does not handle easily, especially small-sample-size (S3) problems; (5) it adopts the alternating successive over-relaxation (ASOR) iteration method to solve the optimization problems, which accelerates training. Finally, we demonstrate its effectiveness and superiority through experiments on vector and tensor data.
Affiliations: Haifa Shi, Xinbin Zhao, Ling Zhen, Ling Jing - College of Science, China Agricultural University, Qinghuadonglu No. 17, Beijing 100083, P. R. China.
40. Balasundaram S, Meena Y. Training primal twin support vector regression via unconstrained convex minimization. Appl Intell 2015. DOI: 10.1007/s10489-015-0731-5
41. Tanveer M, Shubham K, Aldhaifallah M, Nisar KS. An efficient implicit regularized Lagrangian twin support vector regression. Appl Intell 2015. DOI: 10.1007/s10489-015-0728-0
42. Xu H, Fan L, Gao X. TBSTM: A Novel and Fast Nonlinear Classification Method for Image Data. Int J Pattern Recogn 2015. DOI: 10.1142/s021800141551012x
Abstract: A new classifier for image data classification, named the linear twin bounded support tensor machine (linear TBSTM), is proposed by adding regularization terms to the objective functions, which realizes structural risk minimization and avoids the singularity of matrices. To date, few nonlinear classifiers based on STM have been developed for image data classification. To remedy this limitation, a new matrix kernel function is introduced, on which the nonlinear version of TBSTM is built with a detailed theoretical derivation, yielding a classifier called nonlinear TBSTM. To examine the effectiveness of the proposed classifiers, a series of comparative experiments with three linear classifiers (STM, TSTM and PSTM) is performed on 15 binary image classification problems taken from the ORL, YALE and AR datasets. The experimental results show that the proposed classifiers are effective and efficient.
Affiliations: Haitao Xu, Liya Fan, Xizhan Gao - School of Mathematics Sciences, Liaocheng University, Liaocheng 252059, P. R. China.
43. Multi-class LSTMSVM based on optimal directed acyclic graph and shuffled frog leaping algorithm. Int J Mach Learn Cyb 2015. DOI: 10.1007/s13042-015-0435-5
44. Khemchandani R, Saigal P. Color image classification and retrieval through ternary decision structure based multi-category TWSVM. Neurocomputing 2015. DOI: 10.1016/j.neucom.2015.03.074
47. A Learning Framework of Nonparallel Hyperplanes Classifier. ScientificWorldJournal 2015; 2015:497617. PMID: 26167527. PMCID: PMC4488010. DOI: 10.1155/2015/497617
Abstract: A novel learning framework of nonparallel hyperplane support vector machines (NPSVMs) is proposed for binary and multiclass classification. This framework not only includes the twin SVM (TWSVM) and many of its variants but also extends them to multiclass classification problems when different parameters or loss functions are chosen. Concretely, we discuss the linear and nonlinear cases of the framework, taking the hinge loss function as an example, and also give the primal problems of several extensions of TWSVM's variants. It is worth mentioning that, in the decision function, the Euclidean distance is replaced by the absolute value |w^T x + b|, which keeps the decision function consistent with the optimization problem and reduces the computational cost, particularly when a kernel function is introduced. Numerical experiments on several artificial and benchmark datasets indicate that the framework is not only fast but also generalizes well.
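The decision-function detail in the abstract above — using the absolute value |w^T x + b| in place of the Euclidean distance |w^T x + b| / ||w|| — can be sketched as follows (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def nonparallel_predict(X, planes):
    # planes: one (w, b) pair per class, each fitted so that its
    # hyperplane passes close to that class's samples.  A sample is
    # assigned to the class whose hyperplane value |w^T x + b| is
    # smallest; dropping the division by ||w|| keeps the decision rule
    # consistent with the optimization problem and saves the norm
    # computation, which matters once a kernel expansion makes w implicit.
    scores = np.stack([np.abs(X @ w + b) for w, b in planes], axis=1)
    return scores.argmin(axis=1)
```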
48. Tomar D, Agarwal S. A comparison on multi-class classification methods based on least squares twin support vector machine. Knowl Based Syst 2015. DOI: 10.1016/j.knosys.2015.02.009
50. Shao YH, Chen WJ, Wang Z, Li CN, Deng NY. Weighted linear loss twin support vector machine for large-scale classification. Knowl Based Syst 2015. DOI: 10.1016/j.knosys.2014.10.011