1. A Novel Twin Support Vector Machine with Generalized Pinball Loss Function for Pattern Classification. Symmetry (Basel) 2022. DOI: 10.3390/sym14020289
Abstract
We introduce a novel twin support vector machine with the generalized pinball loss function (GPin-TSVM) for solving data classification problems; the resulting classifier is less sensitive to noise and preserves the sparsity of the solution. In addition, we use a symmetric kernel trick to extend GPin-TSVM to nonlinear classification problems. The developed approach is tested on numerous UCI benchmark datasets as well as synthetic datasets. The comparisons demonstrate that our proposed algorithm outperforms existing classifiers in terms of accuracy. Furthermore, the approach is examined in handwritten digit recognition applications, where a convolutional neural network is employed as the automatic feature extractor.
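The central ingredient above is the generalized pinball loss, which penalizes the two sides of the decision boundary with different slopes and adds an insensitive zone, so the classifier tolerates noise while the solution stays sparse. The minimal sketch below uses one common parameterization from the generalized pinball loss literature; the exact form in the paper may differ, and the names tau1, tau2, eps1, eps2 are assumptions.

    import numpy as np

    def generalized_pinball_loss(u, tau1=0.5, tau2=0.5, eps1=0.05, eps2=0.05):
        # Generalized pinball loss with an insensitive zone (requires tau1, tau2 > 0).
        # Sketch of one common parameterization, not necessarily the paper's exact form.
        u = np.asarray(u, dtype=float)
        pos_side = tau1 * (u - eps1 / tau1)    # penalty once u exceeds the upper insensitive bound
        neg_side = tau2 * (-u - eps2 / tau2)   # penalty once u falls below the lower insensitive bound
        return np.maximum(np.maximum(pos_side, neg_side), 0.0)  # zero inside the insensitive zone

In this parameterization, setting tau1 = 1 and eps1 = eps2 = 0 recovers the standard pinball loss, which is the sense in which the loss generalizes the pinball family.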
3. Yuan C, Yang L. Capped L2,p-norm metric based robust least squares twin support vector machine for pattern classification. Neural Netw 2021; 142:457-478. PMID: 34273616. DOI: 10.1016/j.neunet.2021.06.028
Abstract
Least squares twin support vector machine (LSTSVM) is an effective and efficient learning algorithm for pattern classification. However, distances in LSTSVM are measured by the squared L2-norm metric, which may magnify the influence of outliers. In this paper, a novel robust least squares twin support vector machine framework, termed CL2,p-LSTSVM, is proposed for binary classification; it utilizes a capped L2,p-norm distance metric to reduce the influence of noise and outliers. The goal of CL2,p-LSTSVM is to minimize the capped L2,p-norm intra-class distance dispersion and to eliminate the influence of outliers during the training process, where the value of the metric is controlled by the capping parameter, which ensures better robustness. The proposed metric includes and extends the traditional metrics by setting appropriate values of p and the capping parameter. This strategy not only retains the advantages of LSTSVM but also improves robustness when solving binary classification problems with outliers. However, the nonconvexity of the metric makes it difficult to optimize, so we design an effective iterative algorithm to solve the CL2,p-LSTSVM, in which two systems of linear equations are solved in each iteration. We also present insightful analyses of the computational complexity and convergence of the algorithm, and we extend the CL2,p-LSTSVM to a nonlinear classifier and to semi-supervised classification. Experiments are conducted on artificial datasets, UCI benchmark datasets, and image datasets to evaluate our method. Under different noise settings and different evaluation criteria, the experimental results show that the CL2,p-LSTSVM is more robust than state-of-the-art approaches in most cases, which demonstrates the feasibility and effectiveness of the proposed method.
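The robustness mechanism described above comes from capping each sample's contribution to the distance term. A minimal sketch of such a capped L2,p-norm dispersion follows; the function and parameter names are illustrative assumptions, and the paper's full objective contains further terms.

    import numpy as np

    def capped_l2p_dispersion(residuals, p=1.0, cap=1.0):
        # Capped L2,p-norm dispersion: sum_i min(||r_i||_2 ** p, cap).
        # Each sample contributes at most `cap`, which limits the pull of outliers.
        r = np.atleast_2d(np.asarray(residuals, dtype=float))
        row_norms = np.linalg.norm(r, axis=1)             # ||r_i||_2 per sample
        return float(np.sum(np.minimum(row_norms ** p, cap)))

With p = 2 and a very large cap, this falls back to the squared L2-norm dispersion of standard LSTSVM, which is the sense in which the capped metric includes and extends the traditional metrics.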
Affiliation(s)
- Chao Yuan
- College of Information and Electrical Engineering, China Agricultural University, Beijing, Haidian, 100083, China
- Liming Yang
- College of Science, China Agricultural University, Beijing, Haidian, 100083, China.
6. Jayadeva, Pant H, Sharma M, Soman S. Twin Neural Networks for the classification of large unbalanced datasets. Neurocomputing 2019. DOI: 10.1016/j.neucom.2018.07.089
8. Hsu MF, Huang CI. Decision Support System for Management Decision in High-Risk Business Environment. Journal of Testing and Evaluation 2018; 46:2240-2250. DOI: 10.1520/jte20170252
Abstract
As a result of substantial variations in global financial markets, constructing an enterprise risk pre-warning mechanism is essential. A vast number of related studies have relied on monetary indicators to depict the full spectrum of an enterprise's operating performance, but monetary indicators alone cannot produce an in-depth understanding of an enterprise. To fill this gap, the balanced scorecard (BSC), which has the advantage of capturing both monetary and nonmonetary indicators, was introduced. Unfortunately, the BSC has its own challenges, one of which is the lack of consideration given to risk exposure, which affects an enterprise's profit variation. Thus, this study extends the original BSC by considering risk exposure and introduces an artificial intelligence-based decision support system for management decisions. The decision logic embedded in neural network-based mechanisms is opaque and hard for users to comprehend. To handle this challenge, the study further incorporates fit theory with a knowledge visualization technique to address the opaque nature of the model and thereby reduce cognitive load and mental burden. The empirical results show that the introduced model is a promising alternative for management decisions in highly fluctuating financial markets.
Affiliation(s)
- Ming-Fu Hsu
- English Program of Global Business, Chinese Culture University, 55 Hwa-Kang Rd., Yang-Ming-Shan, Taipei 11114, Taiwan (Corresponding author). ORCID: https://orcid.org/0000-0003-4843-5313
- Chung-I Huang
- Department of Technology Application and Human Resource Development, National Taiwan Normal University, 162, Section 1, Heping E. Rd., Taipei City 106, Taiwan
11. Ding S, Zhang N, Zhang X, Wu F. Twin support vector machine: theory, algorithm and applications. Neural Comput Appl 2016. DOI: 10.1007/s00521-016-2245-4
13. A Learning Framework of Nonparallel Hyperplanes Classifier. ScientificWorldJournal 2015; 2015:497617. PMID: 26167527. PMCID: PMC4488010. DOI: 10.1155/2015/497617
Abstract
A novel learning framework of nonparallel hyperplanes support vector machines (NPSVMs) is proposed for binary and multiclass classification. This framework not only includes twin SVM (TWSVM) and many of its variants but also extends them to the multiclass classification problem when different parameters or loss functions are chosen. Concretely, we discuss the linear and nonlinear cases of the framework, selecting the hinge loss function as an example, and we also give the primal problems of several extensions of TWSVM's variants. It is worth mentioning that, in the decision function, the Euclidean distance is replaced by the absolute value |w^T x + b|, which keeps the decision function consistent with the optimization problem and reduces the computational cost, particularly when a kernel function is introduced. Numerical experiments on several artificial and benchmark datasets indicate that our framework is not only fast but also shows good generalization.
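To make the decision rule concrete, the sketch below assigns a sample to the class whose hyperplane yields the smaller |w^T x + b|; the function and variable names are illustrative, not taken from the paper.

    import numpy as np

    def predict_nonparallel(X, w_pos, b_pos, w_neg, b_neg):
        # Assign each sample to the class whose hyperplane gives the smaller |w^T x + b|.
        # Using the absolute value directly (instead of dividing by ||w|| for a true
        # Euclidean distance) keeps the decision rule consistent with the optimization
        # problem and avoids extra computation, as noted in the abstract.
        X = np.atleast_2d(np.asarray(X, dtype=float))
        d_pos = np.abs(X @ w_pos + b_pos)
        d_neg = np.abs(X @ w_neg + b_neg)
        return np.where(d_pos <= d_neg, 1, -1)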
14. Tomar D, Agarwal S. A comparison on multi-class classification methods based on least squares twin support vector machine. Knowl Based Syst 2015. DOI: 10.1016/j.knosys.2015.02.009
15. Soman S, Jayadeva. High performance EEG signal classification using classifiability and the Twin SVM. Appl Soft Comput 2015. DOI: 10.1016/j.asoc.2015.01.018
18. Tanveer M. Application of smoothing techniques for linear programming twin support vector machines. Knowl Inf Syst 2014. DOI: 10.1007/s10115-014-0786-3
19. Analyzing big data with the hybrid interval regression methods. ScientificWorldJournal 2014; 2014:243921. PMID: 25143968. PMCID: PMC4131111. DOI: 10.1155/2014/243921
Abstract
Big data is a major current trend, with significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale datasets that often require computation resources provided by public cloud services, so analyzing big data efficiently becomes a major challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been shown to be more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to adjust the excursion of the separation margin and to remain effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes is unclear.
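The efficiency of the SSVM mentioned above comes from replacing the non-smooth plus function (x)_+ = max(x, 0) in the SVM objective with a smooth approximation, which turns training into a smooth, unconstrained problem solvable by fast Newton-type methods. A minimal sketch of the standard SSVM smoothing function follows; the smoothing parameter value is illustrative.

    import numpy as np

    def smooth_plus(x, alpha=5.0):
        # Smooth approximation of the plus function used by the SSVM:
        #   p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
        # which approaches max(x, 0) from above as alpha grows.
        x = np.asarray(x, dtype=float)
        return x + np.logaddexp(0.0, -alpha * x) / alpha  # log(1 + exp(.)) computed stably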
20. Tian Y, Qi Z, Ju X, Shi Y, Liu X. Nonparallel support vector machines for pattern classification. IEEE Transactions on Cybernetics 2014; 44:1067-1079. PMID: 24013833. DOI: 10.1109/tcyb.2013.2279167
Abstract
We propose a novel nonparallel classifier, called the nonparallel support vector machine (NPSVM), for binary classification. Our NPSVM is fully different from existing nonparallel classifiers, such as the generalized eigenvalue proximal support vector machine (GEPSVM) and the twin support vector machine (TWSVM), and has several notable advantages: 1) the two primal problems are constructed to implement the structural risk minimization principle; 2) the dual problems of these two primal problems have the same advantages as those of standard SVMs, so the kernel trick can be applied directly, whereas existing TWSVMs have to construct another two primal problems for nonlinear cases based on approximate kernel-generated surfaces, and their nonlinear problems cannot degenerate to the linear case even when the linear kernel is used; 3) the dual problems have the same elegant formulation as those of standard SVMs and can be solved efficiently by the sequential minimal optimization (SMO) algorithm, whereas existing GEPSVMs and TWSVMs are not suitable for large-scale problems; 4) it has the same inherent sparseness as standard SVMs; 5) existing TWSVMs are special cases of the NPSVM when its parameters are appropriately chosen. Experimental results on a large number of datasets show the effectiveness of our method in both sparseness and classification accuracy, further confirming the above conclusions. In some sense, our NPSVM is a new starting point for nonparallel classifiers.
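The sparseness and direct kernelizability claimed above stem from how each primal problem treats the two classes: an epsilon-insensitive term keeps the hyperplane near its own class, while a hinge term pushes the other class away. The sketch below evaluates one such objective under that assumed form; the names, parameters, and exact loss combination are reconstructions, not taken verbatim from the paper.

    import numpy as np

    def npsvm_primal_objective(w, b, X_own, X_other, C1=1.0, C2=1.0, eps=0.1):
        # Value of one NPSVM-style primal objective under the assumed form:
        #   0.5*||w||^2 + C1 * eps-insensitive loss on the hyperplane's own class
        #               + C2 * hinge loss pushing the other class beyond distance 1.
        # The eps-insensitive term is what yields SVM-like sparseness.
        w = np.asarray(w, dtype=float)
        own = X_own @ w + b
        other = X_other @ w + b
        eps_insensitive = np.maximum(np.abs(own) - eps, 0.0)  # tolerate |w^T x + b| <= eps
        hinge = np.maximum(1.0 + other, 0.0)                   # want w^T x + b <= -1 for the other class
        return 0.5 * float(w @ w) + C1 * eps_insensitive.sum() + C2 * hinge.sum()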
22. Lü Y, Yang H. A Multi-model Approach for Soft Sensor Development Based on Feature Extraction Using Weighted Kernel Fisher Criterion. Chin J Chem Eng 2014. DOI: 10.1016/s1004-9541(14)60007-0
23. Li HX, Yang JL, Zhang G, Fan B. Probabilistic support vector machines for classification of noise affected data. Inf Sci (N Y) 2013. DOI: 10.1016/j.ins.2012.09.041
25. Peng X, Xu D. Twin Mahalanobis distance-based support vector machines for pattern recognition. Inf Sci (N Y) 2012. DOI: 10.1016/j.ins.2012.02.047