1
Quadir A, Tanveer M. Multiview learning with twin parametric margin SVM. Neural Netw 2024; 180:106598. PMID: 39173204. DOI: 10.1016/j.neunet.2024.106598.
Abstract
Multiview learning (MVL) seeks to leverage the complementary benefits of diverse perspectives, effectively extracting and utilizing the latent information within the dataset. Several twin support vector machine-based MVL (MvTSVM) models have been introduced and have demonstrated outstanding performance in various learning tasks. However, MvTSVM-based models face significant challenges: computational complexity due to four matrix inversions, the need to reformulate the optimization problems in order to employ kernel-generated surfaces for non-linear cases, and the assumption of uniform noise in the training data. These challenges become even more pronounced when the data possesses a heteroscedastic error structure. In view of the aforementioned challenges, we propose the multiview twin parametric margin support vector machine (MvTPMSVM). MvTPMSVM constructs parametric margin hyperplanes corresponding to both classes, aiming to regulate and manage the impact of the heteroscedastic noise structure present in the data. The proposed MvTPMSVM model avoids explicit computation of matrix inversions in the dual formulation, leading to enhanced computational efficiency. We perform an extensive assessment of the MvTPMSVM model using UCI, KEEL, synthetic, and Animals with Attributes (AwA) benchmark datasets. Our experimental results, coupled with rigorous statistical analyses, confirm the superior generalization capability of the proposed MvTPMSVM model compared to the baseline models. The source code of the proposed MvTPMSVM model is available at https://github.com/mtanveer1/MvTPMSVM.
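The abstract does not give the optimization problem. A minimal single-view sketch of the parametric-margin idea, loosely following a TPMSVM-style primal (the constants, signs, and solver below are illustrative assumptions, not the authors' MvTPMSVM formulation), could look like:

```python
import cvxpy as cp
import numpy as np

def fit_parametric_margin(X_pos, X_neg, nu=0.5, c=1.0):
    """Fit one parametric-margin hyperplane f(x) = w.x + b for the positive class.

    Sketch of a TPMSVM-style primal: keep positive samples on or above the
    hyperplane (soft constraint) while pulling the hyperplane down on the
    negative samples, so the margin adapts to the local noise structure."""
    m1, d = X_pos.shape
    m2 = X_neg.shape[0]
    w = cp.Variable(d)
    b = cp.Variable()
    xi = cp.Variable(m1, nonneg=True)

    objective = cp.Minimize(
        0.5 * cp.sum_squares(w)
        + (nu / m2) * cp.sum(X_neg @ w + b)   # pull the margin toward the negatives
        + (c / m1) * cp.sum(xi)               # penalize positive-side violations
    )
    constraints = [X_pos @ w + b >= -xi]
    cp.Problem(objective, constraints).solve()
    return w.value, b.value

# toy usage with a heteroscedastic negative class
rng = np.random.default_rng(0)
Xp = rng.normal(1.0, 0.5, size=(40, 2))
Xn = rng.normal(-1.0, 1.5, size=(40, 2))
w_pos, b_pos = fit_parametric_margin(Xp, Xn)
```

The second parametric-margin hyperplane would be obtained by swapping the roles of the two classes; the multiview coupling between views is not shown here.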
Affiliation(s)
- A Quadir
- Department of Mathematics, Indian Institute of Technology Indore, Simrol, Indore, 453552, Madhya Pradesh, India
- M Tanveer
- Department of Mathematics, Indian Institute of Technology Indore, Simrol, Indore, 453552, Madhya Pradesh, India.
2
Kuang Z, Yan Z, Yu L. Weakly supervised learning for multi-class medical image segmentation via feature decomposition. Comput Biol Med 2024; 171:108228. PMID: 38422964. DOI: 10.1016/j.compbiomed.2024.108228.
Abstract
Weakly supervised learning with image-level labels, which frees deep learning from highly labor-intensive pixel-wise annotation, has gained great attention for medical image segmentation. However, existing weakly supervised methods are mainly designed for single-class segmentation, leaving multi-class medical image segmentation rarely explored. Unlike in natural images, label symbiosis and location adjacency are much more common in medical images, making multi-class segmentation more challenging. In this paper, we propose a novel weakly supervised learning method for multi-class medical image segmentation with image-level labels. For the multi-class classification backbone, a multi-level classification network encoding multi-scale features is proposed to produce binary predictions, together with the corresponding CAMs, for each class separately. To address the above issues (i.e., label symbiosis and location adjacency), a feature decomposition module based on semantic affinity is first proposed to learn both class-independent and class-dependent features by maximizing the inter-class feature distance. Through a cross-guidance loss that jointly utilizes these features, label symbiosis is largely alleviated. For location adjacency, a mutually exclusive loss is constructed to minimize the overlap among regions corresponding to different classes. Experimental results on three datasets demonstrate the superior performance of the proposed weakly supervised framework for both single-class and multi-class medical image segmentation. We believe the analysis in this paper will shed new light on future work in multi-class medical image segmentation. The source code of this paper is publicly available at https://github.com/HustAlexander/MCWSS.
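The exact form of the mutually exclusive loss is not given in the abstract. One plausible reading, penalizing pixels that are activated in the CAMs of two different classes at once, can be sketched in PyTorch as follows; the pairwise-product form is an assumption for illustration only:

```python
import torch

def mutually_exclusive_loss(cams: torch.Tensor) -> torch.Tensor:
    """cams: (B, C, H, W) class activation maps, assumed to lie in [0, 1]
    (e.g. after a sigmoid). Penalizes simultaneous activation of two
    different classes at the same pixel, encouraging non-overlapping regions."""
    b, c, h, w = cams.shape
    loss = cams.new_zeros(())
    for i in range(c):
        for j in range(i + 1, c):
            loss = loss + (cams[:, i] * cams[:, j]).mean()
    num_pairs = c * (c - 1) / 2
    return loss / max(num_pairs, 1)

# toy usage
cams = torch.sigmoid(torch.randn(2, 4, 64, 64))
print(mutually_exclusive_loss(cams))
```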
Affiliation(s)
- Zhuo Kuang
- School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan, 430074, China.
- Zengqiang Yan
- School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan, 430074, China.
- Li Yu
- School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan, 430074, China.
3
Rezvani S, Wu J. Handling Multi-Class Problem by Intuitionistic Fuzzy Twin Support Vector Machines Based on Relative Density Information. IEEE Trans Pattern Anal Mach Intell 2023; 45:14653-14664. PMID: 37651498. DOI: 10.1109/tpami.2023.3310908.
Abstract
The intuitionistic fuzzy twin support vector machine (IFTSVM) merges the idea of the intuitionistic fuzzy set (IFS) with the twin support vector machine (TSVM), which can reduce the negative impact of noise and outliers. However, this technique is not suitable for multi-class problems or high-dimensional feature spaces. Furthermore, the computational complexity of IFTSVM is high because it uses the membership and non-membership functions to build a score function. We propose a new version of IFTSVM based on relative density information. This idea approximates the probability density distribution in multi-dimensional continuous space by computing the K-nearest-neighbor distance of each training sample. All training points are then evaluated with a one-versus-one-versus-rest strategy to construct the k-class classification hyperplanes. A coordinate descent scheme is used to reduce the computational complexity of training. The bootstrap technique with a 95% confidence interval and the Friedman test are used to quantify the significance of the performance improvements observed in the numerical evaluations. Experiments on 24 benchmark datasets demonstrate that the proposed method produces promising results compared with other support vector machine models reported in the literature.
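A minimal sketch of the relative-density idea, approximating local density from each training sample's K-nearest-neighbor distance and turning it into a per-sample weight (the exact weighting function used in the paper is not given, so the form below is an assumption):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def relative_density_weights(X: np.ndarray, k: int = 5) -> np.ndarray:
    """Approximate the local density of each sample from its distance to its
    k-th nearest neighbor: a small k-NN distance means a dense region and a
    large weight. Weights are normalized to (0, 1] relative to the densest sample."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)  # +1: each point is its own neighbor
    dist, _ = nn.kneighbors(X)
    knn_dist = dist[:, -1]                           # distance to the k-th true neighbor
    density = 1.0 / (knn_dist + 1e-12)
    return density / density.max()

# toy usage: an outlier in a sparse region gets a small weight
X = np.vstack([np.random.randn(50, 2), np.array([[8.0, 8.0]])])
w = relative_density_weights(X, k=5)
print(w[-1], w[:5])
```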
4
Moosaei H, Hladík M. Sparse solution of least-squares twin multi-class support vector machine using ℓ0- and ℓp-norm for classification and feature selection. Neural Netw 2023; 166:471-486. PMID: 37574621. DOI: 10.1016/j.neunet.2023.07.039.
Abstract
In the realm of multi-class classification, the twin K-class support vector classification (Twin-KSVC) generates ternary outputs {-1, 0, +1} by evaluating all training data in a "1-versus-1-versus-rest" structure. Recently, inspired by Twin-KSVC and its least-squares version, a new multi-class classifier called improvements on least-squares twin multi-class classification support vector machine (ILSTKSVC) has been proposed. In this method, structural risk minimization is achieved by incorporating a regularization term in addition to the minimization of empirical risk. Twin-KSVC and its improvements influence classification accuracy. Another factor influencing classification accuracy is feature selection, which is a critical stage in machine learning, especially when working with high-dimensional datasets. However, most prior studies have not addressed this crucial aspect. In this study, motivated by ILSTKSVC and the cardinality-constrained optimization problem, we propose the ℓp-norm least-squares twin multi-class support vector machine (PLSTKSVC) with 0 < p < 1, which performs classification and feature selection at the same time.
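The abstract stops short of the algorithmic details. A common way to handle a non-convex ℓp penalty with 0 < p < 1 (not necessarily the authors' solver) is iteratively reweighted least squares, sketched below for a plain regularized least-squares problem to show how the penalty induces sparsity and hence feature selection:

```python
import numpy as np

def lp_regularized_lstsq(A, y, lam=0.1, p=0.5, eps=1e-6, n_iter=30):
    """Approximately solve  min_x ||A x - y||^2 + lam * ||x||_p^p  (0 < p < 1)
    by iteratively reweighted least squares: each iteration solves a ridge
    problem whose per-coefficient weights shrink small entries toward zero,
    which is what produces the sparse (feature-selecting) solution."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(n_iter):
        w = (p / 2) * (x**2 + eps) ** (p / 2 - 1)   # IRLS weights for the lp penalty
        H = A.T @ A + lam * np.diag(w)
        x = np.linalg.solve(H, A.T @ y)
    return x

# toy usage: only the first two features are informative
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 10))
y = A[:, 0] - 2 * A[:, 1] + 0.01 * rng.normal(size=100)
print(np.round(lp_regularized_lstsq(A, y), 3))
```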
Affiliation(s)
- Hossein Moosaei
- Department of Informatics, Faculty of Science, Jan Evangelista Purkyně University, Ústí nad Labem, Czech Republic; Department of Econometrics, Prague University of Economics and Business, Czech Republic.
- Milan Hladík
- Department of Applied Mathematics, School of Computer Science, Faculty of Mathematics and Physics, Charles University, Prague, Czech Republic; Department of Econometrics, Prague University of Economics and Business, Czech Republic.
5
Zhao Y, Yang L. Distance metric learning based on the class center and nearest neighbor relationship. Neural Netw 2023; 164:631-644. PMID: 37245477. DOI: 10.1016/j.neunet.2023.05.004.
Abstract
Distance metric learning has been a promising technique for improving the performance of algorithms that rely on distance metrics. Existing distance metric learning methods are based either on the class center or on the nearest neighbor relationship. In this work, we propose a new distance metric learning method based on the class center and nearest neighbor relationship (DMLCN). Specifically, when the centers of different classes overlap, DMLCN first splits each class into several clusters and uses one center to represent each cluster. Then, a distance metric is learned such that each example is close to the corresponding cluster center and the nearest neighbor relationship is preserved in each receptive field. Therefore, while characterizing the local structure of the data, the proposed method encourages intra-class compactness and inter-class dispersion simultaneously. Further, to better handle complex data, we introduce multiple metrics into DMLCN (MMLCN) by learning a local metric for each center. A new classification decision rule is then designed based on the proposed methods. Moreover, we develop an iterative algorithm to optimize the proposed methods, whose convergence and complexity are analyzed theoretically. Experiments on different types of data sets, including artificial data sets, benchmark data sets, and noisy data sets, show the feasibility and effectiveness of the proposed methods.
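As a toy illustration of the center-based part of the idea (learning a Mahalanobis matrix that pulls samples toward their own cluster or class center and pushes them away from the nearest other center), the gradient-descent sketch below is an assumption for exposition, not the DMLCN objective or its optimization algorithm:

```python
import numpy as np

def learn_center_metric(X, y, n_iter=200, lr=0.01, margin=1.0):
    """Learn M = L^T L so that d_M(x, center(y)) is small and the distance to
    the nearest other center exceeds it by at least `margin` (hinge)."""
    d = X.shape[1]
    classes = np.unique(y)
    centers = np.stack([X[y == c].mean(axis=0) for c in classes])
    L = np.eye(d)
    for _ in range(n_iter):
        grad = np.zeros_like(L)
        for xi, yi in zip(X, y):
            diffs = xi - centers                          # (n_classes, d)
            dists = np.sum((diffs @ L.T) ** 2, axis=1)    # squared Mahalanobis distances
            own = np.where(classes == yi)[0][0]
            other = np.argmin(np.where(classes == yi, np.inf, dists))
            grad += 2 * np.outer(L @ diffs[own], diffs[own])        # pull to own center
            if dists[own] + margin > dists[other]:                  # hinge active
                grad -= 2 * np.outer(L @ diffs[other], diffs[other])  # push from wrong center
        L -= lr * grad / len(X)
    return L.T @ L   # the learned Mahalanobis matrix M
```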
Affiliation(s)
- Yifeng Zhao
- College of Information and Electrical Engineering, China Agricultural University, Beijing, China
- Liming Yang
- College of Information and Electrical Engineering, China Agricultural University, Beijing, China; College of Science, China Agricultural University, Beijing, Haidian, 100083, China.
6
Dai Y, Zhang Y, Wu Q. Over-relaxed multi-block ADMM algorithms for doubly regularized support vector machines. Neurocomputing 2023. DOI: 10.1016/j.neucom.2023.01.082.
7
Xiao Y, Liu J, Wen K, Liu B, Zhao L, Kong X. A least squares twin support vector machine method with uncertain data. Appl Intell 2022. DOI: 10.1007/s10489-022-03897-3.
8
Akbari MG, Khorashadizadeh S, Majidi MH. Support vector machine classification using semi-parametric model. Soft Comput 2022. DOI: 10.1007/s00500-022-07376-2.
9
Computer-Aided Multiclass Classification of Corn from Corn Images Integrating Deep Feature Extraction. Comput Intell Neurosci 2022; 2022:2062944. PMID: 35990122. PMCID: PMC9385333. DOI: 10.1155/2022/2062944.
Abstract
Corn is of great importance for agricultural production and animal feed. Obtaining pure corn seed in corn production is quite significant for seed quality, so distinguishing among the numerous corn varieties plays an essential role in marketing. This study was conducted with 14,469 images of the BT6470, Calipso, Es_Armandi, and Hiva corn varieties licensed by BIOTEK. Classification of the images was carried out in three stages. In the first stage, deep features of the four corn varieties were extracted with the pretrained CNN model SqueezeNet, yielding 1000 deep features for each image. In the second stage, to reduce the features obtained from deep feature extraction with SqueezeNet, separate feature selection processes were performed with the Bat Optimization (BA), Whale Optimization (WOA), and Gray Wolf Optimization (GWO) algorithms. Finally, in the last stage, the features obtained from the first and second stages were classified using the machine learning methods Decision Tree (DT), Naive Bayes (NB), multi-class Support Vector Machine (mSVM), k-Nearest Neighbor (KNN), and Neural Network (NN). When classifying the features obtained in the first stage, the mSVM model achieved the highest classification success at 89.40%. In the second stage, classification of the active features selected by the three feature selection algorithms (BA, WOA, GWO) with the mSVM model yielded 88.82%, 88.72%, and 88.95% accuracy, respectively. These accuracies are close to those obtained in the first stage; however, the feature selection algorithms allowed successful classification with fewer features and in a shorter time. The results of the study, in which corn varieties were classified inexpensively, objectively, and with shorter processing times, offer a different perspective on classification performance.
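A rough sketch of the first and third stages (deep feature extraction with a pretrained SqueezeNet followed by a multi-class SVM); the file lists, labels, and hyperparameters below are placeholders, and the feature-selection stage is omitted:

```python
import torch
from torchvision import models, transforms
from PIL import Image
from sklearn.svm import SVC
import numpy as np

# 1000-dimensional deep features taken from the pretrained SqueezeNet output
model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.DEFAULT).eval()
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(image_paths):
    feats = []
    for p in image_paths:
        x = preprocess(Image.open(p).convert("RGB")).unsqueeze(0)
        feats.append(model(x).squeeze(0).numpy())   # 1000-d feature vector per image
    return np.stack(feats)

# hypothetical file lists and labels
# X_train = extract_features(train_paths); X_test = extract_features(test_paths)
# clf = SVC(kernel="rbf").fit(X_train, y_train)     # one-vs-one multi-class SVM
# print(clf.score(X_test, y_test))
```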
10
Ganaie M, Tanveer M. KNN weighted reduced universum twin SVM for class imbalance learning. Knowl Based Syst 2022. DOI: 10.1016/j.knosys.2022.108578.
11
Ramp loss KNN-weighted multi-class twin support vector machine. Soft Comput 2022. DOI: 10.1007/s00500-022-07040-9.
12
Ganaie M, Tanveer M. Fuzzy least squares projection twin support vector machines for class imbalance learning. Appl Soft Comput 2021. DOI: 10.1016/j.asoc.2021.107933.
13
Fazakas-Anca IS, Modrea A, Vlase S. Determination of Reactivity Ratios from Binary Copolymerization Using the k-Nearest Neighbor Non-Parametric Regression. Polymers (Basel) 2021; 13:3811. PMID: 34771367. PMCID: PMC8588380. DOI: 10.3390/polym13213811.
Abstract
This paper proposes a new method for calculating the monomer reactivity ratios for binary copolymerization based on the terminal model. The optimization method combines a numerical integration algorithm with an optimization algorithm based on k-nearest neighbor non-parametric regression. The calculation method was tested on simulated and experimental data sets at low (<10%), medium (10–35%) and high conversions (>40%), yielding reactivity ratios in good agreement with the usual methods such as intersection, Fineman–Ross, reverse Fineman–Ross, Kelen–Tüdös, extended Kelen–Tüdös and the error-in-variables method. The experimental data sets used in this comparative analysis are the copolymerization of 2-(N-phthalimido)ethyl acrylate with 1-vinyl-2-pyrrolidone for low conversion, of isoprene with glycidyl methacrylate for medium conversion, and of N-isopropylacrylamide with N,N-dimethylacrylamide for high conversion. The possibility of estimating experimental errors from a single experimental data set consisting of n experimental points is also shown.
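The full method combines numerical integration with k-nearest-neighbor non-parametric regression. A much simpler sketch, using only the instantaneous (low-conversion) terminal-model composition equation and a brute-force grid search over (r1, r2), illustrates the underlying fitting problem; this is not the authors' algorithm:

```python
import numpy as np

def mayo_lewis(f1, r1, r2):
    """Instantaneous copolymer composition F1 from monomer feed fraction f1
    under the terminal model (Mayo-Lewis equation)."""
    f2 = 1.0 - f1
    return (r1 * f1**2 + f1 * f2) / (r1 * f1**2 + 2 * f1 * f2 + r2 * f2**2)

def fit_reactivity_ratios(f1_data, F1_data, grid=np.linspace(0.01, 5.0, 200)):
    """Least-squares grid fit of (r1, r2) to measured feed/copolymer composition
    pairs; valid only at low conversion, where composition drift is negligible."""
    best, best_err = (None, None), np.inf
    for r1 in grid:
        for r2 in grid:
            err = np.sum((mayo_lewis(f1_data, r1, r2) - F1_data) ** 2)
            if err < best_err:
                best, best_err = (r1, r2), err
    return best

# synthetic example with known ratios r1 = 0.5, r2 = 1.5
f1 = np.linspace(0.1, 0.9, 9)
F1 = mayo_lewis(f1, 0.5, 1.5) + np.random.normal(0, 0.005, f1.size)
print(fit_reactivity_ratios(f1, F1))
```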
Affiliation(s)
- Arina Modrea
- George Emil Palade University of Medicine, Pharmacy, Science and Technology of Targu Mures, 300134 Targu Mures, Romania
- Correspondence: (A.M.); (S.V.); Tel.: +40-722-643020 (S.V.)
- Sorin Vlase
- Department of Mechanical Engineering, Transilvania University of Brasov, B-dul Eroilor 20, 500036 Brasov, Romania
- Romanian Academy of Technical Sciences, B-dul Dacia 26, 030167 Bucharest, Romania
- Correspondence: (A.M.); (S.V.); Tel.: +40-722-643020 (S.V.)
14
Tanveer M, Ganaie M, Suganthan P. Ensemble of classification models with weighted functional link network. Appl Soft Comput 2021. DOI: 10.1016/j.asoc.2021.107322.