Article

Simultaneous Feature Selection and Classification for Data-Adaptive Kernel-Penalized SVM

Xin Liu, Bangxin Zhao and Wenqing He *
1 School of Statistics and Management, Shanghai University of Finance and Economics, Shanghai 200433, China
2 Department of Statistical and Actuarial Sciences, University of Western Ontario, London, ON N6A 5B7, Canada
* Author to whom correspondence should be addressed.
Mathematics 2020, 8(10), 1846; https://doi.org/10.3390/math8101846
Submission received: 20 September 2020 / Revised: 11 October 2020 / Accepted: 15 October 2020 / Published: 20 October 2020
(This article belongs to the Special Issue Advances in Machine Learning Prediction Models)

Abstract

Simultaneous feature selection and classification have been explored in the literature to extend support vector machine (SVM) techniques by adding penalty terms directly to the loss function. However, it is the kernel function that controls the performance of the SVM, and imbalanced data can deteriorate that performance. In this paper, we examine a new method for simultaneous feature selection and binary classification. Instead of penalizing the standard loss function of the SVM, a penalty is added directly to the data-adaptive kernel function to control the performance of the SVM: the kernel function is first conformally transformed, and an SVM classifier is then refitted based on the sparse features selected. Both convex and non-convex penalties, including the least absolute shrinkage and selection operator (LASSO), the smoothly clipped absolute deviation (SCAD) penalty, and the minimax concave penalty (MCP), are explored, and the oracle property of the estimator is established accordingly. An iterative optimization procedure is applied because no closed-form expression for the estimated coefficients is available. Numerical comparisons show that the proposed method outperforms its competitors when data are imbalanced and performs similarly to them when data are balanced. The method can be readily applied to medical images from different platforms.
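To make the construction concrete, the following is an illustrative sketch only, not the authors' exact formulation: it combines a generic conformal transformation of a base kernel K with a penalized hinge-loss objective in which the penalty acts on feature weights entering the kernel. The rescaling factor c(·), the feature weights θ_j, the ridge parameter γ, the penalty p_λ, and the tuning parameter λ are all notation introduced here for illustration.

% Sketch under stated assumptions: generic conformal kernel transformation
% and penalized hinge-loss objective; not the paper's exact formulation.
\begin{align}
  \tilde{K}(\mathbf{x}, \mathbf{x}')
    &= c(\mathbf{x})\, c(\mathbf{x}')\, K(\mathbf{x}, \mathbf{x}'),
    \qquad c(\mathbf{x}) > 0, \\
  \min_{f,\, b,\, \boldsymbol{\theta}} \;
    & \frac{1}{n} \sum_{i=1}^{n}
      \bigl[ 1 - y_i \{ f(\mathbf{x}_i) + b \} \bigr]_{+}
      + \frac{\gamma}{2}\, \| f \|_{\tilde{K}}^{2}
      + \sum_{j=1}^{p} p_{\lambda}\!\bigl( |\theta_j| \bigr),
\end{align}
where $y_i \in \{-1, +1\}$ are class labels, $[u]_{+} = \max(u, 0)$ is the hinge loss, $f$ lies in the reproducing kernel Hilbert space induced by $\tilde{K}$ with feature weights $\boldsymbol{\theta} = (\theta_1, \ldots, \theta_p)^{\top}$ entering the kernel, and $p_{\lambda}(\cdot)$ is a convex or non-convex penalty (LASSO, SCAD, or MCP) with tuning parameter $\lambda$. In a sketch of this form, estimated weights $\hat{\theta}_j$ shrunk to zero correspond to features dropped before the SVM classifier is refitted on the selected sparse features.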
Keywords: classification; data-adaptive kernel; feature selection; penalty; predictive model; simultaneous classification; support vector machine

Share and Cite

MDPI and ACS Style

Liu, X.; Zhao, B.; He, W. Simultaneous Feature Selection and Classification for Data-Adaptive Kernel-Penalized SVM. Mathematics 2020, 8, 1846. https://doi.org/10.3390/math8101846

AMA Style

Liu X, Zhao B, He W. Simultaneous Feature Selection and Classification for Data-Adaptive Kernel-Penalized SVM. Mathematics. 2020; 8(10):1846. https://doi.org/10.3390/math8101846

Chicago/Turabian Style

Liu, Xin, Bangxin Zhao, and Wenqing He. 2020. "Simultaneous Feature Selection and Classification for Data-Adaptive Kernel-Penalized SVM" Mathematics 8, no. 10: 1846. https://doi.org/10.3390/math8101846

APA Style

Liu, X., Zhao, B., & He, W. (2020). Simultaneous Feature Selection and Classification for Data-Adaptive Kernel-Penalized SVM. Mathematics, 8(10), 1846. https://doi.org/10.3390/math8101846
