*Recent Progress in Margin Theory*

Recent margin theory indicates that maximizing the minimum margin does not necessarily yield optimal generalization performance. In the SVC algorithm, once the kernel width coefficient q is selected, the distribution of the data points mapped into the feature space is determined. If the distribution of the boundary data differs from that of the internal data, the hyperplane constructed by SVC may fail to exploit the data information fully, which degrades the performance of SVC. Moreover, we observe that in practice SVC often overfits, producing too many support vectors. Gao and Zhou have demonstrated that the margin distribution is critical to generalization performance [13]. The benefit of optimizing the margin distribution has also been shown in v-MADR, which minimizes both the mean and the variance of the absolute regression deviation [18]. We further note that SVC can be regarded as a binary classifier defined by its separating hyperplane. Inspired by this research, we introduce the mean and variance of the margin distribution and minimize them to reduce the number of support vectors.
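To make the two statistics concrete, the following sketch computes the mean and variance of the signed margins for a linear decision function. This is only an illustration of the quantities involved, not the paper's algorithm; the function name `margin_stats`, the toy data, and the hyperplane parameters are all hypothetical, and the feature-space mapping induced by the kernel is omitted for simplicity.

```python
# Illustration only (not the paper's method): for a linear decision
# function f(x) = w.x + b, the signed margin of a labeled point
# (x_i, y_i) is gamma_i = y_i * f(x_i). The margin-distribution
# statistics discussed above are the mean and variance of the gamma_i.

def margin_stats(w, b, X, y):
    """Return (mean, variance) of the signed margins y_i * (w.x_i + b)."""
    margins = [yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
               for xi, yi in zip(X, y)]
    mean = sum(margins) / len(margins)
    var = sum((m - mean) ** 2 for m in margins) / len(margins)
    return mean, var

# Toy data: two points per class, with a hypothetical hyperplane
# w = (1, 0), b = 0 separating them along the first coordinate.
X = [(2.0, 1.0), (1.0, -1.0), (-2.0, 0.5), (-1.0, 2.0)]
y = [1, 1, -1, -1]
mean, var = margin_stats((1.0, 0.0), 0.0, X, y)
```

A smaller variance indicates that the points lie at similar distances from the hyperplane, which is the property the proposed objective encourages.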

For the convenience of readers, a more detailed description of SVC is presented in Appendix A.
