Robust Weighted Linear Loss Twin Multi-Class Support Vector Regression for Large-Scale Classification

2020 
Abstract Least squares twin multi-class support vector machine (LST-KSVC) is an efficient algorithm under the "one-versus-one-versus-rest" approach. However, it has two drawbacks. First, the quadratic loss used to penalize rest-class samples forces both decision hyperplanes to pass through the rest class, so some rest-class samples are misclassified. Second, LST-KSVC assigns the same weight to normal samples and outliers, which makes it sensitive to outliers. To overcome these drawbacks, this paper presents a novel classifier called robust weighted linear loss based twin multi-class support vector regression (WLT-KSVC). First, WLT-KSVC penalizes rest-class samples with a weighted linear loss, which offers two advantages: it places the rest-class samples in a reasonable region to improve classification accuracy, and it reduces the impact of outliers in the rest class. Second, WLT-KSVC reduces the impact of outliers in the positive and negative classes by assigning each sample a weight obtained from the kernel-based possibilistic c-means (KPCM) algorithm. Third, WLT-KSVC adds a regularization term to the objective function to improve the model's generalization ability. Notably, its low computational complexity makes WLT-KSVC suitable for large-scale problems. Experimental results on UCI and NDC datasets demonstrate that WLT-KSVC achieves better performance than previous methods.
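The abstract describes down-weighting outliers via kernel-based possibilistic c-means memberships. A minimal sketch of that idea, assuming a single class, an RBF kernel, and a common possibilistic weighting formula based on each sample's feature-space distance to the class mean (the paper's exact KPCM formulation and parameter choices may differ; `kpcm_weights`, `gamma`, `m`, and the bandwidth heuristic `eta` are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kpcm_weights(X, gamma=1.0, m=2.0):
    """Possibilistic-style sample weights in (0, 1] from kernel distances
    to the feature-space class mean (illustrative sketch, not the paper's
    exact KPCM algorithm)."""
    K = rbf_kernel(X, X, gamma)
    # Squared distance of phi(x_j) to the feature-space mean of the class:
    # ||phi(x_j) - mean||^2 = K(x_j, x_j) - 2*mean_i K(x_j, x_i) + mean_{i,l} K(x_i, x_l)
    d2 = np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()
    eta = d2.mean() + 1e-12  # possibilistic bandwidth (heuristic choice)
    # Typicality-style membership: far-away samples (outliers) get small weight
    return 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))

# Toy class with one far-away outlier: the outlier receives a smaller weight,
# so it contributes less to the loss than the clustered inliers.
X = np.vstack([np.random.RandomState(0).randn(20, 2), [[8.0, 8.0]]])
w = kpcm_weights(X, gamma=0.5)
```

With weights like `w` in hand, the weighted linear loss in the training objective scales each sample's loss term by its weight, so outliers exert little pull on the two decision hyperplanes.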