Compounding General Purpose Membership Functions for Fuzzy Support Vector Machine Under Noisy Environment

2017 
Fuzzy support vector machine (FSVM) is accepted as a significant improvement over soft-margin SVMs such as $C$-SVM, because the latter give suboptimal results in the presence of outliers. FSVM's ability to handle outliers depends strongly on how well the training samples are assigned fuzzy membership values (MVs). Traditionally, the membership functions (MFs) used for the FSVM were custom made for specific applications, and an MF designed for one application could, in general, not be used for others. To overcome this, general purpose membership functions (GPMFs) are defined in this paper as MFs that can be used universally across applications and that statistically perform better than $C$-SVM. This paper contributes to the GPMF literature in two stages. First, with the help of convex hulls, it presents the limitations that the FSVM faces when all samples of a class are treated with a single MF, and it recommends treating the data differentially by dividing them into two fuzzy sets: one containing possible nonoutliers and the other containing possible outliers. While possible outliers are modeled with a normal MF, possible nonoutliers are assigned a constant MV of "1." Second, the paper introduces novel GPMFs that use clustering techniques to recognize possible outliers, and that use set measures such as the Hausdorff distance and the point-to-set (pt-set) distance to define new MF heuristics. To establish conclusions, the introduced GPMFs are thoroughly evaluated and statistically compared with earlier GPMFs on 15 real-world benchmark datasets. The results are very encouraging, showing that the proposed GPMFs not only perform significantly better in treating class noise, but also execute with efficient run-time complexity.
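The following is a minimal sketch, not the authors' implementation, of the general idea the abstract describes: samples flagged as possible outliers by a clustering step receive distance-based fuzzy membership values, possible nonoutliers keep an MV of 1, and the memberships enter a soft-margin SVM as per-sample weights. The helper name `assign_memberships`, the use of k-means to split each class into a core and a possible-outlier group, and the exponential (normal-shaped) decay are all illustrative assumptions, not the paper's exact GPMFs.

```python
# Hedged sketch of clustering-based fuzzy membership assignment for an FSVM-style
# weighted SVM. Assumptions (not from the paper): k-means with two clusters per
# class, the smaller cluster treated as possible outliers, and a normal-shaped
# decay of the membership value with distance from the core-cluster centre.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC


def assign_memberships(X, y, decay=2.0):
    """Return a fuzzy membership value in (0, 1] for every training sample."""
    mv = np.ones(len(X))
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        Xc = X[idx]
        # Split the class into two clusters; treat the smaller one as the
        # set of possible outliers, the larger one as possible nonoutliers.
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(Xc)
        sizes = np.bincount(km.labels_, minlength=2)
        outlier_cluster = int(np.argmin(sizes))
        core_center = km.cluster_centers_[1 - outlier_cluster]

        is_outlier = km.labels_ == outlier_cluster
        if not is_outlier.any():
            continue
        # Possible nonoutliers keep MV = 1; possible outliers get a
        # normal-shaped MV that decays with their distance to the core centre.
        d = np.linalg.norm(Xc[is_outlier] - core_center, axis=1)
        d = d / (d.max() + 1e-12)
        mv[idx[is_outlier]] = np.exp(-decay * d ** 2)
    return mv


# Toy usage: two-class data with injected class noise; the memberships act as
# per-sample weights for a standard C-SVM, mimicking the FSVM formulation.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
y[:5] = 1  # a few mislabeled samples (class noise)

weights = assign_memberships(X, y)
clf = SVC(kernel="rbf", C=1.0).fit(X, y, sample_weight=weights)
print("training accuracy:", clf.score(X, y))
```

In this sketch, downweighting the suspected outliers reduces their influence on the margin, which is the mechanism by which FSVM tolerates class noise better than an unweighted $C$-SVM.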