NBWELM: naive Bayesian based weighted extreme learning machine

2018 
Weighted extreme learning machines (WELMs) aim to find a better tradeoff between empirical and structural risk, and thus achieve good generalization performance, especially on imbalanced classification problems. Existing weighting strategies assign distribution-independent weight matrices to WELMs, i.e., the weights ignore the probabilistic information of the samples. As a result, WELM amplifies the effect of outliers to some extent. In this paper, a naive Bayesian based WELM (NBWELM) is proposed, in which the weights are determined with a flexible naive Bayesian (FNB) classifier. By calculating the posterior probability of each sample, NBWELM not only handles outliers effectively but also simultaneously accounts for two different kinds of weighting information: the training error in the weighted regularized ELM (WRELM) and the class distribution in Zong et al.'s WELM (ZWELM). Experimental results on 45 KEEL and UCI datasets show that the proposed NBWELM further improves the generalization capability of WELM and thus achieves higher classification accuracy than WRELM and ZWELM. Meanwhile, NBWELM does not noticeably increase the computational complexity of WELM, owing to the simplicity of FNB.
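To make the weighting idea concrete, the following is a minimal, hypothetical sketch of a weighted regularized ELM in Python/NumPy. It is not the authors' implementation: it solves the standard weighted ridge solution beta = (H^T W H + I/C)^{-1} H^T W T, where the diagonal weight matrix W is supplied by the caller. With per-class inverse-frequency weights it corresponds to the ZWELM-style class-distribution weighting mentioned above; a posterior-probability weight (as NBWELM derives from FNB) could be plugged in the same way. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def welm_train(X, y, n_hidden=50, C=1.0, weights=None, seed=None):
    """Minimal weighted regularized ELM (illustrative sketch, not the paper's code).

    weights : per-sample weight vector (the diagonal of W); defaults to all ones,
              which recovers the unweighted regularized ELM.
    Solves beta = (H^T W H + I/C)^{-1} H^T W T.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Win = rng.standard_normal((d, n_hidden))   # random input weights (fixed, not trained)
    b = rng.standard_normal(n_hidden)          # random hidden biases
    H = sigmoid(X @ Win + b)                   # hidden-layer output matrix
    classes = np.unique(y)
    # one-hot targets coded in {-1, +1}, one column per class
    T = np.where(y[:, None] == classes[None, :], 1.0, -1.0)
    w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
    HtW = H.T * w                              # H^T W without materializing diag(w)
    beta = np.linalg.solve(HtW @ H + np.eye(n_hidden) / C, HtW @ T)
    return Win, b, beta, classes

def welm_predict(model, X):
    Win, b, beta, classes = model
    return classes[np.argmax(sigmoid(X @ Win + b) @ beta, axis=1)]

# Usage on an imbalanced toy problem: 100 majority vs. 10 minority samples.
rng = np.random.default_rng(0)
X = np.vstack([rng.standard_normal((100, 2)),
               rng.standard_normal((10, 2)) + 4.0])
y = np.array([0] * 100 + [1] * 10)
w = 1.0 / np.bincount(y)[y]                    # ZWELM-style inverse class-frequency weights
model = welm_train(X, y, n_hidden=30, C=10.0, weights=w, seed=0)
pred = welm_predict(model, X)
```

The design point the sketch illustrates is that the weighting strategy is decoupled from the ELM solver: WRELM, ZWELM, and NBWELM differ only in how the vector `w` is computed, which is why the FNB-based posterior weights add little computational overhead.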