Feature Learning Viewpoint of AdaBoost and a New Algorithm

2019 
The AdaBoost algorithm is notably resistant to overfitting, and understanding why remains a fascinating fundamental theoretical problem. Many studies have sought to explain this behavior from the statistical view and from margin theory. In this paper, we illustrate the phenomenon from a feature learning viewpoint through a proposed AdaBoost+SVM algorithm. First, we run AdaBoost to learn the base classifiers. Then, instead of combining the base classifiers directly, we regard their outputs as features and feed them to an SVM classifier, which yields new coefficients and a bias that are used to construct the final classifier. We explain the rationality of this construction and establish a theorem stating that as the dimension of these features grows, the performance of the SVM does not degrade, which accounts for the resistance to overfitting of AdaBoost.
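The pipeline described above can be summarized in a minimal sketch, assuming scikit-learn as the toolkit; the dataset, the choice of decision stumps as base learners, and the linear SVM are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of the AdaBoost+SVM idea: base classifiers learned by AdaBoost
# are re-used as features for an SVM instead of being combined directly.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 1: run AdaBoost to learn the base classifiers (decision stumps by default).
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Step 2: treat each base classifier's predicted label as one feature,
# mapping every sample to an n_estimators-dimensional feature vector.
def base_features(model, X):
    return np.column_stack([est.predict(X) for est in model.estimators_])

F_tr, F_te = base_features(ada, X_tr), base_features(ada, X_te)

# Step 3: fit an SVM on these features; its weights and bias replace the
# AdaBoost voting coefficients in the final classifier.
svm = LinearSVC(C=1.0).fit(F_tr, y_tr)

print("AdaBoost accuracy:     ", ada.score(X_te, y_te))
print("AdaBoost+SVM accuracy: ", svm.score(F_te, y_te))
```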