Robust classification with feature selection using alternating minimization and Douglas-Rachford splitting method

2019 
This paper deals with supervised classification and feature selection. A classical approach is to project the data onto a low-dimensional space under a strict sparsity constraint. This leads to an optimization problem that minimizes the within-cluster sum of squares (a Frobenius-norm loss) with an ℓ1 penalty to promote sparsity. The Frobenius norm, however, is well known not to be robust to outliers. In this paper, we propose an alternative approach that uses an ℓ1 norm both for the constraint and for the loss function. Since the ℓ1 criterion is convex but not gradient-Lipschitz, we advocate the use of a Douglas-Rachford approach. Taking advantage of the particular form of the cost and using a change of variable, we derive a new, efficient, tailored primal Douglas-Rachford splitting algorithm. We also provide an efficient classifier in the projected space based on medoid modeling. The resulting algorithm, which combines alternating minimization and primal Douglas-Rachford splitting, is coined ADRS. Experiments on biological data sets and a computer vision data set show that our method significantly improves on the results obtained with a quadratic loss function.