Prior Knowledge Regularized Self-Representation Model for Partial Multilabel Learning

2021 
Partial multilabel learning (PML) aims to learn from training data in which each instance is associated with a set of candidate labels, only a subset of which are correct. The common strategy for such problems is disambiguation, that is, identifying the ground-truth labels among the given candidates. However, existing PML approaches typically focus on leveraging instance relationships to disambiguate the noisy label space, while potentially useful information in the label space itself is not effectively explored. Meanwhile, noise and outliers in the training data make the disambiguation operation less reliable, which inevitably decreases the robustness of the learned model. In this article, we propose a prior-label-knowledge-regularized self-representation PML approach, called PAKS, in which the self-representation scheme and prior label knowledge are jointly incorporated into a unified framework. Specifically, we introduce a self-representation model with a low-rank constraint, which learns subspace representations of distinct instances and explores the high-order underlying correlations among them. We then incorporate prior label knowledge into this self-representation model, treating it as a complement to the features in order to obtain an accurate self-representation matrix. The core of PAKS is to exploit the data membership preference, derived from the prior label knowledge, to purify the discovered membership of the data and thereby obtain a more representative feature subspace for model induction. Extensive experiments on both synthetic and real-world datasets show that our proposed approach achieves performance superior or comparable to state-of-the-art approaches.
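The abstract describes a self-representation model with a low-rank constraint, regularized by prior label knowledge. A minimal sketch of such an objective, solved by proximal gradient descent, might look like the following. The objective form, the variable names, and all hyperparameters here are illustrative assumptions, not the authors' actual PAKS formulation:

```python
import numpy as np

def self_representation_sketch(X, Y, lam=1.0, beta=1.0, lr=0.005, iters=100):
    """Illustrative sketch (NOT the authors' PAKS algorithm) of a
    label-regularized low-rank self-representation objective:

        min_Z ||X - X Z||_F^2 + beta * ||Y - Y Z||_F^2 + lam * ||Z||_*

    X: (d, n) feature matrix, columns are instances.
    Y: (q, n) candidate-label matrix, used as a complement to the features.
    The nuclear norm ||Z||_* (low-rank constraint) is handled by
    singular-value soft-thresholding after each gradient step.
    """
    n = X.shape[1]
    Z = np.zeros((n, n))
    for _ in range(iters):
        # Gradient of the two smooth reconstruction terms.
        grad = 2 * X.T @ (X @ Z - X) + 2 * beta * Y.T @ (Y @ Z - Y)
        Z = Z - lr * grad
        # Proximal step for the nuclear norm: shrink singular values.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lr * lam, 0.0)
        Z = U @ np.diag(s) @ Vt
    return Z
```

The resulting matrix `Z` encodes each instance as a combination of the others, with the label term biasing the learned membership toward instances sharing candidate labels.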