Source-Free Unsupervised Domain Adaptation with Sample Transport Learning

2021 
Unsupervised domain adaptation (UDA) has achieved great success in handling cross-domain machine learning applications. It typically benefits model training on the unlabeled target domain by leveraging knowledge from the labeled source domain. For this purpose, minimizing the marginal and conditional distribution divergences between the source and target domains is widely adopted in existing work. Nevertheless, for the sake of privacy preservation, the source domain often provides not its training data but only a trained predictor (e.g., a classifier). This renders the above approaches infeasible, because the marginal and conditional distributions of the source domain cannot be computed. To this end, this article proposes a source-free UDA method that jointly models domain adaptation and sample transport learning, namely Sample Transport Domain Adaptation (STDA). Specifically, STDA constructs a pseudo source domain according to the aggregated decision boundaries that multiple source classifiers produce on the target domain. It then refines the pseudo source domain by augmenting it with transported high-confidence target samples, and consequently generates labels for the target domain. We train the STDA model by alternating between domain adaptation and sample transport, eventually achieving knowledge adaptation to the target domain and obtaining confident labels for it. Finally, evaluation results validate the effectiveness and superiority of the proposed method.
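
The abstract outlines an alternating loop: aggregate the source classifiers' predictions on the target data, transport high-confidence target samples into a pseudo source domain, adapt, and relabel. The sketch below illustrates that loop under simplifying assumptions: averaged-softmax aggregation of the source classifiers and a plain self-training step standing in for the paper's actual transport and adaptation objectives. The names (stda_sketch, num_rounds, confidence) are illustrative and not taken from the paper.

    # Minimal sketch of the alternating procedure, under the assumptions stated above.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def stda_sketch(source_classifiers, X_target, num_rounds=5, confidence=0.9):
        """source_classifiers: pretrained models exposing predict_proba (no source data needed)."""
        # Step 1: aggregate the decision boundaries of the source classifiers on the target domain.
        probs = np.mean([clf.predict_proba(X_target) for clf in source_classifiers], axis=0)
        pseudo_labels = probs.argmax(axis=1)
        conf = probs.max(axis=1)

        target_model = None
        for _ in range(num_rounds):
            # Step 2: transport high-confidence target samples into the pseudo source domain.
            idx = conf >= confidence
            if idx.sum() < 2 or len(np.unique(pseudo_labels[idx])) < 2:
                break
            X_pseudo, y_pseudo = X_target[idx], pseudo_labels[idx]

            # Step 3: adapt a target model on the pseudo source domain (a stand-in for the
            # paper's domain-adaptation step), then relabel the whole target domain.
            target_model = LogisticRegression(max_iter=1000).fit(X_pseudo, y_pseudo)
            probs = target_model.predict_proba(X_target)
            pseudo_labels = probs.argmax(axis=1)
            conf = probs.max(axis=1)

        return target_model, pseudo_labels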