Coarse-to-Fine Pseudo Supervision Guided Meta-Task Optimization for Few-Shot Object Classification

2022 
Abstract
Few-Shot Learning (FSL) is a challenging and practical learning paradigm that aims to solve a target task with only a few labeled examples. The field of FSL has made great progress, but largely in the supervised setting, where a large auxiliary labeled dataset is required for offline training. The unsupervised FSL (UFSL) problem, in which the auxiliary dataset is fully unlabeled, has seldom been investigated despite its significant value. This paper focuses on the more general and challenging UFSL problem and presents a novel method named Coarse-to-Fine Pseudo Supervision-guided Meta-Learning (C2FPS-ML) for unsupervised few-shot object classification. It first obtains prior knowledge from an unlabeled auxiliary dataset during unsupervised meta-training and then uses that prior knowledge to assist the downstream few-shot classification task. The coarse-to-fine pseudo supervisions in C2FPS-ML aim to optimize the meta-task sampling process in the unsupervised meta-training stage, which is one of the dominant factors in the performance of meta-learning-based FSL algorithms. Humans can learn new concepts progressively or hierarchically, in a coarse-to-fine manner. By simulating this behaviour, we develop two versions of C2FPS-ML for two different scenarios: one for natural object datasets and one for other kinds of datasets (e.g., handwritten character datasets). For the natural object scenario, we propose to exploit the potential hierarchical semantics of the unlabeled auxiliary dataset to build a tree-like structure of visual concepts. For the other scenario, progressive pseudo supervision is obtained by forming clusters according to different aspects of similarity and is represented by a pyramid-like structure. The obtained structure is applied as supervision to construct meta-tasks in the meta-training stage, and prior knowledge from the unlabeled auxiliary dataset is learned from the coarse-grained level to the fine-grained level. The proposed method sets a new state of the art on the gold-standard miniImageNet and achieves remarkable results on Omniglot while simultaneously increasing efficiency.
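To make the coarse-to-fine meta-task construction concrete, the sketch below is a minimal, hypothetical illustration (not the authors' implementation): unlabeled embeddings are clustered at several granularities to form a pyramid of pseudo-labels, and N-way K-shot episodes are then sampled from a coarse level first and from finer levels as meta-training progresses. The feature source, number of levels, cluster counts, and schedule are all assumptions made for illustration.

```python
# Hedged sketch of coarse-to-fine pseudo-supervised meta-task sampling.
# The embedding source, cluster counts, and coarse-to-fine schedule below
# are illustrative assumptions, not details taken from the paper.
import numpy as np
from sklearn.cluster import KMeans


def build_pseudo_label_pyramid(embeddings, cluster_counts=(8, 64, 512), seed=0):
    """Cluster the same embeddings at increasingly fine granularities.

    Returns a list of pseudo-label arrays, ordered coarse -> fine.
    """
    pyramid = []
    for k in cluster_counts:
        labels = KMeans(n_clusters=k, random_state=seed, n_init=10).fit_predict(embeddings)
        pyramid.append(labels)
    return pyramid


def sample_meta_task(pseudo_labels, n_way=5, k_shot=1, q_query=15, rng=None):
    """Sample one N-way K-shot episode, treating pseudo-labels as classes."""
    rng = rng if rng is not None else np.random.default_rng()
    classes = rng.choice(np.unique(pseudo_labels), size=n_way, replace=False)
    support, query = [], []
    for c in classes:
        idx = rng.permutation(np.flatnonzero(pseudo_labels == c))
        support.extend(idx[:k_shot])
        query.extend(idx[k_shot:k_shot + q_query])
    return np.array(support), np.array(query)


if __name__ == "__main__":
    embeddings = np.random.randn(10000, 128)          # placeholder features
    pyramid = build_pseudo_label_pyramid(embeddings)
    for epoch in range(30):
        level = min(epoch // 10, len(pyramid) - 1)    # coarse -> fine schedule
        support, query = sample_meta_task(pyramid[level])
        # ... feed the (support, query) episode to the meta-learner ...
```

In this sketch the same flat clustering is simply rerun with more clusters per level; the paper's tree-like structure for natural object datasets and its pyramid built from different similarity aspects would replace this step with their respective hierarchy-construction procedures.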