A novel meta-learning framework: Multi-features adaptive aggregation method with information enhancer

2021 
Abstract Deep learning has shown great potential in image classification due to its powerful feature extraction ability, which heavily depends on the number of available training samples. However, it remains a major challenge to obtain an effective feature representation, and further learn a reliable classifier, with deep networks when faced with few-shot classification tasks. This paper proposes a multi-features adaptive aggregation meta-learning method with an information enhancer for few-shot classification, referred to as MFAML. It contains three main modules: a feature extraction module, an information enhancer, and a multi-features adaptive aggregation classifier (MFAAC). During the meta-training stage, the information enhancer, composed of deconvolutional layers, is designed to promote the effective utilization of samples and thereby capture more valuable information during feature extraction. Simultaneously, the MFAAC module integrates the features from several convolutional layers of the feature extraction module. The aggregated features are then fed into a similarity module, enabling adaptive adjustment of the predicted label. The information enhancer and MFAAC are connected by a hybrid loss, providing an excellent feature representation. During the meta-test stage, the information enhancer is removed and the remaining architecture is kept for fast adaptation to the final target task. The whole MFAML framework is optimized with the model-agnostic meta-learning (MAML) strategy and can effectively improve generalization performance. Experimental results on several benchmark datasets demonstrate the superiority of the proposed method over other representative few-shot classification methods.
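The multi-features adaptive aggregation step described above can be sketched as follows. This is a minimal NumPy illustration under assumed details (cosine similarity between a query feature and per-class prototypes at each layer, with softmax-normalized learnable layer weights); the function and parameter names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two flattened feature vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def aggregate_scores(query_feats, proto_feats, layer_logits):
    """Adaptively aggregate per-layer similarities into class scores.

    query_feats : list over layers of (d_l,) query feature vectors
    proto_feats : list over layers of (n_classes, d_l) class prototypes
    layer_logits: (n_layers,) learnable scalars; softmax gives layer weights
    All names and shapes are assumptions made for illustration only.
    """
    weights = softmax(np.asarray(layer_logits, dtype=float))
    n_classes = proto_feats[0].shape[0]
    scores = np.zeros(n_classes)
    for w, q, protos in zip(weights, query_feats, proto_feats):
        # Weighted sum of this layer's similarity to each class prototype.
        scores += w * np.array([cosine_similarity(q, p) for p in protos])
    return scores  # predicted class = argmax(scores)
```

In this sketch, the per-layer weights play the role of the adaptive aggregation: layers whose features discriminate better for the current task can receive larger weight, and the weights would be learned jointly with the network under the MAML optimization loop.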