Clustering-Guided Incremental Learning of Tasks

2021 
Incremental deep learning aims to learn a sequence of tasks without forgetting the knowledge of earlier ones. One naive approach with a deep architecture is to grow its capacity as the number of tasks increases; however, this incurs heavy memory consumption and is impractical. If we instead keep the capacity fixed, we face another challenging problem, catastrophic forgetting, which causes a notable degradation of performance on previously learned tasks. To overcome both problems, we propose a clustering-guided incremental learning approach that mitigates catastrophic forgetting without increasing the capacity of the architecture. The proposed approach adopts a parameter-splitting strategy that assigns a subset of the architecture's parameters to each task to prevent forgetting. It uses a clustering approach to discover relationships between tasks by storing a few samples per task. When learning a new task, we exploit the knowledge of the relevant tasks together with the current task to improve performance. This maximizes the efficiency attainable within a single fixed architecture. Experimental results on a number of fine-grained datasets show that our method outperforms existing competitors.
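The abstract describes two mechanisms: splitting a fixed parameter budget across tasks, and using stored exemplars to find which earlier tasks are relevant to a new one. The following is a minimal sketch of those two ideas only; the function names, the equal-share split, and the centroid-distance ranking are our own illustrative assumptions, not the paper's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_parameters(n_params, n_tasks):
    """Hypothetical parameter-splitting: give each task a disjoint,
    roughly equal share of a fixed parameter-index budget."""
    idx = rng.permutation(n_params)
    return np.array_split(idx, n_tasks)

def relevant_tasks(exemplars, new_samples, k=2):
    """Rank previously seen tasks by the distance between the centroid
    of their stored exemplars and the centroid of the new task's samples
    (a simple stand-in for the paper's clustering step)."""
    new_centroid = new_samples.mean(axis=0)
    dists = {t: np.linalg.norm(x.mean(axis=0) - new_centroid)
             for t, x in exemplars.items()}
    return sorted(dists, key=dists.get)[:k]

# Toy usage: 3 old tasks, a few stored exemplars each, one new task.
subsets = split_parameters(n_params=100, n_tasks=4)
exemplars = {t: rng.normal(loc=t, scale=0.1, size=(5, 8)) for t in range(3)}
new_task = rng.normal(loc=2.1, scale=0.1, size=(5, 8))
print(relevant_tasks(exemplars, new_task))
```

In this toy setup the new task's samples lie near task 2's exemplars, so task 2 is ranked first; the knowledge assigned to that parameter subset would then be reused when learning the new task.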