Distributed Dictionary Learning Over Heterogeneous Clients Using Local Adaptive Dictionaries

2021 
This work examines the use of dictionary learning among distributed clients with heterogeneous tasks. We propose a distributed dictionary learning algorithm that enables collaborative training of a shared global dictionary among clients while adaptively constructing local dictionary elements to address the heterogeneity of local tasks. The proposed distributed dictionary learning with local adaptive dictionaries (DDL-LAD) algorithm consists of two parts: a distributed optimization procedure that enables joint training of the dictionaries without sharing the local datasets with the server, and a splitting and elimination procedure that adaptively constructs local dictionary elements. The splitting procedure identifies elements in the global dictionary that exhibit discriminative features for the local tasks; these elements are split and appended to the local dictionaries. Then, to avoid overgrowth of the local dictionaries, an elimination procedure prunes elements with low usage. Experiments on a distributed EMNIST dataset are provided to demonstrate the effectiveness of the proposed DDL-LAD algorithm compared to existing schemes that adopt only a global shared dictionary.
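To make the two-part structure concrete, the following is a minimal sketch of a single client round, assuming a generic sparse-coding-based dictionary learner. The abstract does not specify DDL-LAD's update rules, splitting criterion, or pruning rule, so the usage-based scoring, the `split_thresh` and `usage_thresh` parameters, and the gradient-style global update below are all hypothetical stand-ins; only the overall pattern (local split/eliminate plus a global update computed without sharing raw data) follows the description above.

```python
# Illustrative sketch only: thresholds and scoring are assumptions, not the paper's method.
import numpy as np
from sklearn.linear_model import orthogonal_mp


def sparse_code(D, X, n_nonzero=5):
    """Sparse-code the columns of X against dictionary D (OMP); returns (k, n) coefficients."""
    return orthogonal_mp(D, X, n_nonzero_coefs=n_nonzero)


class Client:
    def __init__(self, data, local_capacity=16):
        self.X = data                                  # local samples, shape (d, n); never leave the client
        self.D_local = np.empty((data.shape[0], 0))    # adaptively grown local atoms
        self.capacity = local_capacity

    def local_round(self, D_global, split_thresh=0.5, usage_thresh=0.05):
        # Code local data against the concatenation of global and local atoms.
        D = np.hstack([D_global, self.D_local]) if self.D_local.size else D_global
        A = sparse_code(D, self.X)                     # coefficients, shape (k, n)
        usage = np.mean(np.abs(A) > 1e-8, axis=1)      # fraction of samples using each atom

        # Splitting: copy global atoms that the local task relies on heavily
        # (a stand-in for the paper's "discriminative feature" criterion).
        k_g = D_global.shape[1]
        heavy = np.where(usage[:k_g] > split_thresh)[0]
        if heavy.size:
            new_atoms = D_global[:, heavy] + 0.01 * np.random.randn(D_global.shape[0], heavy.size)
            new_atoms /= np.linalg.norm(new_atoms, axis=0, keepdims=True)
            self.D_local = np.hstack([self.D_local, new_atoms]) if self.D_local.size else new_atoms

        # Elimination: prune rarely used local atoms so the local dictionary cannot overgrow.
        if self.D_local.size:
            local_usage = usage[k_g:k_g + self.D_local.shape[1]]
            self.D_local = self.D_local[:, local_usage >= usage_thresh][:, :self.capacity]

        # Return only a global-dictionary update; the raw data X is never shared with the server.
        residual = D @ A - self.X
        return residual @ A[:k_g].T / self.X.shape[1]
```

In this sketch the server would average the returned updates across clients and apply them to the shared global dictionary, while each client's `D_local` stays private and adapts to its own task.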