Age-aware Communication Strategy in Federated Learning with Energy Harvesting Devices

2021 
Federated learning (FL) is a privacy-preserving distributed machine learning framework in which model training is distributed over end devices, fully exploiting their scattered computation capability and training data. Unlike centralized machine learning, where the convergence time is determined by the number of training rounds, under FL the convergence time also depends on the communication delay and the computation delay of local training in each round. We therefore employ the total training delay as the performance metric in our strategy design. Note that the per-round training delay is sensitive to limited wireless resources and to system heterogeneity, as end devices have different computational and communication capabilities. To achieve timely parameter aggregation over limited spectrum, we incorporate the age of parameter into device scheduling for each training round, defined as the number of rounds elapsed since a device last uploaded its parameters. Moreover, since the diversity of uploaded parameters is important for training performance on non-IID data distributions, we exploit energy harvesting technology to prevent device drop-outs during the training process. In this paper, we propose an age-aware communication strategy for federated learning over wireless networks that jointly considers the staleness of parameters and the heterogeneous capabilities of end devices to realize fast and accurate model training. Numerical results demonstrate the effectiveness and accuracy of the proposed strategy.
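To make the age-of-parameter notion concrete, the following is a minimal sketch of the bookkeeping it implies: a device's age resets to zero when it is scheduled to upload and grows by one every round it is skipped, and the scheduler can then favor the stalest devices. All function names and the greedy stalest-first selection rule are illustrative assumptions, not the paper's actual scheduling algorithm (which also accounts for delay and energy).

```python
def update_ages(ages, scheduled):
    """Age of parameter: rounds elapsed since a device last uploaded.
    Scheduled devices reset to 0; all others age by one round.
    (Illustrative helper, not from the paper.)"""
    return {d: 0 if d in scheduled else a + 1 for d, a in ages.items()}


def schedule_stalest(ages, k):
    """Toy scheduler: pick the k devices with the largest age.
    The paper's strategy additionally weighs delay and energy."""
    return set(sorted(ages, key=lambda d: ages[d], reverse=True)[:k])


# Simulate a few rounds with 4 devices and 2 uplink slots per round.
ages = {d: 0 for d in range(4)}
for _ in range(3):
    chosen = schedule_stalest(ages, k=2)
    ages = update_ages(ages, chosen)
```

Under this toy rule the maximum age stays bounded (here by the ratio of devices to uplink slots), which captures why age-aware scheduling keeps every device's contribution from growing arbitrarily stale.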