Towards Accelerated Rates for Distributed Optimization over Time-Varying Networks

2021 
We study the problem of decentralized optimization with strongly convex, smooth cost functions, and investigate accelerated algorithms under time-varying network constraints. In our approach, nodes run a multi-step gossip procedure after each gradient update, thus maintaining approximate consensus at every iteration. The outer loop is based on Nesterov's accelerated scheme. Both the computation and communication complexities of our method depend optimally on the global condition number \(\kappa_g\). In particular, the algorithm reaches the optimal computation complexity \(O(\sqrt{\kappa_g}\log(1/\varepsilon))\).
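The abstract outlines the method's structure: a Nesterov-style accelerated outer loop, with several gossip rounds after every gradient step to keep the node copies approximately in consensus. Below is a minimal illustrative sketch of that structure, not the paper's actual algorithm: the quadratic local functions, the time-varying ring topology, the step size \(1/L_g\), the momentum parameter, and the number of gossip rounds `T` are all assumptions made for this example.

```python
import numpy as np

# Minimal sketch, assuming local quadratics f_i(x) = 0.5 x^T A_i x - b_i^T x
# with diagonal Hessians A_i; all problem data and parameter choices below
# are hypothetical and only illustrate the structure of the method.
np.random.seed(0)
n, d = 10, 5
A = np.random.uniform(1.0, 10.0, (n, d))   # diagonal Hessians, one row per node
b = np.random.randn(n, d)

A_avg, b_avg = A.mean(axis=0), b.mean(axis=0)
L_g, mu_g = A_avg.max(), A_avg.min()       # global smoothness / strong convexity
kappa_g = L_g / mu_g                       # global condition number
x_star = b_avg / A_avg                     # minimizer of the average function

def gossip(X):
    """One gossip round on a hypothetical time-varying ring topology."""
    perm = np.random.permutation(n)        # the graph changes between rounds
    W = np.eye(n) * 0.5
    for k in range(n):                     # ring edges get weight 0.25 each way
        i, j = perm[k], perm[(k + 1) % n]
        W[i, j] += 0.25
        W[j, i] += 0.25
    return W @ X                           # W is symmetric, doubly stochastic

# Nesterov-style outer loop; T gossip rounds follow every gradient step,
# keeping the per-node iterates approximately in consensus.
eta = 1.0 / L_g                            # assumed step size
beta = (np.sqrt(kappa_g) - 1.0) / (np.sqrt(kappa_g) + 1.0)  # momentum weight
T = 8                                      # gossip rounds per iteration (assumed)
X = np.zeros((n, d))
X_prev = X.copy()
for _ in range(300):
    Y = X + beta * (X - X_prev)            # momentum extrapolation
    Z = Y - eta * (A * Y - b)              # local gradient steps (diagonal Hessians)
    for _ in range(T):                     # multi-step gossip procedure
        Z = gossip(Z)
    X_prev, X = X, Z

print("consensus spread :", np.abs(X - X.mean(axis=0)).max())
print("optimality error :", np.linalg.norm(X.mean(axis=0) - x_star))
```

In this sketch, the ratio of outer iterations to total gossip rounds mirrors the abstract's split between computation complexity (gradient steps, scaling as \(\sqrt{\kappa_g}\log(1/\varepsilon)\)) and communication complexity (gossip rounds, an extra factor of `T` per iteration).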