Communication-Censored ADMM for Decentralized Consensus Optimization

2019 
In this paper, we devise a communication-efficient decentralized algorithm, named communication-censored alternating direction method of multipliers (ADMM) (COCA), to solve a convex consensus optimization problem defined over a network. Similar to popular decentralized consensus optimization algorithms such as ADMM, at every iteration of COCA a node exchanges its local variable with its neighbors, and then updates its local variable according to the received neighboring variables and its local cost function. A distinguishing feature of COCA is that a node is not allowed to transmit its local variable to its neighbors if this variable is not sufficiently different from the previously transmitted one. The sufficiency of the difference is evaluated by a properly designed censoring function. Though this censoring strategy may slow down the optimization process, it effectively reduces the communication cost. We prove that when the censoring function is properly chosen, COCA converges to an optimal solution of the convex consensus optimization problem. Furthermore, if the local cost functions are strongly convex, COCA has a fast linear convergence rate. Numerical experiments demonstrate that, given a target solution accuracy, COCA is able to significantly reduce the overall communication cost compared to existing algorithms including ADMM, and hence is well suited to applications where network communication is a bottleneck.
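The censoring rule described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual specification: the function name `should_transmit` and the geometrically decaying threshold `tau_k = alpha * rho**k` (with parameters `alpha` and `rho`) are assumptions chosen to mimic a "properly designed censoring function" that shrinks over iterations, so that transmissions are suppressed only when the local variable has barely changed.

```python
import numpy as np

def should_transmit(x_new, x_last_sent, k, alpha=1.0, rho=0.9):
    """Decide whether a node transmits its local variable at iteration k.

    Hypothetical censoring function: transmit only if the new local
    variable differs from the last transmitted one by at least a
    geometrically decaying threshold tau_k = alpha * rho**k.
    """
    tau_k = alpha * rho ** k
    return np.linalg.norm(x_new - x_last_sent) >= tau_k
```

Under this rule a node stays silent early on unless its variable moves substantially, while the shrinking threshold ensures that even small updates are eventually communicated, which is the intuition behind why a suitable censoring function preserves convergence while cutting communication.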