Fast distributed optimization using row-stochastic weights and uncoordinated step-sizes

2018 
In this paper, we discuss distributed optimization over directed graphs, where doubly-stochastic weight matrices cannot, in general, be constructed. Most existing algorithms overcome this issue by applying push-sum consensus, which relies on column-stochastic weight matrices. Column-stochastic weights require each agent to know (at least) its out-degree, which may be impractical in, e.g., broadcast-based communication protocols. In contrast, we design a fast distributed optimization algorithm with row-stochastic weights and uncoordinated step-sizes at the agents; its implementation is straightforward, as each agent locally decides the weights assigned to the incoming information and locally chooses a suitable step-size. We show that the proposed algorithm converges linearly to the optimal solution when the step-sizes are sufficiently small and the objective functions are strongly convex with Lipschitz-continuous gradients.
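
To illustrate the setting described in the abstract, the following is a minimal sketch (not the paper's algorithm) of a gradient-tracking-style iteration that uses only row-stochastic mixing weights and agent-specific, uncoordinated step-sizes. The directed ring graph, the quadratic local objectives f_i(x) = 0.5*(x - b_i)^2, and all parameter values are hypothetical, chosen only so the snippet runs end to end. Each agent also runs an eigenvector-estimation recursion and divides its tracked gradient by its own entry of that estimate, a common way to compensate for the bias introduced by row-stochastic (rather than doubly-stochastic) weights.

    import numpy as np

    np.random.seed(0)
    n = 5                                    # number of agents (hypothetical)

    # Row-stochastic weights on a directed ring with self-loops:
    # each agent assigns weights only to the information it actually receives.
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 0.5
        A[i, (i - 1) % n] = 0.5              # each agent hears from its predecessor

    # Illustrative local objectives f_i(x) = 0.5 * (x - b_i)^2; the global
    # minimizer of sum_i f_i is the mean of b.
    b = np.random.randn(n)
    def grad(i, x):
        return x - b[i]

    alpha = 0.05 + 0.02 * np.random.rand(n)  # uncoordinated (agent-specific) step-sizes

    x = np.zeros(n)                          # local estimates of the optimizer
    y = np.eye(n)                            # eigenvector-estimation states, y_i(0) = e_i
    z = np.array([grad(i, x[i]) for i in range(n)])  # gradient-tracking states

    for k in range(500):
        y_next = A @ y                       # eigenvector estimation via row-stochastic mixing
        x_next = A @ x - alpha * z           # mix with neighbors, step along tracked direction
        g_old = np.array([grad(i, x[i]) for i in range(n)])
        g_new = np.array([grad(i, x_next[i]) for i in range(n)])
        # Divide each tracked gradient by the agent's own eigenvector-estimate entry
        # to correct for the non-uniform stationary distribution of A.
        z = A @ z + g_new / np.diag(y_next) - g_old / np.diag(y)
        x, y = x_next, y_next

    print("local estimates:", x)
    print("true optimum   :", b.mean())

With sufficiently small step-sizes, the local estimates in this sketch settle near the common minimizer b.mean(); the exact update rules, assumptions, and rate guarantees are those stated in the paper, not in this illustrative code.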