Decentralized Parallel SGD with Privacy Preservation in Vehicular Networks

2021 
With the prosperity of vehicular networks and intelligent transport systems, vast amounts of data can be easily collected by vehicular devices from their users and widely spread across vehicular networks for the purpose of solving large-scale machine learning problems. Hence, how to preserve the data privacy of users during the learning process has become a public concern. To address this concern, under the celebrated framework of differential privacy (DP), we present in this paper a decentralized parallel stochastic gradient descent (D-PSGD) algorithm, called DP$^{\bf 2}$-SGD, which offers privacy protection for users in vehicular networks. Through thorough analysis we show that DP$^{\bf 2}$-SGD satisfies $(\varepsilon,\delta)$-DP while its learning efficiency matches that of D-PSGD without privacy preservation. We also propose a refined algorithm, called EC-SGD, by introducing an error-compensation strategy. Extensive experiments show that EC-SGD further improves convergence efficiency over DP$^{\bf 2}$-SGD in practice.
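The abstract does not spell out the mechanics, but the general pattern it describes, local gradient steps with Gaussian noise for DP, gossip averaging over a mixing matrix, and an error-feedback residual, can be illustrated with a minimal sketch. The mixing matrix W, noise scale sigma, clipping bound, and the toy quadratic losses below are all illustrative assumptions, not the paper's exact construction.

```python
# Sketch: decentralized parallel SGD with clipped, Gaussian-noised gradients
# (DP-style perturbation) and an error-feedback residual around clipping.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 4, 3

# Doubly stochastic mixing matrix over a ring topology (illustrative choice).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

# Toy stand-in for each vehicle's local data: node i minimizes 0.5*||x - a_i||^2.
local_targets = rng.normal(size=(n_nodes, dim))
x = np.zeros((n_nodes, dim))          # local model copies, one row per node
residual = np.zeros_like(x)           # error-compensation memory

lr, sigma, clip = 0.1, 0.05, 1.0      # step size, DP noise scale, clipping bound

for step in range(200):
    grads = x - local_targets                          # local gradients
    corrected = grads + residual                       # error feedback: re-inject residual
    norms = np.linalg.norm(corrected, axis=1, keepdims=True)
    clipped = corrected * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
    residual = corrected - clipped                     # remember what clipping removed
    noisy = clipped + sigma * rng.normal(size=clipped.shape)  # Gaussian DP noise
    x = W @ x - lr * noisy                             # gossip averaging + local update

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to optimum:", np.linalg.norm(x.mean(axis=0) - local_targets.mean(axis=0)))
```

In this sketch the error-compensation variant corresponds to keeping the residual term; setting `residual` to zero throughout recovers the plain noised decentralized SGD update.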