A Privacy-preserving and Non-interactive Federated Learning Scheme for Regression Training with Gradient Descent

2020 
Abstract In recent years, the extensive application of machine learning technologies has been witnessed in various fields. However, in many applications, massive data are distributed across multiple data owners. Meanwhile, due to privacy concerns and communication constraints, it is difficult to bridge the data silos among data owners to train a global machine learning model. In this paper, we propose a privacy-preserving and non-interactive federated learning scheme for regression training with gradient descent, named VANE. With VANE, multiple data owners are able to train a global linear, ridge, or logistic regression model with the assistance of the cloud, while their private local training data are well protected. Specifically, we first design a secure data aggregation algorithm, with which local training data from multiple data owners can be aggregated and used to train a global model without disclosing any private information. Meanwhile, benefiting from our data pre-processing method, the whole training process is non-interactive, i.e., there is no interaction between data owners and the cloud. Detailed security analysis shows that VANE can well protect the local training data of data owners. The performance evaluation results demonstrate that the training performance of our VANE is around 10³ times faster than that of existing schemes.
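The abstract does not reveal VANE's actual construction, but the general idea it describes, aggregating owners' local training data securely so the cloud can run gradient descent on the aggregate alone, can be illustrated with a generic pairwise-masking sketch. Here each owner uploads masked sufficient statistics (Xᵀ X and Xᵀ y) once, and the cloud trains a ridge model by gradient descent without any further interaction. All data, parameter names, and the masking method are hypothetical illustrations, not VANE's protocol:

```python
import numpy as np

rng = np.random.default_rng(42)

def pairwise_mask(shares):
    """Add pairwise random masks that cancel in the sum, so the cloud
    only ever learns the aggregate of the owners' inputs."""
    masked = [s.astype(float).copy() for s in shares]
    n = len(masked)
    for i in range(n):
        for j in range(i + 1, n):
            r = rng.normal(size=masked[0].shape)
            masked[i] += r  # owner i adds the shared mask
            masked[j] -= r  # owner j subtracts it; masks cancel in the sum
    return masked

# Each owner holds a private slice of the training data (synthetic here).
owners = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]

# Owners locally compute sufficient statistics X^T X and X^T y ...
stats = [(X.T @ X, X.T @ y) for X, y in owners]
# ... and upload masked versions in one round (no further interaction).
A_parts = pairwise_mask([A for A, _ in stats])
b_parts = pairwise_mask([b for _, b in stats])

# Cloud aggregates; masks cancel, leaving only the global statistics.
A = sum(A_parts)      # = X_all^T X_all
b = sum(b_parts)      # = X_all^T y_all

# Gradient descent on the ridge objective (1/2)||Xw - y||^2 + (lam/2)||w||^2,
# whose gradient is (A + lam*I) w - b -- computable from aggregates alone.
lam, eta = 0.1, 0.002  # illustrative penalty and step size
w = np.zeros(3)
for _ in range(2000):
    w -= eta * ((A + lam * np.eye(3)) @ w - b)
```

Because the per-owner masks cancel only in the sum, the cloud learns the global statistics but no individual owner's contribution, and since the gradient depends only on those aggregates, no further rounds with the owners are needed.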