Principal Component Analysis in the Stochastic Differential Privacy Model

2021 
In this paper, we study the differentially private Principal Component Analysis (PCA) problem in stochastic optimization settings. We first propose a new stochastic gradient perturbation PCA mechanism (DP-SPCA) for computing the right singular subspace under (ε, δ)-differential privacy. To achieve a better utility guarantee and performance, we then present a new differentially private stochastic variance-reduced mechanism (DP-VRPCA) with gradient perturbation for PCA, which attains a near-optimal utility bound. To the best of our knowledge, this is the first work on stochastic gradient perturbation for (ε, δ)-differentially private PCA. We also compare the proposed algorithms with existing state-of-the-art methods in the literature, and experiments on real-world datasets and on classification tasks confirm the improved theoretical guarantees of our algorithms.
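To make the gradient-perturbation idea concrete, below is a minimal Python sketch of noisy stochastic gradient PCA: an Oja-style update on the top-k subspace with Gaussian noise added to each stochastic gradient. The function name `dp_spca_sketch`, the noise calibration, the clipping assumption, and the step-size schedule are all illustrative assumptions, not the paper's exact DP-SPCA or DP-VRPCA procedure.

```python
import numpy as np

def dp_spca_sketch(X, k, epsilon, delta, T=200, eta=0.1, seed=0):
    """Hypothetical sketch of noisy stochastic gradient PCA (not the paper's exact method).

    X              : (n, d) data matrix; rows assumed pre-clipped to L2 norm <= 1
    k              : target subspace dimension
    epsilon, delta : privacy parameters for (epsilon, delta)-DP
    T              : number of stochastic gradient steps
    eta            : step size
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Gaussian-mechanism noise scale (illustrative calibration; the paper's may differ).
    sigma = np.sqrt(2.0 * T * np.log(1.25 / delta)) / (n * epsilon)

    # Random orthonormal initialization of the d x k subspace estimate.
    U, _ = np.linalg.qr(rng.standard_normal((d, k)))

    for _ in range(T):
        i = rng.integers(n)                       # sample one data point uniformly
        x = X[i][:, None]                         # column vector, shape (d, 1)
        grad = x @ (x.T @ U)                      # stochastic gradient of tr(U^T C U)
        noise = sigma * rng.standard_normal((d, k))
        U = U + eta * (grad + noise)              # perturbed gradient ascent step
        U, _ = np.linalg.qr(U)                    # re-orthonormalize the columns

    return U  # estimated top-k right singular subspace
```

A usage sketch: normalize rows of a data matrix so each has norm at most 1, then call `dp_spca_sketch(X, k=3, epsilon=1.0, delta=1e-5)`. A variance-reduced variant in the spirit of DP-VRPCA would periodically compute a full-gradient anchor and correct the per-sample gradients with it before adding noise; that control-variate structure is what drives the improved utility bound claimed above.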