AdaPDP: Adaptive Personalized Differential Privacy

2021 
Users usually have different privacy demands when they contribute individual data to a dataset that is maintained and queried by others. To address this problem, several personalized differential privacy (PDP) mechanisms have been proposed to release statistical information about the entire dataset without revealing individual privacy. However, existing mechanisms produce query results with low accuracy, which leads to poor data utility, primarily because (1) some users are over-protected and (2) utility is not explicitly included in the design objective. Poor data utility impedes the adoption of PDP in real-world applications. In this paper, we present an adaptive personalized differential privacy framework, called AdaPDP. Specifically, to maximize data utility in different cases, AdaPDP adaptively selects the underlying noise generation algorithm and calculates the corresponding parameters based on the type of query function, the data distribution, and the privacy settings. In addition, AdaPDP performs multiple rounds of utility-aware sampling to satisfy the different privacy requirements of users. Our privacy analysis shows that the proposed framework provides a rigorous privacy guarantee. Extensive experiments on synthetic and real-world datasets demonstrate that the proposed framework incurs much smaller utility losses than existing mechanisms over various query functions.
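The abstract does not give implementation details, so the following is only a minimal sketch of the classic sampling mechanism for personalized DP (Jorgensen et al., 2015), which the multi-round, utility-aware sampling in AdaPDP builds upon. The function name `pdp_count`, the counting query, and the fixed Laplace base mechanism are illustrative assumptions rather than the paper's actual algorithm.

```python
import numpy as np

def pdp_count(values, epsilons, threshold):
    """Personalized-DP count via the sampling mechanism
    (Jorgensen et al., 2015). Illustrative sketch only; AdaPDP's
    adaptive algorithm selection and multi-round utility-aware
    sampling are not specified in this abstract.

    values    : array of 0/1 records, one per user
    epsilons  : per-user privacy budgets eps_i
    threshold : global budget eps_t used by the base mechanism
    """
    values = np.asarray(values, dtype=float)
    epsilons = np.asarray(epsilons, dtype=float)

    # A record with eps_i < eps_t is included only with probability
    # (e^{eps_i} - 1) / (e^{eps_t} - 1); records with eps_i >= eps_t
    # are always kept. This downsampling turns a uniform eps_t-DP
    # mechanism into one that honors per-user budgets.
    p = np.clip((np.exp(epsilons) - 1.0) / (np.exp(threshold) - 1.0),
                0.0, 1.0)
    keep = np.random.random(len(values)) < p

    # Base mechanism: eps_t-DP Laplace noise on the sampled count
    # (a counting query has sensitivity 1).
    true_count = values[keep].sum()
    return true_count + np.random.laplace(scale=1.0 / threshold)

# Example: three privacy groups, from conservative to liberal budgets.
user_eps = np.concatenate([np.full(300, 0.1),
                           np.full(500, 0.5),
                           np.full(200, 1.0)])
data = np.random.binomial(1, 0.3, size=1000)
print(pdp_count(data, user_eps, threshold=1.0))
```

Under this scheme, choosing the threshold eps_t trades sampling error against noise error, which is one reason a utility-aware, adaptive choice of mechanism and parameters, as AdaPDP proposes, can matter for accuracy.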