Differential privacy performance evaluation under the condition of non-uniform noise distribution

2022 
Differential privacy (DP) provides strong, mathematically defined privacy protection for aggregated data: by injecting an appropriate level of noise into query results, it makes the results of queries on adjacent datasets indistinguishable. Existing DP schemes usually add noise drawn from a fixed distribution uniformly across the target dataset. Although this protects privacy effectively, it can over-protect local portions of the dataset and thereby greatly reduce the usability of query results. To address this issue, this paper proposes a novel DP scheme with a non-uniform noise distribution: the target dataset is partitioned, and noise is added to specific data blocks. We analyze the privacy and usability of the entire processed dataset, focusing on whether a dataset handled by the non-uniform scheme still offers sufficient privacy protection while improving usability. Evaluation on real datasets yields encouraging results: both theoretically and empirically, the scheme strikes a good balance between privacy and usability and provides better privacy protection.
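The abstract does not spell out the mechanism, but the block-wise, non-uniform idea can be sketched with the standard Laplace mechanism, assuming each data block gets its own privacy budget. The function names, the per-block epsilon list, and the count query are illustrative assumptions, not the paper's actual algorithm:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def nonuniform_dp_counts(blocks, epsilons, sensitivity=1.0):
    """Hypothetical sketch: noisy per-block counts with per-block budgets.

    blocks   -- list of data blocks (each a list of records)
    epsilons -- one epsilon per block; a larger epsilon means less noise,
                so "sensitive" blocks can get small epsilons while the
                rest of the dataset keeps high usability
    """
    assert len(blocks) == len(epsilons)
    noisy = []
    for block, eps in zip(blocks, epsilons):
        scale = sensitivity / eps  # Laplace scale b = sensitivity / epsilon
        noisy.append(len(block) + laplace_noise(scale))
    return noisy
```

Under a uniform scheme every block would share the same epsilon; the sketch above simply lets the budget vary per block, which is one plausible reading of "adds noise to specific data blocks."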