Title
Can Two Walk Together: Privacy Enhancing Methods and Preventing Tracking of Users
Authors
Abstract
We present a new concern that arises when collecting data from individuals, stemming from the attempt to mitigate privacy leakage in multiple reporting: tracking of users participating in the data collection via the very mechanisms added to provide privacy. We present several definitions of untrackable mechanisms, inspired by the differential privacy framework. Specifically, we define the trackable parameter as the log of the maximum ratio between the probability that a set of reports originated from a single user and the probability that the same set of reports originated from two users (with the same private value). We explore the implications of this new definition. We show how differentially private and untrackable mechanisms can be combined to obtain a bound for the problem of detecting when a certain user changed their private value. Examining Google's deployed solution for everlasting privacy, we show that RAPPOR (Erlingsson et al., ACM CCS 2014) is trackable in our framework for the parameters presented in their paper. We analyze a variant of randomized response for collecting statistics on single bits, Bitwise Everlasting Privacy, that achieves good accuracy and everlasting privacy while being only reasonably untrackable; specifically, its trackable parameter grows linearly in the number of reports. For collecting statistics about data from larger domains (for histograms and heavy hitters), we present a mechanism that prevents tracking for a limited number of responses. We also present the concept of Mechanism Chaining (using the output of one mechanism as the input of another) in the scope of differential privacy, and show that the chaining of an $\varepsilon_1$-LDP mechanism with an $\varepsilon_2$-LDP mechanism is $\ln\frac{e^{\varepsilon_1+\varepsilon_2}+1}{e^{\varepsilon_1}+e^{\varepsilon_2}}$-LDP, and that this bound is tight.
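The chaining bound stated above can be checked numerically in the simplest case: chaining two binary randomized-response mechanisms, where the overall flip probability determines the effective LDP parameter. This is a minimal sketch, not the paper's analysis; the function names are illustrative, and the check only confirms that the closed-form expression matches the direct computation for binary randomized response.

```python
import math

def rr_flip_prob(eps):
    # Binary randomized response that is eps-LDP: report the true bit
    # with probability e^eps / (1 + e^eps), flip it otherwise.
    return 1.0 / (1.0 + math.exp(eps))

def chained_ldp(eps1, eps2):
    # Chain two binary randomized responses: the output of the first
    # is the input of the second. The bit is flipped overall iff
    # exactly one of the two mechanisms flips it.
    p1, p2 = rr_flip_prob(eps1), rr_flip_prob(eps2)
    p = p1 * (1.0 - p2) + (1.0 - p1) * p2
    # Max output-probability ratio over the two outputs gives the
    # effective LDP parameter of the chained mechanism.
    return math.log((1.0 - p) / p)

def chaining_bound(eps1, eps2):
    # Closed-form bound from the abstract:
    # ln( (e^{eps1+eps2} + 1) / (e^{eps1} + e^{eps2}) ).
    e1, e2 = math.exp(eps1), math.exp(eps2)
    return math.log((e1 * e2 + 1.0) / (e1 + e2))

for eps1, eps2 in [(0.5, 0.5), (1.0, 2.0), (3.0, 0.1)]:
    assert abs(chained_ldp(eps1, eps2) - chaining_bound(eps1, eps2)) < 1e-9
```

For binary randomized response the direct computation and the closed form agree exactly, which is consistent with the paper's claim that the bound is tight; note also that the chained parameter is smaller than either $\varepsilon_1$ or $\varepsilon_2$, i.e., chaining never weakens privacy.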