Paper Title

Individual Privacy Accounting via a Rényi Filter

Paper Authors

Vitaly Feldman, Tijana Zrnic

Paper Abstract

We consider a sequential setting in which a single dataset of individuals is used to perform adaptively-chosen analyses, while ensuring that the differential privacy loss of each participant does not exceed a pre-specified privacy budget. The standard approach to this problem relies on bounding a worst-case estimate of the privacy loss over all individuals and all possible values of their data, for every single analysis. Yet, in many scenarios this approach is overly conservative, especially for "typical" data points which incur little privacy loss by participation in most of the analyses. In this work, we give a method for tighter privacy loss accounting based on the value of a personalized privacy loss estimate for each individual in each analysis. To implement the accounting method we design a filter for Rényi differential privacy. A filter is a tool that ensures that the privacy parameter of a composed sequence of algorithms with adaptively-chosen privacy parameters does not exceed a pre-specified budget. Our filter is simpler and tighter than the known filter for $(ε,δ)$-differential privacy by Rogers et al. We apply our results to the analysis of noisy gradient descent and show that personalized accounting can be practical, easy to implement, and can only make the privacy-utility tradeoff tighter.
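To make the filtering idea concrete, below is a minimal Python sketch of a per-individual Rényi accountant for noisy gradient descent. The class and method names (IndividualRenyiFilter, select_participants, charge) and the per-step loss formula (the standard order-α Gaussian-mechanism RDP bound applied to each point's clipped gradient norm) are illustrative assumptions, not the paper's exact algorithm or notation; the sketch only shows how a pre-specified per-person budget can gate participation in adaptively chosen analyses.

```python
import numpy as np


class IndividualRenyiFilter:
    """Minimal sketch of an individual Rényi (order-alpha) privacy filter.

    Each individual has a budget `eps_budget` of Rényi-DP epsilon at a fixed
    order `alpha`. Before each analysis, the caller supplies a personalized
    per-step RDP loss estimate for every individual; individuals whose
    accumulated loss would exceed the budget stop participating. This is an
    illustrative simplification, not the paper's exact construction.
    """

    def __init__(self, n_individuals, eps_budget, alpha=2.0):
        self.alpha = alpha
        self.eps_budget = eps_budget
        self.spent = np.zeros(n_individuals)        # accumulated RDP loss per individual
        self.active = np.ones(n_individuals, bool)  # still within budget?

    def select_participants(self, per_step_losses):
        """Return indices of individuals allowed to join the next analysis."""
        would_exceed = self.spent + per_step_losses > self.eps_budget
        self.active &= ~would_exceed
        return np.flatnonzero(self.active)

    def charge(self, participants, per_step_losses):
        """Record the loss actually incurred by the participants."""
        self.spent[participants] += per_step_losses[participants]


# Example: noisy gradient descent where a point's per-step RDP loss scales
# with its clipped gradient norm, eps_i = alpha * g_i^2 / (2 * sigma^2),
# i.e. the order-alpha Gaussian-mechanism bound used per individual
# (the per-individual application is the illustrative part).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, sigma, clip = 100, 1.0, 1.0
    filt = IndividualRenyiFilter(n, eps_budget=1.0, alpha=2.0)
    for step in range(50):
        grad_norms = np.minimum(np.abs(rng.normal(0.3, 0.2, n)), clip)
        losses = filt.alpha * grad_norms**2 / (2 * sigma**2)
        participants = filt.select_participants(losses)
        if participants.size == 0:
            break  # everyone has exhausted their personal budget
        # ... compute the noisy gradient from `participants` only ...
        filt.charge(participants, losses)
```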
