Paper Title

Dimension Independent Generalization of DP-SGD for Overparameterized Smooth Convex Optimization

Authors

Yi-An Ma, Teodor Vanislavov Marinov, Tong Zhang

Abstract

This paper considers the generalization performance of differentially private convex learning. We demonstrate that the convergence analysis of Langevin algorithms can be used to obtain new generalization bounds with differential privacy guarantees for DP-SGD. More specifically, by using recently obtained dimension-independent convergence results for stochastic Langevin algorithms with convex objective functions, we obtain $O(n^{-1/4})$ privacy guarantees for DP-SGD with the optimal excess generalization error of $\tilde{O}(n^{-1/2})$ for certain classes of overparameterized smooth convex optimization problems. This improves on previous DP-SGD results for such problems, whose bounds contain explicit dimension dependencies and therefore become unsuitable for the overparameterized models used in practical applications.
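
For readers unfamiliar with the mechanism being analyzed, below is a minimal sketch of one DP-SGD update (per-example gradient clipping followed by Gaussian noise injection). The function name, hyperparameter values, and noise calibration are illustrative assumptions for exposition, not code from the paper.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.0, rng=None):
    """One DP-SGD update: clip each per-example gradient to L2 norm
    `clip_norm`, average the clipped gradients, then perturb with
    Gaussian noise. All hyperparameter values are placeholders."""
    rng = np.random.default_rng() if rng is None else rng
    # Per-example clipping bounds each example's influence on the
    # update (the sensitivity used in the privacy analysis).
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    avg_grad = np.mean(clipped, axis=0)
    # Gaussian noise with standard deviation proportional to clip_norm;
    # the 1/batch factor matches the averaging of clipped gradients.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(clipped),
                       size=params.shape)
    return params - lr * (avg_grad + noise)
```

Because the injected noise is Gaussian, each DP-SGD iterate has the form of a noisy gradient step, which is what lets the paper treat DP-SGD as a stochastic Langevin algorithm and import dimension-independent convergence results.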
