Paper Title
Differentially Private Accelerated Optimization Algorithms
Paper Authors
Paper Abstract
We present two classes of differentially private optimization algorithms derived from well-known accelerated first-order methods. The first algorithm is inspired by Polyak's heavy ball method and employs a smoothing approach to decrease the accumulated noise on the gradient steps required for differential privacy. The second class of algorithms is based on Nesterov's accelerated gradient method and its recent multi-stage variant. We propose a noise dividing mechanism for the iterations of Nesterov's method in order to improve the error behavior of the algorithm. Convergence rate analyses are provided for both the heavy ball and Nesterov's accelerated gradient methods with the help of dynamical systems analysis techniques. Finally, we conclude with numerical experiments showing that the proposed algorithms have advantages over well-known differentially private algorithms.
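To make the setting concrete, the sketch below shows a generic noisy heavy-ball iteration in which each gradient evaluation is perturbed with Gaussian noise, as is standard for Gaussian-mechanism differential privacy. This is only an illustration of the baseline iteration the abstract builds on; the step size `eta`, momentum parameter `beta`, noise scale `sigma`, and the function `dp_heavy_ball` itself are hypothetical choices, not the paper's actual algorithm, calibration, or smoothing scheme.

```python
import numpy as np

def dp_heavy_ball(grad, x0, eta=0.1, beta=0.9, sigma=0.01, iters=200, seed=0):
    """Illustrative noisy heavy-ball iteration (NOT the paper's algorithm).

    Each gradient is perturbed with i.i.d. Gaussian noise of scale sigma,
    mimicking the Gaussian mechanism; eta, beta, sigma are placeholder values.
    """
    rng = np.random.default_rng(seed)
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Noisy gradient: true gradient plus Gaussian perturbation.
        g = grad(x) + sigma * rng.standard_normal(x.shape)
        # Heavy-ball update: gradient step plus momentum term.
        x, x_prev = x - eta * g + beta * (x - x_prev), x
    return x

# Toy usage: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
x_out = dp_heavy_ball(lambda x: x, np.ones(3))
```

With a small noise scale the iterates settle near the minimizer at the origin; larger `sigma` (stronger privacy) raises the error floor, which is the accumulated-noise effect the abstract's smoothing approach targets.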