Paper Title
Zeroth-order methods for noisy Hölder-gradient functions
Paper Authors
Paper Abstract
In this paper, we prove new complexity bounds for zeroth-order methods in non-convex optimization with inexact observations of the objective function values. We use the Gaussian smoothing approach of Nesterov and Spokoiny [2015] and extend their results, obtained for zeroth-order optimization methods for smooth non-convex problems, to the minimization of functions with Hölder-continuous gradient under a noisy zeroth-order oracle, obtaining upper bounds on the admissible noise as well. We consider a finite-difference gradient approximation based on normally distributed random Gaussian vectors and prove that the gradient descent scheme based on this approximation converges to a stationary point of the smoothed function. We also consider convergence to a stationary point of the original (non-smoothed) function and obtain bounds on the number of steps of the algorithm required to make the norm of its gradient small. Additionally, we provide bounds on the level of noise in the zeroth-order oracle for which it is still possible to guarantee that the above bounds hold. Finally, we consider separately the case $\nu = 1$ and show that in this case the dependence of the obtained bounds on the dimension can be improved.
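To make the abstract's description concrete, the following is a minimal Python/NumPy sketch (not taken from the paper) of the two ingredients mentioned above: a one-sample finite-difference gradient estimator built from a random Gaussian direction, and the gradient descent scheme driven by it. The step size, smoothing parameter `mu`, iteration count, and the noisy test objective `noisy_f` are illustrative placeholders, not the paper's choices, and the single-sample estimator is only one possible instance of the approach.

```python
import numpy as np

def gaussian_fd_gradient(f, x, mu, rng):
    """One-sample finite-difference estimator of the gradient of the
    Gaussian smoothing f_mu(x) = E_u[f(x + mu * u)], u ~ N(0, I):
        g = (f(x + mu * u) - f(x)) / mu * u.
    f may return noisy (inexact) function values."""
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def zeroth_order_gd(f, x0, step, mu, n_iters, seed=0):
    """Gradient descent scheme based on the zeroth-order estimator above."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        g = gaussian_fd_gradient(f, x, mu, rng)
        x = x - step * g
    return x

if __name__ == "__main__":
    # Toy usage: a smooth non-convex objective observed with additive noise
    # (hypothetical example, not from the paper).
    rng_noise = np.random.default_rng(1)
    noisy_f = lambda x: np.sum(x ** 2) + 0.1 * np.cos(np.sum(x)) + 1e-4 * rng_noise.standard_normal()
    x_final = zeroth_order_gd(noisy_f, x0=np.ones(10), step=0.05, mu=1e-3, n_iters=2000)
    print(np.linalg.norm(x_final))
```

The sketch only illustrates the mechanics of the estimator and the update; the paper's contribution is the complexity and noise-level analysis of such a scheme under Hölder continuity of the gradient, which the code does not reproduce.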