Paper Title

Solving strongly convex-concave composite saddle point problems with a small dimension of one of the variables

Authors

Gladin, Egor, Kuruzov, Ilya, Stonyakin, Fedor, Pasechnyuk, Dmitry, Alkousa, Mohammad, Gasnikov, Alexander

Abstract

The article is devoted to the development of algorithmic methods ensuring efficient complexity bounds for strongly convex-concave saddle point problems in the case when one of the groups of variables is high-dimensional and the other is relatively low-dimensional (up to a hundred). The proposed technique is based on reducing problems of this type to a problem of minimizing a convex (maximizing a concave) functional in one of the variables, for which an approximate gradient at an arbitrary point can be computed with the required accuracy by solving an auxiliary optimization subproblem in the other variable. In this setting, the ellipsoid method is used for the low-dimensional problem (if necessary, with an inexact $δ$-subgradient), and accelerated gradient methods are used for the high-dimensional problem. For the case of a very small dimension of one of the groups of variables (up to 5), an approach is proposed based on a new version of the multidimensional analog of Yu. E. Nesterov's method on the square (multidimensional dichotomy), with the possibility of using inexact values of the gradient of the objective functional.
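The reduction described above can be illustrated with a minimal sketch, not taken from the paper: for a toy saddle point problem $F(x, y) = x^\top A y - \|x\|^2/2 + \|y - b\|^2/2$ (strongly concave in the high-dimensional $x$, strongly convex in the low-dimensional $y$), the inner maximum over $x$ is solved approximately by gradient ascent, yielding an inexact gradient of $g(y) = \max_x F(x, y)$ via Danskin's theorem, which is then fed to a standard ellipsoid method on $y$. The test problem, step sizes, and iteration counts are illustrative assumptions.

```python
import numpy as np

# Hypothetical test problem (not from the paper):
# F(x, y) = x^T A y - ||x||^2/2 + ||y - b||^2/2
rng = np.random.default_rng(0)
n_x, n_y = 50, 2                       # high-dim x, low-dim y
A = rng.standard_normal((n_x, n_y)) / np.sqrt(n_x)
b = np.array([1.0, -0.5])

def g(y):
    """g(y) = max_x F(x, y); here it has the closed form ||Ay||^2/2 + ||y-b||^2/2."""
    return 0.5 * (A @ y) @ (A @ y) + 0.5 * (y - b) @ (y - b)

def inner_max(y, steps=200, lr=0.5):
    """Approximately solve max_x F(x, y) by gradient ascent (grad_x F = A y - x)."""
    x = np.zeros(n_x)
    for _ in range(steps):
        x += lr * (A @ y - x)
    return x

def approx_grad_g(y):
    """Inexact gradient of g at y: grad_y F evaluated at the approximate x*(y)."""
    x = inner_max(y)
    return A.T @ x + (y - b)

def ellipsoid(y0, R, iters=200):
    """Ellipsoid method on the low-dimensional variable, using inexact gradients.

    The ellipsoid {y : (y-c)^T B^{-1} (y-c) <= 1} starts as a ball of radius R.
    """
    n = y0.size
    c, B = y0.copy(), (R ** 2) * np.eye(n)
    best = c.copy()
    for _ in range(iters):
        w = approx_grad_g(c)
        if np.linalg.norm(w) < 1e-12:
            break
        wn = w / np.sqrt(w @ B @ w)            # normalize: wn^T B wn = 1
        c = c - (B @ wn) / (n + 1)             # shift center into the half-ellipsoid
        B = (n * n / (n * n - 1.0)) * (B - (2.0 / (n + 1)) * np.outer(B @ wn, B @ wn))
        if g(c) < g(best):
            best = c.copy()
    return best
```

Since $y$ is two-dimensional, the ellipsoid method converges linearly regardless of how large $n_x$ is; the accuracy of the outer method is limited only by the inexactness of the inner gradient-ascent solve, mirroring the inexact $δ$-subgradient setting of the abstract.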
