Paper Title

Online Multiserver Convex Chasing and Optimization

Paper Authors

Sébastien Bubeck, Yuval Rabani, Mark Sellke

Paper Abstract

We introduce the problem of $k$-chasing of convex functions, a simultaneous generalization of both the famous $k$-server problem in $\mathbb{R}^d$ and of the problem of chasing convex bodies and functions. Aside from fundamental interest in this general form, it has natural applications to online $k$-clustering problems with objectives such as $k$-median or $k$-means. We show that this problem exhibits a rich landscape of behavior. In general, if both $k > 1$ and $d > 1$, there does not exist any online algorithm with bounded competitiveness. By contrast, we exhibit a class of nicely behaved functions (which includes, in particular, the above-mentioned clustering problems) for which competitive online algorithms exist, moreover with a dimension-free competitive ratio. We also introduce a parallel question of top-$k$ action regret minimization in the realm of online convex optimization. There, too, a much rougher landscape emerges for $k > 1$. While it is possible to achieve vanishing regret, unlike in the top-one action case the rate of vanishing does not speed up for strongly convex functions. Moreover, vanishing regret necessitates both intractable computations and randomness. Finally, we leave open whether almost dimension-free regret is achievable for $k > 1$ and general convex losses. As evidence that it might be possible, we prove dimension-free regret for linear losses via an information-theoretic argument.
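To make the two settings concrete, here is a minimal formalization sketch; the notation ($X_t$, $x_t^i$, $d(\cdot,\cdot)$, $\mathrm{Regret}_T$) and the min-over-servers service-cost convention are illustrative assumptions consistent with the abstract, not quoted from the paper. In $k$-chasing, the algorithm maintains a configuration $X_t = (x_t^1, \dots, x_t^k) \in (\mathbb{R}^d)^k$ of $k$ servers and, after each convex request $f_t$, pays a service cost plus a movement cost:
\[
\mathrm{cost}(\mathrm{ALG}) \;=\; \sum_{t=1}^{T} \Big( \min_{1 \le i \le k} f_t\big(x_t^i\big) \;+\; d\big(X_{t-1}, X_t\big) \Big),
\]
where $d(\cdot,\cdot)$ is a movement metric between configurations, e.g. a minimum-cost matching of the $k$ servers. The algorithm is $c$-competitive if $\mathrm{cost}(\mathrm{ALG}) \le c \cdot \mathrm{cost}(\mathrm{OPT}) + O(1)$ on every request sequence. In the top-$k$ regret setting, the movement term disappears and performance is instead measured against the best fixed configuration in hindsight:
\[
\mathrm{Regret}_T \;=\; \sum_{t=1}^{T} \min_{1 \le i \le k} f_t\big(x_t^i\big) \;-\; \min_{X = (x^1,\dots,x^k)} \sum_{t=1}^{T} \min_{1 \le i \le k} f_t\big(x^i\big).
\]
For instance, online $k$-median clustering corresponds to requests $f_t(x) = \lVert x - p_t \rVert$ for arriving data points $p_t$, so the service cost of a configuration is the distance from $p_t$ to its nearest center.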
