Paper Title


On The Convergence of First Order Methods for Quasar-Convex Optimization

Paper Authors

Jin, Jikai

Abstract


In recent years, the success of deep learning has inspired many researchers to study the optimization of general smooth non-convex functions. However, recent works have established pessimistic worst-case complexities for this class of functions, which stands in stark contrast with their superior performance in real-world applications (e.g., training deep neural networks). On the other hand, many popular non-convex optimization problems have been found to enjoy certain structural properties that bear some similarity to convexity. In this paper, we study the class of \textit{quasar-convex functions} to close the gap between theory and practice. We study the convergence of first-order methods in a variety of settings and under different optimality criteria. We prove complexity upper bounds that are similar to the standard results established for convex functions and much better than the state-of-the-art convergence rates for non-convex functions. Overall, this paper suggests that \textit{quasar-convexity} allows for efficient optimization procedures, and we look forward to seeing more problems that demonstrate similar properties in practice.
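The abstract concerns first-order methods on structured non-convex objectives. As a minimal illustration (not the paper's algorithm or analysis), the sketch below runs plain gradient descent, the canonical first-order method, on f(x) = x^2 + 3 sin^2(x). This objective is a standard example of structured non-convexity from the literature: it is non-convex but is known to satisfy the Polyak-Lojasiewicz inequality, and its only stationary point is the global minimizer x* = 0, so gradient descent reaches the global minimum despite the non-convexity. The step size and starting point are arbitrary choices for the demo.

```python
import math


def f(x):
    # A smooth, non-convex objective with a unique stationary point at x = 0.
    return x * x + 3.0 * math.sin(x) ** 2


def grad_f(x):
    # f'(x) = 2x + 3 sin(2x), using 2 sin(x) cos(x) = sin(2x).
    return 2.0 * x + 3.0 * math.sin(2.0 * x)


def gradient_descent(x0, step=0.1, iters=200):
    # Plain gradient descent; f is L-smooth with L = 8 (since |f''| <= 8),
    # so any step below 2/L = 0.25 is safe.
    x = x0
    for _ in range(iters):
        x = x - step * grad_f(x)
    return x


x_final = gradient_descent(x0=2.5)
print(x_final)  # approaches the global minimizer x* = 0
```

The point of the demo is the one highlighted in the abstract: for functions with convexity-like structure, a first-order method attains the global optimum at a rate far better than generic non-convex worst cases would suggest.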
