Paper Title

SGB: Stochastic Gradient Bound Method for Optimizing Partition Functions

Paper Authors

Jing Wang, Anna Choromanska

Paper Abstract

This paper addresses the problem of optimizing partition functions in a stochastic learning setting. We propose a stochastic variant of the bound majorization algorithm that relies on upper-bounding the partition function with a quadratic surrogate. The update of the proposed method, which we refer to as Stochastic Partition Function Bound (SPFB), resembles scaled stochastic gradient descent, where the scaling factor relies on a second-order term that nevertheless differs from the Hessian. As in quasi-Newton schemes, this term is constructed from stochastic approximations of the function value and its gradient. We prove a sub-linear convergence rate for the proposed method and show the construction of its low-rank variant (LSPFB). Experiments on logistic regression demonstrate that the proposed schemes significantly outperform SGD. We also discuss how the quadratic partition function bound can be used for efficient training of deep learning models and in non-convex optimization.
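The abstract describes the two ingredients concretely enough to sketch: a quadratic upper bound on the log-partition function (the bound majorization construction of Jebara and Choromanska, 2012, on which this work builds) and an SPFB-style update that scales the gradient by the bound's second-order term instead of the Hessian. The sketch below is a minimal, single-example, deterministic illustration under those assumptions; the names `partition_bound` and `spfb_step` and all parameter choices are illustrative, not from the paper.

```python
import numpy as np

def partition_bound(theta, F, h=None):
    """Quadratic upper bound on the log-partition function around theta.

    For Z(t) = sum_y h(y) * exp(t^T f(y)), with the feature vectors f(y)
    stacked as rows of F, returns (z, mu, Sigma) such that for all t:
        log Z(t) <= log z + (t - theta)^T mu
                    + 0.5 * (t - theta)^T Sigma (t - theta).
    Sketch of the bound-computation recursion from Jebara & Choromanska
    (2012); mu equals the model expectation of f, i.e. grad log Z(theta).
    """
    n, d = F.shape
    h = np.ones(n) if h is None else h
    z = 1e-12                     # running partition value, kept positive
    mu = np.zeros(d)              # running feature mean (gradient term)
    Sigma = z * np.eye(d)         # running curvature term (not the Hessian)
    for y in range(n):
        alpha = h[y] * np.exp(F[y] @ theta)
        l = F[y] - mu
        xi = np.log(alpha / z)
        # logistic-bound curvature weight tanh(xi/2) / (2 xi), -> 1/4 at 0
        w = 0.25 if abs(xi) < 1e-8 else np.tanh(0.5 * xi) / (2.0 * xi)
        Sigma += w * np.outer(l, l)
        mu += (alpha / (z + alpha)) * l
        z += alpha
    return z, mu, Sigma

def spfb_step(theta, F, y_true, eta=1.0, ridge=1e-8):
    """One SPFB-like step on the loss -log p(y_true), where
    p(y) = exp(theta^T f(y)) / Z(theta): scaled gradient descent with
    the bound curvature Sigma standing in for the Hessian."""
    _, mu, Sigma = partition_bound(theta, F)
    grad = mu - F[y_true]         # gradient of the negative log-likelihood
    d = theta.shape[0]
    # small ridge keeps the solve well-posed when Sigma is near-singular
    return theta - eta * np.linalg.solve(Sigma + ridge * np.eye(d), grad)

# Toy usage: fit a 5-state, 3-feature log-linear model to prefer state 2.
rng = np.random.default_rng(0)
F = rng.normal(size=(5, 3))
theta = np.zeros(3)
for _ in range(50):
    theta = spfb_step(theta, F, y_true=2)
```

In the stochastic setting the abstract describes, `grad` and `Sigma` would instead be mini-batch estimates, and the low-rank variant (LSPFB) would maintain a factored approximation of `Sigma` so the solve stays cheap in high dimensions.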
