Paper Title

A Finite-Particle Convergence Rate for Stein Variational Gradient Descent

Paper Authors

Jiaxin Shi, Lester Mackey

Paper Abstract

We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD), a popular algorithm for approximating a probability distribution with a collection of particles. Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with n particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order 1/sqrt(log log n) rate. We suspect that the dependence on n can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
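To make concrete what "SVGD with n particles and an appropriate step size sequence" refers to, here is a minimal sketch of the standard SVGD update in Python. The RBF kernel, its bandwidth, the fixed step size, and the standard-Gaussian target below are illustrative assumptions for the sketch, not choices taken from the paper or its proof.

```python
# Minimal SVGD sketch: n particles are transported toward a target p(x)
# whose score grad log p is Lipschitz (here a standard Gaussian, which is
# sub-Gaussian with a 1-Lipschitz score). Kernel, bandwidth, and step size
# are illustrative assumptions.
import numpy as np

def rbf_kernel_and_grad(X, h):
    """RBF kernel matrix k(x_j, x_i) and its gradient with respect to x_j."""
    diffs = X[:, None, :] - X[None, :, :]        # (n, n, d): x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)       # (n, n)
    K = np.exp(-sq_dists / (2 * h ** 2))         # k(x_j, x_i)
    grad_K = -diffs / h ** 2 * K[:, :, None]     # grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_update(X, score, h, eps):
    """One SVGD step:
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel_and_grad(X, h)
    phi = (K.T @ score(X) + grad_K.sum(axis=0)) / n
    return X + eps * phi

# Illustrative target: standard Gaussian, with score grad log p(x) = -x.
score = lambda X: -X

rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, size=(200, 2))           # n = 200 particles in 2D, started off-target
for t in range(500):
    X = svgd_update(X, score, h=1.0, eps=0.1)    # fixed step size, for illustration only
print(X.mean(axis=0), X.var(axis=0))             # should approach [0, 0] and [1, 1]
```

The paper's result concerns this kind of n-particle iteration: under the sub-Gaussian target and Lipschitz-score assumptions, a suitable step size sequence drives the kernel Stein discrepancy between the particle approximation and the target to zero at rate 1/sqrt(log log n).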
