Paper Title


An acceleration of decentralized SGD under general assumptions with low stochastic noise

Paper Authors

Ekaterina Trimbach, Alexander Rogozin

Paper Abstract


Distributed optimization methods are actively researched by the optimization community. Due to applications in distributed machine learning, modern research directions include stochastic objectives, reduced communication frequency, and time-varying communication network topologies. Recently, an analysis unifying several centralized and decentralized approaches to stochastic distributed optimization was developed in Koloskova et al. (2020). In this work, we employ the Catalyst framework and accelerate the rates of Koloskova et al. (2020) in the case of low stochastic noise.
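For context, the generic Catalyst scheme (Lin et al., 2015) accelerates a base method by wrapping it in an outer loop that repeatedly solves a regularized auxiliary problem and applies an extrapolation step. The sketch below is a minimal statement of that generic outer loop, assuming a mu-strongly convex objective f and using the decentralized SGD method of Koloskova et al. (2020) as the approximate inner solver; the symbols kappa, alpha_k, beta_k, and y_k follow the standard Catalyst formulation and are not taken from this abstract.

% Generic Catalyst outer loop (Lin et al., 2015); the inner subproblem would be
% solved approximately by decentralized SGD. Here q = \mu / (\mu + \kappa).
\begin{align*}
  x_k &\approx \operatorname*{arg\,min}_{x} \Big\{ f(x) + \tfrac{\kappa}{2}\,\lVert x - y_{k-1} \rVert^2 \Big\}, \\
  \alpha_k^2 &= (1 - \alpha_k)\,\alpha_{k-1}^2 + q\,\alpha_k, \qquad
  \beta_k = \frac{\alpha_{k-1}(1 - \alpha_{k-1})}{\alpha_{k-1}^2 + \alpha_k}, \\
  y_k &= x_k + \beta_k\,(x_k - x_{k-1}).
\end{align*}

The regularization parameter kappa trades off how well-conditioned the auxiliary problem is against how much acceleration the outer loop provides; in the low-stochastic-noise regime the inner solver can reach the required subproblem accuracy cheaply, which is what makes this style of acceleration attractive here.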
