Paper Title
Communication Efficient Distributed Learning with Censored, Quantized, and Generalized Group ADMM
Paper Authors
Paper Abstract
In this paper, we propose a communication-efficient decentralized machine learning framework that solves a consensus optimization problem defined over a network of interconnected workers. The proposed algorithm, Censored and Quantized Generalized GADMM (CQ-GGADMM), leverages the worker grouping and decentralized learning ideas of the Group Alternating Direction Method of Multipliers (GADMM), and pushes the frontier in communication efficiency by extending its applicability to generalized network topologies while incorporating link censoring to skip negligible updates after quantization. We theoretically prove that CQ-GGADMM achieves a linear convergence rate when the local objective functions are strongly convex, under some mild assumptions. Numerical simulations corroborate that CQ-GGADMM exhibits higher communication efficiency in terms of the number of communication rounds and transmit energy consumption, without compromising accuracy or convergence speed, compared to censored decentralized ADMM and the worker grouping method of GADMM.
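The core mechanism the abstract describes, quantizing each worker's model update and then censoring (skipping) transmissions whose quantized change is negligible, can be illustrated with a minimal sketch. The helper names (stochastic_quantize, censored_update), the 4-bit width, the quantization radius, and the geometrically decaying threshold tau0 * alpha**k below are illustrative assumptions for exposition, not the paper's exact scheme.

```python
import numpy as np

def stochastic_quantize(x, x_ref, num_bits=4, radius=1.0):
    """Unbiased stochastic quantizer of x on a grid centered at a
    reference point x_ref (hypothetical helper; bit width and radius
    are illustrative choices)."""
    levels = 2 ** num_bits - 1
    step = 2.0 * radius / levels
    # Map the deviation from the reference onto the quantization grid.
    scaled = np.clip((x - x_ref + radius) / step, 0, levels)
    low = np.floor(scaled)
    # Randomized rounding keeps the quantizer unbiased in expectation.
    q = low + (np.random.rand(*x.shape) < (scaled - low))
    return x_ref - radius + q * step

def censored_update(theta_new, theta_last_sent, k, tau0=1.0, alpha=0.8):
    """Transmit only if the quantized change exceeds a decaying
    threshold tau0 * alpha**k (the censoring schedule here is an
    assumption for illustration). Returns (value neighbors use,
    whether a transmission occurred)."""
    q_theta = stochastic_quantize(theta_new, theta_last_sent)
    if np.linalg.norm(q_theta - theta_last_sent) >= tau0 * alpha ** k:
        return q_theta, True       # send the quantized model
    return theta_last_sent, False  # censored: neighbors reuse last value
```

In this sketch, a censored round costs no transmit energy because neighbors simply reuse the last received value, which is how schemes of this kind reduce both communication rounds and energy consumption without altering the underlying ADMM updates.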