Paper Title

Adaptive Federated Minimax Optimization with Lower Complexities

Paper Authors

Feihu Huang, Xinrui Wang, Junyi Li, Songcan Chen

Abstract

Federated learning is a popular distributed and privacy-preserving learning paradigm in machine learning. Recently, some federated learning algorithms have been proposed to solve distributed minimax problems. However, these federated minimax algorithms still suffer from high gradient or communication complexity. Meanwhile, few algorithms focus on using adaptive learning rates to accelerate them. To fill this gap, we study a class of nonconvex minimax optimization problems and propose an efficient adaptive federated minimax optimization algorithm (i.e., AdaFGDA) to solve these distributed minimax problems. Specifically, our AdaFGDA builds on momentum-based variance reduction and local-SGD techniques, and it can flexibly incorporate various adaptive learning rates through unified adaptive matrices. Theoretically, we provide a solid convergence analysis framework for our AdaFGDA algorithm under the non-i.i.d. setting. Moreover, we prove that our AdaFGDA algorithm achieves a lower gradient (i.e., stochastic first-order oracle, SFO) complexity of $\tilde{O}(\epsilon^{-3})$ with a lower communication complexity of $\tilde{O}(\epsilon^{-2})$ in finding an $\epsilon$-stationary point of the nonconvex minimax problem. Experimentally, we conduct experiments on deep AUC maximization and robust neural network training tasks to verify the efficiency of our algorithms.
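The abstract names two building blocks, momentum-based variance reduction and adaptive matrices applied in local gradient-descent-ascent steps, without giving pseudocode. The sketch below is a minimal, hypothetical illustration of those ideas on a single client (not the authors' exact AdaFGDA algorithm): a STORM-style variance-reduced gradient estimator for the primal and dual variables, scaled by a simple AdaGrad-like diagonal adaptive matrix. All function and variable names (`stoch_grad`, `sample_batch`, step sizes, momentum weight) are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch of one client's local steps in an adaptive,
# momentum-based variance-reduced gradient descent ascent (GDA) loop.
# stoch_grad(x, y, batch) is assumed to return stochastic gradients
# (g_x, g_y) of the local minimax objective; sample_batch() draws a mini-batch.

def local_adaptive_vr_gda(x, y, stoch_grad, sample_batch,
                          steps=10, eta=0.01, gamma=0.01,
                          beta=0.9, eps=1e-8):
    batch = sample_batch()
    v_x, v_y = stoch_grad(x, y, batch)   # initial gradient estimators
    a_x = np.zeros_like(x)               # accumulated squared gradients
    a_y = np.zeros_like(y)               # (AdaGrad-style adaptive terms)
    for _ in range(steps):
        # Diagonal adaptive matrices built from accumulated squared gradients.
        a_x += v_x ** 2
        a_y += v_y ** 2
        x_new = x - eta * v_x / (np.sqrt(a_x) + eps)    # descent on x
        y_new = y + gamma * v_y / (np.sqrt(a_y) + eps)  # ascent on y
        # STORM-style momentum-based variance-reduced update:
        # v_{t+1} = g(z_{t+1}) + (1 - beta) * (v_t - g(z_t)),
        # with both gradients evaluated on the same fresh mini-batch.
        batch = sample_batch()
        g_x_new, g_y_new = stoch_grad(x_new, y_new, batch)
        g_x_old, g_y_old = stoch_grad(x, y, batch)
        v_x = g_x_new + (1.0 - beta) * (v_x - g_x_old)
        v_y = g_y_new + (1.0 - beta) * (v_y - g_y_old)
        x, y = x_new, y_new
    return x, y, v_x, v_y
```

In the federated setting studied in the paper, each client would run local steps of this kind between communication rounds and a server would aggregate the iterates and estimators; the precise momentum weights, unified adaptive matrices, and averaging rules are those specified in the paper, not in this sketch.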
