Paper Title

Multi-Model Federated Learning with Provable Guarantees

Paper Authors

Neelkamal Bhuyan, Sharayu Moharir, Gauri Joshi

Paper Abstract

Federated Learning (FL) is a variant of distributed learning where edge devices collaborate to learn a model without sharing their data with the central server or each other. We refer to the process of training multiple independent models simultaneously in a federated setting using a common pool of clients as multi-model FL. In this work, we propose two variants of the popular FedAvg algorithm for multi-model FL, with provable convergence guarantees. We further show that for the same amount of computation, multi-model FL can have better performance than training each model separately. We supplement our theoretical results with experiments in strongly convex, convex, and non-convex settings.
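
As a rough illustration of the multi-model setup described in the abstract, the Python sketch below runs FedAvg-style rounds in which a common pool of clients is randomly partitioned among the models each round, every client performs a few local SGD steps on its assigned model, and the server averages the returned weights per model. This is a minimal sketch under assumed details (a least-squares client objective and uniform random client-to-model assignment); the names `local_sgd` and `multi_model_fedavg_round` are hypothetical, and this is not the paper's exact algorithm.

```python
import numpy as np

def local_sgd(weights, data, lr=0.01, steps=5):
    # A few local SGD steps on a least-squares objective,
    # a stand-in for the client's true local loss.
    w = weights.copy()
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5*||Xw - y||^2 / n
        w -= lr * grad
    return w

def multi_model_fedavg_round(models, client_data, rng):
    # One communication round: randomly partition the common client
    # pool among the models, collect local updates, average per model.
    n_clients = len(client_data)
    assignment = rng.permutation(n_clients) % len(models)
    new_models = []
    for m, w in enumerate(models):
        updates = [local_sgd(w, client_data[i][m])
                   for i in range(n_clients) if assignment[i] == m]
        new_models.append(np.mean(updates, axis=0) if updates else w)
    return new_models

# Toy run: 2 independent tasks (models), 8 clients, each client
# holding a small synthetic dataset for every task.
rng = np.random.default_rng(0)
d, n_clients, n_models = 3, 8, 2
client_data = [[(rng.normal(size=(20, d)), rng.normal(size=20))
                for _ in range(n_models)] for _ in range(n_clients)]
models = [np.zeros(d) for _ in range(n_models)]
for _ in range(50):
    models = multi_model_fedavg_round(models, client_data, rng)
print([w.round(3) for w in models])
```

The point of the multi-model setting is that every model makes progress in each round while each client still trains only one model per round; this is the regime in which the paper compares multi-model FL against training each model separately under the same total computation.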
