Paper Title

Achieving Personalized Federated Learning with Sparse Local Models

Authors

Tiansheng Huang, Shiwei Liu, Li Shen, Fengxiang He, Weiwei Lin, Dacheng Tao

Abstract

Federated learning (FL) is vulnerable to heterogeneously distributed data, since a common global model in FL may not adapt to the heterogeneous data distribution of each user. To counter this issue, personalized FL (PFL) was proposed to produce dedicated local models for each individual user. However, PFL is far from its maturity, because existing PFL solutions either demonstrate unsatisfactory generalization towards different model architectures or cost enormous extra computation and memory. In this work, we propose federated learning with personalized sparse mask (FedSpa), a novel PFL scheme that employs personalized sparse masks to customize sparse local models on the edge. Instead of training an intact (or dense) PFL model, FedSpa only maintains a fixed number of active parameters throughout training (aka sparse-to-sparse training), which enables users' models to achieve personalization with cheap communication, computation, and memory cost. We theoretically show that the iterates obtained by FedSpa converge to the local minimizer of the formulated SPFL problem at a rate of $\mathcal{O}(\frac{1}{\sqrt{T}})$. Comprehensive experiments demonstrate that FedSpa significantly saves communication and computation costs, while simultaneously achieving higher model accuracy and faster convergence speed compared with several state-of-the-art PFL methods.
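As a rough illustration of the personalized-sparse-mask idea described above, the sketch below shows how a client could keep only a fixed fraction of parameters active through a binary mask. This is a minimal, assumption-laden example rather than FedSpa's actual algorithm: the function names, the magnitude-based selection rule, and the 20% density are illustrative choices, not details taken from the paper.

```python
import numpy as np

def make_sparse_mask(weights: np.ndarray, density: float) -> np.ndarray:
    """Hypothetical mask rule: keep the `density` fraction of largest-magnitude entries."""
    k = max(1, int(density * weights.size))
    threshold = np.partition(np.abs(weights).ravel(), -k)[-k]
    return (np.abs(weights) >= threshold).astype(weights.dtype)

def apply_mask(weights: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Zero out inactive parameters; only masked-in entries would be trained and communicated."""
    return weights * mask

# Toy usage: each client would hold its own mask, so its local model stays sparse and personalized.
rng = np.random.default_rng(0)
global_weights = rng.normal(size=(8, 8))
client_mask = make_sparse_mask(global_weights, density=0.2)  # illustrative density, not from the paper
local_weights = apply_mask(global_weights, client_mask)
print("active parameters:", int(client_mask.sum()), "of", global_weights.size)
```

In a sparse-to-sparse training loop of this kind, the number of nonzero entries in each client's mask stays fixed, which is what keeps communication, computation, and memory costs low relative to training a dense model.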
