Paper Title
Test-Time Robust Personalization for Federated Learning
Paper Authors
Paper Abstract
Federated Learning (FL) is a machine learning paradigm where many clients collaboratively learn a shared global model with decentralized training data. Personalized FL additionally adapts the global model to different clients, achieving promising results on consistent local training and test distributions. However, for real-world personalized FL applications, it is crucial to go one step further: robustifying FL models under the evolving local test set during deployment, where various distribution shifts can arise. In this work, we identify the pitfalls of existing works under test-time distribution shifts and propose Federated Test-time Head Ensemble plus tuning (FedTHE+), which personalizes FL models with robustness to various test-time distribution shifts. We illustrate the advancement of FedTHE+ (and its computationally efficient variant FedTHE) over strong competitors, by training various neural architectures (CNN, ResNet, and Transformer) on CIFAR10 and ImageNet with various test distributions. Along with this, we build a benchmark for assessing the performance and robustness of personalized FL methods during deployment. Code: https://github.com/LINs-lab/FedTHE.
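To make the abstract's idea of a "test-time head ensemble plus tuning" concrete, here is a minimal sketch, not the authors' implementation: a shared feature extractor feeds two classification heads (a global head from FL aggregation and a personal head from local adaptation), their logits are mixed per batch, and an unsupervised entropy objective tunes the mixing weight at test time. The class names, the learnable scalar `alpha`, and the entropy-minimization objective are illustrative assumptions; the paper's actual ensembling and tuning rules may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TestTimeHeadEnsemble(nn.Module):
    """Sketch of a two-head ensemble for test-time robust personalization.

    A shared feature extractor feeds both the global head (from FL aggregation)
    and the personal head (from local training). Predictions are combined with
    a learnable weight `alpha` (an assumption; FedTHE describes an adaptive,
    unsupervised ensembling rule).
    """

    def __init__(self, feature_extractor: nn.Module,
                 global_head: nn.Linear, personal_head: nn.Linear):
        super().__init__()
        self.feature_extractor = feature_extractor
        self.global_head = global_head
        self.personal_head = personal_head
        # Hypothetical mixing parameter, initialized to an even blend.
        self.alpha = nn.Parameter(torch.tensor(0.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.feature_extractor(x)
        logits_global = self.global_head(features)
        logits_personal = self.personal_head(features)
        a = torch.sigmoid(self.alpha)  # keep the mixing weight in (0, 1)
        return a * logits_personal + (1.0 - a) * logits_global


def test_time_tune(model: TestTimeHeadEnsemble, x: torch.Tensor,
                   steps: int = 1, lr: float = 1e-3) -> TestTimeHeadEnsemble:
    """Sketch of the unsupervised tuning step (the "+" in FedTHE+).

    Minimizes prediction entropy on the incoming unlabeled test batch,
    updating only the ensemble weight. This mirrors entropy-minimization
    style test-time adaptation and is an assumption about the objective.
    """
    optimizer = torch.optim.SGD([model.alpha], lr=lr)
    for _ in range(steps):
        probs = F.softmax(model(x), dim=-1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
        optimizer.zero_grad()
        entropy.backward()
        optimizer.step()
    return model
```

In deployment, each client would run `test_time_tune` on every arriving unlabeled test batch before predicting, so the ensemble can shift toward the global or the personal head depending on how far the test distribution has drifted from the local training data.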