Paper Title
Private Read Update Write (PRUW) in Federated Submodel Learning (FSL): Communication Efficient Schemes With and Without Sparsification
Paper Authors
Paper Abstract
We investigate the problem of private read update write (PRUW) in relation to private federated submodel learning (FSL), where a machine learning model is divided into multiple submodels based on the different types of data used to train the model. In PRUW, each user downloads the required submodel without revealing its index in the reading phase, and uploads the updates of the submodel without revealing the submodel index or the values of the updates in the writing phase. In this work, we first provide a basic communication-efficient PRUW scheme, and study further means of reducing the communication cost via sparsification. Gradient sparsification is a widely used concept in learning applications, where only a selected set of parameters is downloaded and updated, which significantly reduces the communication cost. In this paper, we study how the concept of sparsification can be incorporated in private FSL with the goal of reducing the communication cost, while guaranteeing information-theoretic privacy of the updated submodel index as well as the values of the updates. To this end, we introduce two schemes: PRUW with top $r$ sparsification and PRUW with random sparsification. The former communicates only the most significant parameters/updates between the servers and the users, while the latter communicates a randomly selected set of parameters/updates. The two proposed schemes introduce novel techniques such as parameter/update (noisy) permutations to handle the additional sources of information leakage in PRUW caused by sparsification. Both schemes result in significantly reduced communication costs compared to that of the basic (non-sparse) PRUW scheme.
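As an illustrative aside (not part of the paper), the sketch below shows plain top-$r$ gradient sparsification, the non-private building block referenced in the abstract: a user keeps only the $r$ largest-magnitude entries of its update and communicates their indices and values. The function name `top_r_sparsify` and the NumPy-based implementation are assumptions made here for illustration; the paper's actual schemes add privacy machinery (e.g., noisy permutations of parameter indices) on top of this idea, which is not shown.

```python
import numpy as np

def top_r_sparsify(update: np.ndarray, r: int):
    """Illustrative top-r sparsification (not the paper's private scheme).

    Keeps only the r entries of largest magnitude and returns their
    indices and values, which is what would be communicated instead of
    the full dense update.
    """
    # Indices of the r largest-magnitude entries (order among them is irrelevant).
    idx = np.argpartition(np.abs(update), -r)[-r:]
    # Dense reconstruction with all other entries zeroed out, for reference.
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return idx, update[idx], sparse

# Example: a length-10 update vector sparsified to its 3 largest-magnitude entries.
g = np.array([0.1, -2.0, 0.05, 0.7, -0.3, 1.5, 0.0, -0.9, 0.2, 0.4])
idx, vals, sparse_g = top_r_sparsify(g, r=3)
```

Communicating only $r$ index-value pairs instead of the full parameter vector is the source of the communication savings; the challenge addressed in the paper is retaining these savings while keeping the chosen indices and update values information-theoretically private.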