Paper Title
Can We Find Strong Lottery Tickets in Generative Models?
Paper Authors
Paper Abstract
Yes. In this paper, we investigate strong lottery tickets in generative models: subnetworks that achieve good generative performance without any weight updates. Neural network pruning is considered a main cornerstone of model compression for reducing the costs of computation and memory. Unfortunately, pruning generative models has not been extensively explored, and all existing pruning algorithms suffer from excessive weight-training costs, performance degradation, limited generalizability, or complicated training. To address these problems, we propose to find strong lottery tickets via moment-matching scores. Our experimental results show that the discovered subnetwork can perform similarly to, or better than, the trained dense model even when only 10% of the weights remain. To the best of our knowledge, we are the first to show the existence of strong lottery tickets in generative models and to provide an algorithm for finding them stably. Our code and supplementary materials are publicly available.
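The core idea the abstract describes, that a subnetwork of fixed random weights can perform well with no weight updates, can be sketched as score-based top-k masking: each weight gets a score, and only the best-scoring fraction is kept. The minimal NumPy sketch below is an illustration under assumed names (`supermask`, a 10% keep ratio, random scores standing in for scores that would be trained in practice); it is not the authors' moment-matching algorithm.

```python
import numpy as np

def supermask(scores, keep_ratio=0.10):
    """Binary mask keeping the top `keep_ratio` fraction of entries
    ranked by score. The weights themselves are never modified."""
    flat = np.sort(scores.ravel())
    k = max(1, int(round(keep_ratio * flat.size)))
    threshold = flat[-k]  # score of the k-th largest entry
    return (scores >= threshold).astype(np.float64)

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))  # fixed random weights (never trained)
S = rng.random((64, 64))           # per-weight scores (learned in practice)

mask = supermask(S, keep_ratio=0.10)
W_sub = W * mask                   # the "strong lottery ticket" subnetwork
```

In a real pipeline the scores would be optimized against a generative objective (here, hypothetically, moment matching) while the weights stay frozen, and the final subnetwork is whatever the mask selects.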