Paper Title

L$_0$onie: Compressing COINs with L$_0$-constraints

Paper Authors

Juan Ramirez, Jose Gallego-Posada

Paper Abstract

Advances in Implicit Neural Representations (INR) have motivated research on domain-agnostic compression techniques. These methods train a neural network to approximate an object, and then store the weights of the trained model. For example, given an image, a network is trained to learn the mapping from pixel locations to RGB values. In this paper, we propose L$_0$onie, a sparsity-constrained extension of the COIN compression method. Sparsity allows us to leverage the faster learning of overparameterized networks, while retaining the desirable compression rate of smaller models. Moreover, our constrained formulation ensures that the final model respects a pre-determined compression rate, dispensing with the need for expensive architecture search.
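
The abstract describes the COIN idea of overfitting a small coordinate network to a single image and storing its weights as the compressed representation. The sketch below illustrates that idea only; the architecture, the SIREN-style sine activations, the learning rate, and the helper names (`CoordinateMLP`, `fit_image`) are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of COIN-style image compression: fit an MLP that maps
# normalized (x, y) pixel coordinates to RGB values, then store its weights.
import torch
import torch.nn as nn

class CoordinateMLP(nn.Module):
    def __init__(self, hidden: int = 64, layers: int = 4):
        super().__init__()
        dims = [2] + [hidden] * layers + [3]
        self.linears = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        h = coords
        for layer in self.linears[:-1]:
            h = torch.sin(30.0 * layer(h))  # SIREN-style periodic activation
        return self.linears[-1](h)          # predicted RGB values

def fit_image(image: torch.Tensor, steps: int = 1000) -> CoordinateMLP:
    """Overfit the MLP to one image of shape (H, W, 3) with values in [0, 1]."""
    h, w, _ = image.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
    )
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)  # (H*W, 2)
    targets = image.reshape(-1, 3)                          # (H*W, 3)

    model = CoordinateMLP()
    opt = torch.optim.Adam(model.parameters(), lr=2e-4)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(coords), targets)
        loss.backward()
        opt.step()
    return model  # storing these weights is the compressed image
```

Per the abstract, L$_0$onie extends this setup by constraining the sparsity (L$_0$ norm) of the weights during training, so the final sparse model meets a pre-determined compression rate without an expensive architecture search.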
