Paper Title

Auto-Compressing Subset Pruning for Semantic Image Segmentation

Paper Authors

Konstantin Ditschuneit, Johannes S. Otterbach

Paper Abstract

State-of-the-art semantic segmentation models are characterized by high parameter counts and slow inference times, making them unsuitable for deployment in resource-constrained environments. To address this challenge, we propose \textsc{Auto-Compressing Subset Pruning}, \acosp, as a new online compression method. The core of \acosp consists of learning a channel selection mechanism for individual channels of each convolution in the segmentation model based on an effective temperature annealing schedule. We show a crucial interplay between providing a high-capacity model at the beginning of training and the compression pressure forcing the model to compress concepts into retained channels. We apply \acosp to \segnet and \pspnet architectures and show its success when trained on the \camvid, \city, \voc, and \ade datasets. The results are competitive with existing baselines for compression of segmentation models at low compression ratios and outperform them significantly at high compression ratios, yielding acceptable results even when removing more than $93\%$ of the parameters. In addition, \acosp is conceptually simple, easy to implement, and can readily be generalized to other data modalities, tasks, and architectures. Our code is available at \url{https://github.com/merantix/acosp}.
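
The abstract describes the core mechanism only at a high level: a learnable channel selection gate per convolution, sharpened over training by a temperature annealing schedule. Below is a minimal PyTorch sketch of one way such a gate could look. It is an illustration under our own assumptions, not the authors' implementation (see the linked repository for that); `ChannelGate`, `anneal_temperature`, and the exponential schedule are hypothetical names and choices.

```python
import torch
import torch.nn as nn


class ChannelGate(nn.Module):
    """Hypothetical soft channel-selection gate (illustrative only).

    Each channel gets a learnable logit. A sigmoid with a shrinking
    temperature turns the initially soft, differentiable gates into
    near-binary keep/drop decisions as training proceeds.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_channels))
        self.temperature = 1.0  # annealed externally during training

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, height, width).
        gates = torch.sigmoid(self.logits / self.temperature)
        return x * gates.view(1, -1, 1, 1)


def anneal_temperature(step: int, total_steps: int,
                       t_start: float = 1.0, t_end: float = 0.01) -> float:
    """Exponential decay from t_start to t_end (one possible schedule)."""
    ratio = step / max(total_steps, 1)
    return t_start * (t_end / t_start) ** ratio


# Usage sketch: gate a convolution's output and anneal each step.
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
gate = ChannelGate(num_channels=64)
x = torch.randn(2, 3, 128, 128)
for step in range(1000):
    gate.temperature = anneal_temperature(step, total_steps=1000)
    y = gate(conv(x))  # gated feature map; train with the task loss
```

After training, channels whose gates saturate near zero can be removed from the convolution outright, which is what makes such an approach a pruning method rather than a mere regularizer.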
