Paper Title

A Feature-map Discriminant Perspective for Pruning Deep Neural Networks

Authors

Zejiang Hou, Sun-Yuan Kung

Abstract

Network pruning has become the de facto tool for accelerating deep neural networks in mobile and edge applications. Recently, feature-map discriminant based channel pruning has shown promising results, as it aligns well with the CNN objective of differentiating multiple classes and offers better interpretability of pruning decisions. However, existing discriminant-based methods are challenged by computational inefficiency, as there is a lack of theoretical guidance on quantifying feature-map discriminant power. In this paper, we present a new mathematical formulation to accurately and efficiently quantify feature-map discriminativeness, which gives rise to a novel criterion, Discriminant Information (DI). We analyze the theoretical properties of DI, specifically the non-decreasing property, which makes DI a valid selection criterion. DI-based pruning removes channels with minimum influence on the DI value, as they contain little information regarding the discriminant power. The versatility of the DI criterion also enables intra-layer mixed-precision quantization to further compress the network. Moreover, we propose a DI-based greedy pruning algorithm and a structure distillation technique to automatically determine a pruned structure that satisfies a given resource budget, which is a common requirement in practice. Extensive experiments demonstrate the effectiveness of our method: our pruned ResNet50 on ImageNet achieves a 44% FLOPs reduction without any Top-1 accuracy loss compared to the unpruned model.
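The abstract describes scoring channels by their discriminant power and greedily removing the lowest-scoring ones until a resource budget is met. The sketch below illustrates that workflow with a simple Fisher-style per-channel score (between-class variance over within-class variance of pooled activations). This score is an illustrative stand-in, not the paper's DI criterion; the function names, the `keep_ratio` budget parameter, and the assumption that feature maps are globally pooled into a `(samples, channels)` matrix are all assumptions for the example.

```python
import numpy as np

def fisher_style_score(feats, labels):
    """Per-channel discriminant proxy: between-class variance of each
    channel's pooled activation divided by its within-class variance.
    (Illustrative stand-in; the paper defines its own DI criterion.)

    feats: (num_samples, num_channels) globally pooled feature maps.
    labels: (num_samples,) integer class labels.
    """
    overall_mean = feats.mean(axis=0)
    between = np.zeros(feats.shape[1])
    within = np.zeros(feats.shape[1])
    for c in np.unique(labels):
        fc = feats[labels == c]
        class_mean = fc.mean(axis=0)
        between += len(fc) * (class_mean - overall_mean) ** 2
        within += ((fc - class_mean) ** 2).sum(axis=0)
    return between / (within + 1e-12)

def greedy_prune(feats, labels, keep_ratio=0.5):
    """Keep the highest-scoring channels up to the budget, mirroring the
    abstract's idea of removing channels with minimum influence on the
    discriminant score. Returns the sorted indices of kept channels."""
    scores = fisher_style_score(feats, labels)
    n_keep = max(1, int(round(keep_ratio * feats.shape[1])))
    keep = np.argsort(scores)[::-1][:n_keep]
    return np.sort(keep)
```

A real pipeline would recompute scores layer by layer and fine-tune after pruning; this fragment only shows the channel-selection step.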
