Paper Title

Pruning Convolutional Filters using Batch Bridgeout

Authors

Najeeb Khan, Ian Stavness

Abstract

State-of-the-art computer vision models are rapidly increasing in capacity, where the number of parameters far exceeds the number required to fit the training set. This results in better optimization and generalization performance. However, the huge size of contemporary models results in large inference costs and limits their use on resource-limited devices. In order to reduce inference costs, convolutional filters in trained neural networks could be pruned to reduce the run-time memory and computational requirements during inference. However, severe post-training pruning results in degraded performance if the training algorithm results in dense weight vectors. We propose the use of Batch Bridgeout, a sparsity inducing stochastic regularization scheme, to train neural networks so that they could be pruned efficiently with minimal degradation in performance. We evaluate the proposed method on common computer vision models VGGNet, ResNet, and Wide-ResNet on the CIFAR image classification task. For all the networks, experimental results show that Batch Bridgeout trained networks achieve higher accuracy across a wide range of pruning intensities compared to Dropout and weight decay regularization.
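The abstract's post-training filter-pruning step can be illustrated with a minimal magnitude-based sketch: rank each convolutional filter by its L1 norm and zero out the weakest fraction. This is a common pruning criterion used here purely for illustration; the paper's own criterion and the `prune_filters` name are assumptions, not the authors' implementation.

```python
import numpy as np

def prune_filters(conv_weight, prune_fraction):
    """Zero out the fraction of output filters with the smallest L1 norm.

    conv_weight: array of shape (out_channels, in_channels, kH, kW).
    Returns the pruned weight tensor and a boolean keep-mask over filters.
    """
    n_filters = conv_weight.shape[0]
    # L1 norm of each filter: sum of absolute kernel values
    norms = np.abs(conv_weight).reshape(n_filters, -1).sum(axis=1)
    n_prune = int(prune_fraction * n_filters)
    # Indices of the filters with the smallest norms are pruned
    prune_idx = np.argsort(norms)[:n_prune]
    keep_mask = np.ones(n_filters, dtype=bool)
    keep_mask[prune_idx] = False
    # Broadcasting zeroes every kernel belonging to a pruned filter
    pruned = conv_weight * keep_mask[:, None, None, None]
    return pruned, keep_mask
```

In practice the zeroed filters (and the corresponding input channels of the next layer) would be physically removed to realize the memory and compute savings; a sparsity-inducing regularizer such as Batch Bridgeout aims to concentrate weight magnitude in few filters so this step costs little accuracy.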
