Paper Title
EfficientNet-eLite: Extremely Lightweight and Efficient CNN Models for Edge Devices by Network Candidate Search
Paper Authors
Paper Abstract
Embedding Convolutional Neural Networks (CNNs) into edge devices for inference is a very challenging task, because such lightweight hardware is not designed to handle the heavyweight computation that modern state-of-the-art CNN models commonly demand. In this paper, aiming to reduce this overhead while sacrificing as little accuracy as possible, we propose Network Candidate Search (NCS), a novel alternative for studying the trade-off between resource usage and performance through grouping concepts and an elimination tournament. Moreover, NCS generalizes to any neural network. In our experiments, we collect candidate CNN models by scaling down EfficientNet-B0 in various ways, through width, depth, input resolution, and compound scaling, and apply NCS to study the scaling-down trade-off. In the process, we obtain a family of extremely lightweight EfficientNets, called EfficientNet-eLite. To further support CNN edge applications on Application-Specific Integrated Circuits (ASICs), we adjust the architectures of EfficientNet-eLite to build a more hardware-friendly version, EfficientNet-HF. Evaluated on the ImageNet dataset, both the proposed EfficientNet-eLite and EfficientNet-HF achieve better parameter usage and accuracy than previous state-of-the-art CNNs. In particular, the smallest member of EfficientNet-eLite is more lightweight than the best and smallest existing MnasNet, with 1.46x fewer parameters and 0.56% higher accuracy. Code is available at https://github.com/Ching-Chen-Wang/EfficientNet-eLite
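The abstract mentions generating candidates by scaling EfficientNet-B0 down along width, depth, input resolution, or all three at once (compound scaling). The sketch below illustrates the general idea only: the coefficient values (`alpha`, `beta`, `gamma`), the baseline dimensions, and the use of a negative exponent `phi` to shrink rather than grow the network are illustrative assumptions, not the paper's actual NCS candidate-generation procedure.

```python
# Illustrative sketch of compound scaling-DOWN, loosely following the
# EfficientNet-style rule depth ~ alpha^phi, width ~ beta^phi,
# resolution ~ gamma^phi. All coefficient values here are assumed for
# demonstration and are not taken from the paper.

def compound_scale(base_depth, base_width, base_resolution, phi,
                   alpha=1.2, beta=1.1, gamma=1.15):
    """Scale a baseline model's dimensions by compound coefficient phi.

    A negative phi shrinks all three dimensions simultaneously,
    yielding candidate models lighter than the baseline; phi = 0
    returns the baseline unchanged.
    """
    depth = max(1, round(base_depth * alpha ** phi))        # number of layers
    width = max(1, round(base_width * beta ** phi))         # channels per layer
    resolution = max(32, int(base_resolution * gamma ** phi))  # input size
    return depth, width, resolution

# Example: shrink a hypothetical B0-like baseline one compound step down.
d, w, r = compound_scale(18, 32, 224, phi=-1)
print(d, w, r)
```

Candidates produced this way (and by scaling each dimension individually) could then be grouped and compared under NCS's elimination tournament to pick the best accuracy/resource trade-offs.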