Paper Title

CDC: Classification Driven Compression for Bandwidth Efficient Edge-Cloud Collaborative Deep Learning

Paper Authors

Yuanrui Dong, Peng Zhao, Hanqiao Yu, Cong Zhao, Shusen Yang

Paper Abstract

The emerging edge-cloud collaborative Deep Learning (DL) paradigm aims at improving the performance of practical DL implementations in terms of cloud bandwidth consumption, response latency, and data privacy preservation. Focusing on bandwidth-efficient edge-cloud collaborative training of DNN-based classifiers, we present CDC, a Classification Driven Compression framework that reduces bandwidth consumption while preserving the classification accuracy of edge-cloud collaborative DL. Specifically, to reduce bandwidth consumption on resource-limited edge servers, we develop a lightweight autoencoder with classification guidance, which compresses data while preserving classification-driven features and thus allows edges to upload only the latent code of raw data for accurate global training on the cloud. Additionally, we design an adjustable quantization scheme that adaptively balances the tradeoff between bandwidth consumption and classification accuracy under different network conditions, where only fine-tuning is required for rapid compression ratio adjustment. Results of extensive experiments demonstrate that, compared with DNN training on raw data, CDC consumes 14.9 times less bandwidth with an accuracy loss of no more than 1.06%, and compared with DNN training on data compressed by an autoencoder without classification guidance, CDC introduces an accuracy loss at least 100% lower.
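To make the adjustable-quantization idea concrete, below is a minimal sketch of a uniform quantizer applied to a latent code: lowering the bit width shrinks the uploaded payload at the cost of a larger reconstruction error. This is an illustrative assumption on our part, not the paper's exact scheme; the `quantize`/`dequantize` helpers and the synthetic latent vector `z` are hypothetical.

```python
import numpy as np

def quantize(latent, bits):
    """Uniformly quantize a latent code to `bits` bits per element.

    Returns integer codes plus the (offset, step) pair needed to
    dequantize them on the cloud side.
    """
    lo, hi = float(latent.min()), float(latent.max())
    levels = 2 ** bits - 1                      # number of quantization steps
    step = (hi - lo) / levels
    codes = np.round((latent - lo) / step).astype(np.uint16)
    return codes, lo, step

def dequantize(codes, lo, step):
    """Reconstruct an approximate latent code from integer codes."""
    return codes.astype(np.float32) * step + lo

# Hypothetical 128-dimensional latent code produced by an edge encoder.
rng = np.random.default_rng(0)
z = rng.normal(size=128).astype(np.float32)

for bits in (8, 4, 2):
    codes, lo, step = quantize(z, bits)
    err = np.abs(z - dequantize(codes, lo, step)).max()
    print(f"{bits} bits per element -> max reconstruction error {err:.4f}")
```

Fewer bits per element means proportionally less bandwidth (2 bits is a 16x reduction over 32-bit floats), which is the knob the framework tunes against classification accuracy under varying network conditions.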
