CDC: Classification Driven Compression for Bandwidth Efficient Edge-Cloud Collaborative Deep Learning

Yuanrui Dong, Peng Zhao, Hanqiao Yu, Cong Zhao, Shusen Yang

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3378-3384. https://doi.org/10.24963/ijcai.2020/467

The emerging edge-cloud collaborative Deep Learning (DL) paradigm aims to improve the performance of practical DL deployments in terms of cloud bandwidth consumption, response latency, and data privacy preservation. Focusing on bandwidth-efficient edge-cloud collaborative training of DNN-based classifiers, we present CDC, a Classification Driven Compression framework that reduces bandwidth consumption while preserving the classification accuracy of edge-cloud collaborative DL. Specifically, to reduce bandwidth consumption on resource-limited edge servers, we develop a lightweight autoencoder with classification guidance, which compresses raw data while preserving classification-driven features, so that edges only upload the latent code of raw data for accurate global training on the cloud. Additionally, we design an adjustable quantization scheme that adaptively balances the tradeoff between bandwidth consumption and classification accuracy under different network conditions, requiring only fine-tuning for rapid compression ratio adjustment. Extensive experiments demonstrate that, compared with DNN training on raw data, CDC consumes 14.9× less bandwidth with an accuracy loss of no more than 1.06%, and compared with DNN training on data compressed by an autoencoder without guidance, CDC introduces at least 100% lower accuracy loss.
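To make the two ideas in the abstract concrete, the sketch below illustrates (1) an autoencoder trained with a loss that mixes reconstruction with a classification-guidance term from an auxiliary classifier on the latent code, and (2) a simple uniform quantizer applied to the latent code before upload. This is a minimal illustration, not the authors' implementation; the layer sizes, the `num_bits` knob, and the `lambda_cls` weight are assumptions for exposition only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidedAutoencoder(nn.Module):
    """Lightweight autoencoder with an auxiliary classifier on the latent code
    (illustrative architecture; sizes are assumptions, not from the paper)."""
    def __init__(self, in_dim=784, latent_dim=32, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim))
        # The classifier head provides the "classification guidance" signal,
        # pushing the latent code to keep class-discriminative features.
        self.classifier = nn.Linear(latent_dim, num_classes)

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.classifier(z), z

def quantize(z, num_bits=4):
    """Uniform quantization of the latent code; fewer bits -> less upload bandwidth."""
    levels = 2 ** num_bits - 1
    z_min, z_max = z.min(), z.max()
    scale = (z_max - z_min).clamp(min=1e-8) / levels
    q = torch.round((z - z_min) / scale)   # integer codes the edge would upload
    return q * scale + z_min               # dequantized latent used on the cloud side

def training_loss(model, x, y, lambda_cls=1.0):
    """Reconstruction loss plus a classification-guidance term (lambda_cls is assumed)."""
    x_hat, logits, _ = model(x)
    recon = F.mse_loss(x_hat, x)           # preserve the raw-data content
    guide = F.cross_entropy(logits, y)     # preserve class-relevant features
    return recon + lambda_cls * guide

Under this reading, adjusting `num_bits` plays the role of the adjustable compression ratio: a lower bit width shrinks the uploaded codes, and the autoencoder would then be fine-tuned rather than retrained from scratch.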
Keywords:
Machine Learning Applications: Applications of Supervised Learning
Agent-based and Multi-agent Systems: Coordination and Cooperation
Machine Learning: Federated Learning