4.6 Article

Deep Model Compression for Mobile Platforms: A Survey

Journal

TSINGHUA SCIENCE AND TECHNOLOGY
Volume 24, Issue 6, Pages 677-693

Publisher

TSINGHUA UNIV PRESS
DOI: 10.26599/TST.2018.9010103

Keywords

deep learning; model compression; run-time resource management; cost optimization

Funding

  1. National Key Research and Development Program of China [2018YFB1003605]
  2. Foundations of CARCH [CARCH201704]
  3. National Natural Science Foundation of China [61472312]
  4. Foundations of Shaanxi Province and Xi'an Science and Technology Plan [B018230008, BD34017020001]
  5. Foundations of Xidian University [JBZ171002]

Abstract

Despite the rapid development of mobile and embedded hardware, directly executing computation-expensive and storage-intensive deep learning algorithms on these devices remains constrained for sensory data analysis. In this paper, we first summarize layer compression techniques for state-of-the-art deep learning models in three categories: weight factorization and pruning, convolution decomposition, and special layer architecture design. For each category, we quantify the storage and computation savings the techniques make tunable, and discuss their practical challenges and possible improvements. Then, we implement Android projects using TensorFlow Mobile to test these ten compression methods and compare their practical performance in terms of accuracy, parameter size, intermediate feature size, computation, processing latency, and energy consumption. To further examine their advantages and bottlenecks, we evaluate them on four standard recognition tasks across six resource-constrained Android smartphones. Finally, we survey two types of run-time Neural Network (NN) compression techniques that are orthogonal to layer compression: run-time resource management and cost optimization with special NN architectures.
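As a rough illustration of two of the surveyed categories (not the paper's own implementation), the following minimal Python sketch, assuming a TensorFlow/Keras setting, shows magnitude-based weight pruning and the replacement of a standard 3x3 convolution with a depthwise + pointwise pair; all names and parameter values are illustrative.

# Minimal sketch (assumptions: TensorFlow/Keras, illustrative layer sizes).
import numpy as np
import tensorflow as tf

# Weight pruning (magnitude-based): zero out the smallest-magnitude weights.
def prune_weights(w: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Return a copy of w with the smallest-magnitude fraction set to zero."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

# Convolution decomposition: a standard 3x3 convolution replaced by a
# depthwise 3x3 convolution followed by a pointwise (1x1) convolution.
standard = tf.keras.layers.Conv2D(64, 3, padding="same")
decomposed = tf.keras.Sequential([
    tf.keras.layers.DepthwiseConv2D(3, padding="same"),
    tf.keras.layers.Conv2D(64, 1, padding="same"),
])

For 64 input and 64 output channels, the standard 3x3 convolution holds 3*3*64*64 = 36,864 weights, while the depthwise + pointwise pair holds 3*3*64 + 64*64 = 4,672, roughly an 8x reduction in parameters and multiply-accumulates, which is the kind of storage/computation trade-off the survey quantifies per category.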
