Article

Structured Knowledge Distillation for Dense Prediction

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2020.3001940

Keywords

Task analysis; Semantics; Training; Object detection; Image segmentation; Estimation; Knowledge engineering; Structured knowledge distillation; adversarial training; knowledge transferring; dense prediction

Abstract

In this work, we consider transferring the structure information from large networks to compact ones for dense prediction tasks in computer vision. Previous knowledge distillation strategies used for dense prediction tasks often directly borrow the distillation scheme for image classification and perform knowledge distillation for each pixel separately, leading to sub-optimal performance. Here we propose to distill structured knowledge from large networks to compact networks, taking into account the fact that dense prediction is a structured prediction problem. Specifically, we study two structured distillation schemes: i) pair-wise distillation that distills the pair-wise similarities by building a static graph; and ii) holistic distillation that uses adversarial training to distill holistic knowledge. The effectiveness of our knowledge distillation approaches is demonstrated by experiments on three dense prediction tasks: semantic segmentation, depth estimation and object detection. Code is available at https://git.io/StructKD.
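As a rough illustration of the pair-wise scheme described above (a sketch, not the authors' implementation — the function names and the cosine-similarity choice are assumptions), each spatial location of a feature map can be treated as a graph node, and the student is trained to match the teacher's node-to-node similarity matrix:

```python
import numpy as np

def pairwise_similarity(feat):
    # feat: (C, H, W) feature map; each spatial location is one graph node.
    c, h, w = feat.shape
    nodes = feat.reshape(c, h * w)                                  # (C, N)
    # Normalize each node's channel vector so entries are cosine similarities.
    nodes = nodes / (np.linalg.norm(nodes, axis=0, keepdims=True) + 1e-8)
    return nodes.T @ nodes                                          # (N, N)

def pairwise_distillation_loss(student_feat, teacher_feat):
    # Mean squared difference between student and teacher similarity graphs.
    # Channel counts may differ; the spatial grids must match.
    a_s = pairwise_similarity(student_feat)
    a_t = pairwise_similarity(teacher_feat)
    return float(np.mean((a_s - a_t) ** 2))
```

The loss depends only on the similarity structure, so the student and teacher can have different channel widths, which is the point of distilling relations rather than per-pixel values.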

