Preparing Lessons: Improve Knowledge Distillation with Better Supervision

Authors: -
Keywords: knowledge distillation, label regularization, hard example mining
Journal: Neurocomputing, Volume -, Issue -, Pages -
Publisher: Elsevier BV
Online: 2021-04-30
DOI: 10.1016/j.neucom.2021.04.102
