Throughput Maximization of Delay-Aware DNN Inference in Edge Computing by Exploring DNN Model Partitioning and Inference Parallelism

Authors: -
Keywords: -
Journal: IEEE Transactions on Mobile Computing, Volume 22, Issue 5, Pages 3017-3030
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Online: 2021-11-09
DOI: 10.1109/TMC.2021.3125949
