Distributed optimization for deep learning with gossip exchange

Authors
Keywords
Optimization, Distributed gradient descent, Gossip, Deep learning, Neural networks
Journal
NEUROCOMPUTING
Volume 330, Pages 287-296
Publisher
Elsevier BV
Online
2018-11-09
DOI
10.1016/j.neucom.2018.11.002
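
The title and keywords refer to distributed gradient descent with gossip exchange, i.e. workers that train local model replicas and occasionally average parameters with randomly chosen peers instead of synchronizing through a central server. The paper's exact update rule is not reproduced on this page, so the following is only a minimal sketch under that assumption (toy quadratic objective, pairwise parameter averaging), not the authors' algorithm:

# Hypothetical sketch of gossip-averaged distributed SGD on a toy problem.
# Each worker takes a local gradient step, then a random pair of workers
# averages ("gossips") their parameter vectors.
import numpy as np

rng = np.random.default_rng(0)

D = 5              # parameter dimension of the toy problem
N_WORKERS = 8      # number of simulated workers
STEPS = 200
LR = 0.1

# Toy objective per worker: f(w) = 0.5 * ||w - target||^2, gradient = w - target.
target = rng.normal(size=D)

# Each worker starts from its own random initialization.
params = [rng.normal(size=D) for _ in range(N_WORKERS)]

for step in range(STEPS):
    # Local SGD step on every worker (each could use its own mini-batch).
    for i in range(N_WORKERS):
        grad = params[i] - target
        params[i] = params[i] - LR * grad

    # Gossip exchange: a random pair of workers averages their weights.
    i, j = rng.choice(N_WORKERS, size=2, replace=False)
    avg = 0.5 * (params[i] + params[j])
    params[i], params[j] = avg.copy(), avg.copy()

# Workers should approximately agree and sit near the optimum.
spread = max(np.linalg.norm(p - params[0]) for p in params)
error = np.linalg.norm(np.mean(params, axis=0) - target)
print(f"max disagreement between workers: {spread:.4f}")
print(f"distance of averaged model from optimum: {error:.4f}")

In a real deep learning setting the gradient step would come from a neural network's loss on a local mini-batch and the gossip exchange would be a network message between machines; the pairwise-averaging structure shown here is the part the keywords describe.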
