Article

SHARPNESS, RESTART, AND ACCELERATION

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 30, Issue 1, Pages 262-289

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/18M1224568

Keywords

error bounds; sharpness; restart; accelerated methods

Funding

  1. Fonds AXA Pour la Recherche
  2. AMX fellowship
  3. Google Focused award

Abstract

The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically. Sharpness directly controls the performance of restart schemes, as observed by Nemirovskii and Nesterov [USSR Comput. Math. Math. Phys., 25 (1985), pp. 21-30]. The constants quantifying these sharpness bounds are of course unobservable, but we show that optimal restart strategies are robust, and searching for the best scheme only increases the complexity by a logarithmic factor compared to the optimal bound. Overall then, restart schemes generically accelerate accelerated methods.
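The restart idea in the abstract can be illustrated with a minimal sketch: run Nesterov's accelerated gradient method, reset its momentum every fixed number of iterations, and search for a good restart period on a logarithmic grid — only logarithmically many candidates, matching the log-factor overhead the paper refers to. The problem instance, function names, and grid below are illustrative choices, not taken from the paper.

```python
import numpy as np

def accel_grad(grad, x0, L, iters):
    # Nesterov's accelerated gradient method with step size 1/L.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def restarted(grad, x0, L, total_iters, period):
    # Fixed restart scheme: wipe the momentum every `period` iterations.
    x = x0.copy()
    for _ in range(total_iters // period):
        x = accel_grad(grad, x, L, period)
    return x

# Toy sharp instance: f(x) = 0.5 x^T A x with A positive definite,
# i.e. sharpness of exponent 2 around the minimizer x* = 0.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 10))
A = M.T @ M + 0.1 * np.eye(10)
L = np.linalg.eigvalsh(A).max()          # smoothness constant
grad = lambda x: A @ x
f = lambda x: 0.5 * x @ A @ x
x0 = rng.standard_normal(10)

# Log-scale grid of restart periods: O(log(total)) candidates in all.
total = 256
best_val, best_period = min(
    (f(restarted(grad, x0, L, total, 2 ** j)), 2 ** j) for j in range(1, 9)
)
```

The sharpness constants are unknown here, exactly as in the abstract; the grid search over powers of two stands in for the robust search over restart schedules whose overhead is only logarithmic in the iteration budget.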
