Article

Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization

Journal

OPTIMIZATION LETTERS
卷 9, 期 5, 页码 961-979

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s11590-014-0795-x

Keywords

Restricted strong convexity; Gradient method; Inexact gradient method; Rate of linear convergence

Funding

  1. Graduate School of NUDT [B110202]
  2. NSFC [61201328, 61271014, 61072118]
  3. NNSF of Hunan Province [13JJ2011]
  4. SP of NUDT [JC120201]

Abstract

A great deal of interest in solving large-scale convex optimization problems has recently turned to the gradient method and its variants. To ensure linear rates of convergence, current theory typically assumes that the objective functions are strongly convex. This paper goes beyond that traditional assumption by studying a strictly weaker concept, restricted strong convexity, which was recently proposed and is satisfied by a much broader class of functions. Using restricted strong convexity, we derive linear convergence rates for exact and inexact gradient-type methods. We also obtain two by-products: (1) we rederive linear convergence rates of the inexact gradient method for a class of structured smooth convex optimization problems; (2) we improve the linear convergence rate of the linearized Bregman algorithm.
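As a minimal sketch of the phenomenon the abstract describes (not the paper's own analysis): a rank-deficient least-squares objective f(x) = ½‖Ax − b‖² is not strongly convex when A has a nontrivial null space, yet gradient descent on it still exhibits a linear (geometric) decay of the objective gap, consistent with convergence guarantees under conditions weaker than strong convexity. All names and parameters below are illustrative.

```python
# Hedged sketch: gradient descent on a rank-deficient least-squares problem.
# With m < n, A^T A is singular, so f(x) = 0.5 * ||A x - b||^2 is NOT strongly
# convex; nevertheless the objective gap shrinks geometrically.
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                       # m < n: Hessian A^T A is singular
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of grad f
x = np.zeros(n)
f_star = 0.0                        # b lies in range(A) a.s. when rank(A) = m

gaps = []
for _ in range(300):
    grad = A.T @ (A @ x - b)
    x -= grad / L                   # fixed step size 1/L
    gaps.append(0.5 * np.linalg.norm(A @ x - b) ** 2 - f_star)

# The gap decreases monotonically and reaches machine-level accuracy,
# even though f is not strongly convex.
print(gaps[-1])
```

Here the residual evolves as r_{k+1} = (I − AAᵀ/L) r_k, and AAᵀ is nonsingular almost surely, which is what drives the geometric decay despite the singular Hessian AᵀA.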
