Article

Linear convergence of first order methods for non-strongly convex optimization

Journal

MATHEMATICAL PROGRAMMING
Volume 175, Issue 1-2, Pages 69-107

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-018-1232-1

Keywords

-

Funding

  1. Executive Agency for Higher Education, Research and Innovation Funding (UEFISCDI), Romania [PN-III-P4-PCE-2016-0731, 39/2017]

The standard assumption for proving linear convergence of first order methods in smooth convex optimization is strong convexity of the objective function, an assumption that does not hold in many practical applications. In this paper, we derive linear convergence rates for several first order methods applied to smooth non-strongly convex constrained optimization problems, i.e., problems whose objective function has a Lipschitz continuous gradient and satisfies a relaxed strong convexity condition. In particular, for smooth constrained convex optimization, we provide several relaxations of the strong convexity condition and prove that they are sufficient for obtaining linear convergence of several first order methods, such as projected gradient, fast gradient, and feasible descent methods. We also give examples of functional classes that satisfy our proposed relaxations of strong convexity. Finally, we show that the proposed relaxed strong convexity conditions cover important applications, ranging from solving linear systems and Linear Programming to dual formulations of linearly constrained convex problems.
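
As a concrete illustration of this setting (a minimal sketch, not code from the paper; all problem data below are synthetic assumptions), the following Python snippet runs projected gradient descent on f(x) = 0.5*||Ax - b||^2 over the nonnegative orthant, with A deliberately rank-deficient so that f is convex but not strongly convex. Since f is a strongly convex function composed with a linear map and the constraint set is polyhedral, it belongs to the kind of functional classes the relaxed conditions cover, and the optimality gap f(x_k) - f* should decay geometrically:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 40, 100, 20                    # rank r < n: f is NOT strongly convex
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
x_feas = np.abs(rng.standard_normal(n))  # feasible point with A @ x_feas = b, so f* = 0
b = A @ x_feas

L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f = A^T (A x - b)

x = np.zeros(n)
for k in range(301):
    grad = A.T @ (A @ x - b)
    x = np.maximum(x - grad / L, 0.0)    # gradient step, then projection onto x >= 0
    if k % 50 == 0:
        gap = 0.5 * np.linalg.norm(A @ x - b) ** 2
        print(f"iter {k:3d}   f(x_k) - f* = {gap:.3e}")
```

Choosing b = A @ x_feas for a feasible x_feas makes f* = 0, so the printed values track the optimality gap directly; a classical analysis based on strong convexity would not predict the geometric decay this run should exhibit, since the Hessian A^T A is singular.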

Authors

I. Necoara, Yu. Nesterov, F. Glineur