Article

Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

Journal

JOURNAL OF GLOBAL OPTIMIZATION
Volume 61, Issue 1, Pages 19-46

Publisher

SPRINGER
DOI: 10.1007/s10898-014-0151-9

Keywords

Large-scale nonconvex optimization; Random coordinate descent algorithms; Convergence analysis; Asymptotic convergence; Convergence rate in expectation

Funding

  1. European Union [248940]
  2. CNCS [TE-231, 19/11.08.2010]
  3. ANCS [80EU/2010]
  4. [POSDRU/89/1.5/S/62557]

Abstract

In this paper we analyze several new methods for solving nonconvex optimization problems whose objective function is the sum of two terms: one nonconvex and smooth, the other convex, simple, and of known structure. We consider both unconstrained and linearly constrained nonconvex problems. For problems with this structure, we propose random coordinate descent algorithms and analyze their convergence properties. For the general case, in which the objective function is nonconvex and composite, we prove asymptotic convergence of the sequences generated by our algorithms to stationary points, as well as a sublinear rate of convergence in expectation for some optimality measure. Additionally, if the objective function satisfies an error bound condition, we derive a local linear rate of convergence for the expected values of the objective function. We also present extensive numerical experiments evaluating the performance of our algorithms against state-of-the-art methods.
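
The setting described in the abstract is a composite objective f(x) = F(x) + Psi(x), with F smooth and possibly nonconvex and Psi convex and "simple" in the sense that its proximal operator is cheap to evaluate. As a rough illustration only, and not the authors' algorithm, the following Python sketch performs one proximal random coordinate descent step for the case Psi(x) = lam * ||x||_1; the names soft_threshold, grad_F, L (coordinate-wise Lipschitz constants), and lam are assumptions introduced here for the example.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.| applied to a scalar (soft thresholding).
    return np.sign(v) * max(abs(v) - t, 0.0)

def random_coordinate_descent_step(x, grad_F, L, lam, rng):
    # One step: pick a coordinate i uniformly at random, take a gradient
    # step on the smooth part F along coordinate i with step size 1/L[i],
    # then apply the prox of the separable l1 term. (Illustrative sketch
    # only; not the paper's method or notation.)
    i = rng.integers(len(x))
    g = grad_F(x, i)
    x[i] = soft_threshold(x[i] - g / L[i], lam / L[i])
    return x

# Hypothetical usage with smooth part F(x) = 0.5 * ||A x - b||^2, whose
# coordinate-wise Lipschitz constants are the squared column norms of A:
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
L = (A ** 2).sum(axis=0)
grad_F = lambda x, i: A[:, i] @ (A @ x - b)
x = np.zeros(10)
for _ in range(2000):
    random_coordinate_descent_step(x, grad_F, L, 0.1, rng)

Per-coordinate updates of this kind are what make such methods attractive at large scale: each iteration touches one partial derivative and one scalar prox rather than the full vector.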
