Article

Incremental proximal methods for large scale convex optimization

Journal

MATHEMATICAL PROGRAMMING
Volume 129, Issue 2, Pages 163-195

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-011-0472-0

Keywords

Proximal algorithm; Incremental method; Gradient method; Convex

Funding

  1. AFOSR [FA9550-10-1-0412]

Abstract

We consider the minimization of a sum $\sum_{i=1}^{m} f_i(x)$ consisting of a large number of convex component functions $f_i$. For this problem, incremental methods consisting of gradient or subgradient iterations applied to single components have proved very effective. We propose new incremental methods, consisting of proximal iterations applied to single components, as well as combinations of gradient, subgradient, and proximal iterations. We provide a convergence and rate of convergence analysis of a variety of such methods, including some that involve randomization in the selection of components. We also discuss applications in a few contexts, including signal processing and inference/machine learning.
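To illustrate the kind of iteration the abstract describes, below is a minimal sketch of an incremental proximal method for the special case of quadratic components $f_i(x) = \tfrac{1}{2}(a_i^\top x - b_i)^2$, where the proximal step has a closed form. The function name, stepsize rule, and random component selection are illustrative choices for this sketch, not specifics taken from the paper.

import numpy as np

def incremental_proximal_least_squares(A, b, alpha0=1.0, num_epochs=100, seed=0):
    """Minimize sum_{i=1}^m 0.5*(a_i^T x - b_i)^2 by incremental proximal steps.

    At each iteration one component i is selected at random and the proximal
    operator of alpha_k * f_i is applied to the current iterate:
        x_{k+1} = argmin_x { f_i(x) + (1/(2*alpha_k)) * ||x - x_k||^2 }.
    For f_i(x) = 0.5*(a_i^T x - b_i)^2 this minimization has the closed form
        x_{k+1} = x_k - alpha_k * (a_i^T x_k - b_i) / (1 + alpha_k*||a_i||^2) * a_i.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for epoch in range(num_epochs):
        alpha = alpha0 / (epoch + 1)        # diminishing stepsize (illustrative rule)
        for _ in range(m):
            i = rng.integers(m)             # randomized component selection
            a_i, b_i = A[i], b[i]
            residual = a_i @ x - b_i
            x = x - alpha * residual / (1.0 + alpha * (a_i @ a_i)) * a_i
    return x

# Usage: recover x_true from a consistent linear system.
rng = np.random.default_rng(1)
A = rng.normal(size=(200, 10))
x_true = rng.normal(size=10)
b = A @ x_true
x_hat = incremental_proximal_least_squares(A, b)
print("error:", np.linalg.norm(x_hat - x_true))

The quadratic case is used here only because its proximal map is explicit; for components without a convenient proximal operator, the abstract notes that proximal steps can be combined with gradient or subgradient steps.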

