Article

ON THE NONASYMPTOTIC CONVERGENCE OF CYCLIC COORDINATE DESCENT METHODS

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 23, Issue 1, Pages 576-601

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/110840054

Keywords

convex optimization; cyclic coordinate descent; convergence rates; sparsity


Abstract

Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in signal processing, statistics, and machine learning. Reasons for this renewed interest include the simplicity, speed, and stability of the method, as well as its competitive performance on ℓ1-regularized smooth optimization problems. Surprisingly, very little is known about its nonasymptotic convergence behavior on these problems. Most existing results either just prove convergence or provide asymptotic rates. We fill this gap in the literature by proving O(1/k) convergence rates (where k is the iteration count) for two variants of cyclic coordinate descent under an isotonicity assumption. Our analysis proceeds by comparing the objective values attained by the two variants with each other, as well as with the gradient descent algorithm. We show that the iterates generated by the cyclic coordinate descent methods remain better than those of gradient descent uniformly over time.
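
For context, the sketch below shows plain cyclic coordinate descent applied to an ℓ1-regularized least-squares (lasso) objective, 0.5‖Ax − b‖² + λ‖x‖1: each outer pass sweeps the coordinates in a fixed cyclic order and solves each one-dimensional subproblem exactly by soft-thresholding. This is an illustrative sketch only, not the two specific variants analyzed in the paper, and the function and parameter names are hypothetical.

```python
import numpy as np

def cyclic_coordinate_descent_lasso(A, b, lam, n_cycles=100):
    """Illustrative cyclic coordinate descent for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Not the exact variants analyzed in the paper; a generic sketch.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)       # per-coordinate curvature ||A_j||^2
    residual = b - A @ x                # maintained residual b - A x
    for _ in range(n_cycles):
        for j in range(n):              # one full cycle over the coordinates
            if col_sq[j] == 0.0:
                continue
            # gradient information for coordinate j, with its own contribution added back
            rho = A[:, j] @ residual + col_sq[j] * x[j]
            # exact 1-D minimization via soft-thresholding
            x_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            # update the residual incrementally instead of recomputing A @ x
            residual += A[:, j] * (x[j] - x_new)
            x[j] = x_new
    return x

# Example usage on synthetic sparse data (hypothetical setup)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.5, -2.0, 0.7]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = cyclic_coordinate_descent_lasso(A, b, lam=0.1)
```

Maintaining the residual b − Ax incrementally keeps each coordinate update at O(m) cost, which is part of why coordinate descent is competitive on ℓ1-regularized problems.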

