Article

NEWTON SKETCH: A NEAR LINEAR-TIME OPTIMIZATION ALGORITHM WITH LINEAR-QUADRATIC CONVERGENCE

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 27, Issue 1, Pages 205-245

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/15M1021106

Keywords

convex optimization; large-scale problems; Newton's method; random projection; randomized algorithms; random matrices; self-concordant functions; interior point method

Funding

  1. Office of Naval Research MURI [N00014-11-1-0688]
  2. Air Force Office of Scientific Research [AFOSR-FA9550-14-1-0016]
  3. National Science Foundation [CIF-31712-23800, DMS-1107000]
  4. Microsoft Research Fellowship
  5. Directorate For Mathematical & Physical Sciences
  6. Division Of Mathematical Sciences [1612948] Funding Source: National Science Foundation


We propose a randomized second-order method for optimization known as the Newton sketch: it is based on performing an approximate Newton step using a randomly projected Hessian. For self-concordant functions, we prove that the algorithm has superlinear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition numbers and related problem-dependent quantities. Given a suitable initialization, similar guarantees also hold for strongly convex and smooth objectives without self-concordance. When implemented using randomized projections based on a subsampled Hadamard basis, the algorithm typically has substantially lower complexity than Newton's method. We also describe extensions of our methods to programs involving convex constraints that are equipped with self-concordant barriers. We discuss and illustrate applications to linear programs, quadratic programs with convex constraints, logistic regression, and other generalized linear models, as well as semidefinite programs.
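The core idea in the abstract, replacing the exact Hessian with a randomly projected one, can be illustrated with a short sketch. The example below applies it to ridge-regularized logistic regression, whose Hessian A^T D A has the square root D^{1/2} A. For brevity it uses a Gaussian sketching matrix rather than the subsampled Hadamard basis analyzed in the paper, and omits the line search the full algorithm employs; the function name, step count, and regularization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def newton_sketch_logistic(A, y, m, iters=15, lam=1e-3, seed=0):
    """Illustrative sketched-Newton loop for ridge-regularized logistic regression.

    The Hessian of the logistic loss is A^T D A with D = diag(p * (1 - p)),
    so D^{1/2} A is a Hessian square root. Each iteration sketches that
    square root with a Gaussian matrix (the paper uses a subsampled
    Hadamard basis, swapped here for simplicity).
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        z = A @ x
        p = 0.5 * (1.0 + np.tanh(0.5 * z))              # numerically stable sigmoid
        grad = A.T @ (p - y) / n + lam * x
        B = np.sqrt(p * (1.0 - p) / n)[:, None] * A     # Hessian square root D^{1/2} A
        SB = rng.standard_normal((m, n)) @ B / np.sqrt(m)  # sketched square root (m x d)
        H = SB.T @ SB + lam * np.eye(d)                 # approximate Hessian from the sketch
        x = x - np.linalg.solve(H, grad)                # approximate Newton step
    return x
```

Once the sketched square root SB is formed, each step costs O(m d^2) for the linear-algebra work rather than O(n d^2), which is the source of the complexity savings when the sketch dimension m is much smaller than n.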
