Journal
SIAM JOURNAL ON NUMERICAL ANALYSIS
Volume 47, Issue 2, Pages 997-1018
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/08072019X
Keywords
line search; trust region; subspace acceleration; sequential subspace method; Riemannian manifold; optimization on manifolds; Riemannian optimization; Arnoldi; Jacobi-Davidson; locally optimal block preconditioned conjugate gradient (LOBPCG)
Funding
- US National Science Foundation [OCI0324944]
- Florida State University
Abstract
In numerical optimization, line-search and trust-region methods are two important classes of descent schemes, with well-understood global convergence properties. We say that these methods are accelerated when the conventional iterate is replaced by any point that produces at least as much of a decrease in the cost function as a fixed fraction of the decrease produced by the conventional iterate. A detailed convergence analysis reveals that global convergence properties of line-search and trust-region methods still hold when the methods are accelerated. The analysis is performed in the general context of optimization on manifolds, of which optimization in R^n is a particular case. This general convergence analysis sheds new light on the behavior of several existing algorithms.
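The acceptance rule described in the abstract can be sketched in a few lines. The following is a minimal illustration in R^n, not the paper's algorithm: `armijo_backtracking` stands in for the "conventional iterate" (here, a standard Armijo backtracking step along the negative gradient), and `accelerated_step` accepts an arbitrary candidate point whenever it achieves at least a fraction `c` of the conventional decrease. The function names and the choice of Armijo backtracking are illustrative assumptions.

```python
import numpy as np

def armijo_backtracking(f, grad, x, alpha0=1.0, beta=0.5, sigma=1e-4, max_iter=50):
    """Conventional line-search iterate: Armijo backtracking along -grad f(x).

    Illustrative stand-in for the 'conventional iterate' of a descent method.
    """
    g = grad(x)
    d = -g                      # steepest-descent direction
    fx = f(x)
    alpha = alpha0
    for _ in range(max_iter):
        # Armijo sufficient-decrease condition
        if f(x + alpha * d) <= fx + sigma * alpha * g.dot(d):
            return x + alpha * d
        alpha *= beta           # shrink the step and retry
    return x + alpha * d

def accelerated_step(f, grad, x, candidate, c=0.5):
    """Accept `candidate` if it yields at least a fraction c of the decrease
    produced by the conventional iterate; otherwise fall back to it."""
    x_conv = armijo_backtracking(f, grad, x)
    decrease_conv = f(x) - f(x_conv)
    if f(x) - f(candidate) >= c * decrease_conv:
        return candidate        # accelerated iterate accepted
    return x_conv               # fall back to the conventional iterate
```

Because the accepted point is only required to match a fixed fraction of the conventional decrease, the candidate can come from any heuristic (e.g. a subspace-acceleration step), which is what allows the global convergence guarantees to carry over.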