Journal
SIAM JOURNAL ON NUMERICAL ANALYSIS
Volume 51, Issue 2, Pages 927-957
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/110840364
Keywords
sparse optimization; sparse vector recovery; compressed sensing; low-rank matrix recovery; matrix completion; iterative reweighted least squares; l(q) minimization
Funding
- ARL
- ARO [W911NF-09-1-0383]
- AFOSR [FA9550-10-C-0108]
- NSF [DMS-0748839]
- ONR [N00014-08-1-1101]
- Directorate For Engineering [1028790] Funding Source: National Science Foundation
Abstract
In this paper, we first study l(q) minimization and its associated iterative reweighted algorithm for recovering sparse vectors. Unlike most existing work, we focus on unconstrained l(q) minimization, which we show has several advantages in the presence of noisy measurements and/or approximately sparse vectors. Inspired by the results in [Daubechies et al., Comm. Pure Appl. Math., 63 (2010), pp. 1-38] for constrained l(q) minimization, we begin with a preliminary yet novel analysis of unconstrained l(q) minimization, covering convergence, error bounds, and local convergence behavior. The algorithm and analysis are then extended to the recovery of low-rank matrices. Numerical comparisons with several state-of-the-art algorithms show superior performance of the proposed algorithms in recovering both sparse vectors and low-rank matrices.
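The abstract refers to an iterative reweighted least squares (IRLS) scheme for unconstrained l(q) minimization. The following is a minimal sketch of that general idea, not the authors' exact algorithm: each iteration replaces the nonsmooth penalty sum_i |x_i|^q with a smoothed quadratic surrogate, so the update reduces to a weighted ridge regression. The function name `irls_lq`, the regularization parameter `lam`, the smoothing parameter `eps`, and its halving schedule are all illustrative assumptions.

```python
import numpy as np

def irls_lq(A, b, lam=1e-3, q=0.5, eps=1.0, n_iter=50):
    """Sketch of IRLS for the unconstrained problem
        min_x ||Ax - b||_2^2 + lam * sum_i |x_i|^q,  0 < q <= 1.
    Each iteration solves a weighted ridge system; eps smooths the
    penalty near zero and is decreased across iterations (a common
    device in IRLS analyses; the exact schedule here is illustrative).
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # least-squares start
    for _ in range(n_iter):
        # weights from the smoothed penalty (x_i^2 + eps)^(q/2):
        # small entries get large weights and are pushed toward zero
        w = (x**2 + eps) ** (q / 2 - 1)
        # normal equations of the weighted ridge subproblem:
        # (A^T A + lam * diag(w)) x = A^T b
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
        eps = max(eps * 0.5, 1e-12)
    return x

# small demo: recover a 3-sparse vector from noisy random measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 30, 70]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = irls_lq(A, b)
```

With 40 Gaussian measurements of a 3-sparse 100-dimensional vector, the reweighting drives the off-support entries toward zero, which is the mechanism the paper analyzes (with rigorous convergence and error-bound guarantees that this sketch does not reproduce).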