Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 55, Issue 12, Pages 5773-5782
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2009.2032712
Keywords
Best k-term approximation; compressed sensing (CS); cross validation; encoding/decoding; error estimates; Johnson-Lindenstrauss (JL) lemma; measurements
Funding
- National Science Foundation (NSF)
Abstract
Compressed sensing (CS) decoding algorithms can efficiently recover an N-dimensional real-valued vector x to within a factor of its best k-term approximation by taking m = O(k log(N/k)) measurements y = Φx. If the sparsity or approximate sparsity level of x were known, then this theoretical guarantee would imply quality assurance of the resulting CS estimate. However, because the underlying sparsity of the signal x is unknown, the quality of a CS estimate x̂ computed from m measurements is not assured. It is nevertheless shown in this paper that sharp bounds on the error ‖x − x̂‖_{ℓ₂^N} can be achieved with almost no effort. More precisely, suppose that a maximum number of measurements m is preimposed. One can reserve 10 log p of these m measurements and compute a sequence of possible estimates (x̂_j)_{j=1}^p of x from the m − 10 log p remaining measurements; the errors ‖x − x̂_j‖_{ℓ₂^N} for j = 1, ..., p can then be bounded with high probability. As a consequence, numerical upper and lower bounds on the error between x and the best k-term approximation to x can be estimated for p values of k at almost no cost. This observation has applications outside CS as well.
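The holdout scheme the abstract describes can be sketched in a few lines of NumPy. This is a hedged illustration, not the paper's implementation: the problem sizes, the Gaussian measurement matrix, and the simple threshold-and-least-squares decoder are all stand-in assumptions; a real CS pipeline would use an ℓ1 or greedy decoder. The point shown is the cross-validation idea itself: because the reserved rows of Φ act as a Johnson-Lindenstrauss embedding, the observable holdout residual ‖y_cv − Φ_cv x̂_j‖ concentrates around the unobservable error ‖x − x̂_j‖, so the p candidate estimates can be compared without knowing x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper).
N, m, p = 1000, 200, 4
m_cv = int(10 * np.log(p))        # measurements reserved for cross validation
m_rec = m - m_cv                  # measurements used for reconstruction

# Sparse test signal and Gaussian measurement matrix Phi.
# Rows are scaled by 1/sqrt(#rows) so that E ||Phi_cv e||^2 = ||e||^2.
x = np.zeros(N)
x[rng.choice(N, 20, replace=False)] = rng.standard_normal(20)
Phi = rng.standard_normal((m, N))
Phi_rec = Phi[:m_rec] / np.sqrt(m_rec)
Phi_cv = Phi[m_rec:] / np.sqrt(m_cv)
y_rec, y_cv = Phi_rec @ x, Phi_cv @ x

def estimate(k):
    """Toy k-sparse decoder: keep the k largest matched-filter entries,
    then least-squares fit on that support. A placeholder for a real
    CS decoder (l1 minimization, greedy pursuit, etc.)."""
    proxy = Phi_rec.T @ y_rec
    support = np.argsort(np.abs(proxy))[-k:]
    xhat = np.zeros(N)
    xhat[support] = np.linalg.lstsq(Phi_rec[:, support], y_rec, rcond=None)[0]
    return xhat

# For each candidate sparsity level k, the holdout residual (computable)
# tracks the true error (uncomputable in practice, known here for the demo).
for k in (5, 10, 20, 40):
    xhat = estimate(k)
    holdout = np.linalg.norm(y_cv - Phi_cv @ xhat)
    true_err = np.linalg.norm(x - xhat)
    print(f"k={k:3d}  holdout residual={holdout:.3f}  true error={true_err:.3f}")
```

Note that only 10 log p ≈ 13 of the 200 measurements are spent on validation here, matching the abstract's claim that the error bounds come "with almost no cost" in the measurement budget.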