Compressed Sensing over the Grassmann Manifold: A Unified Analytical Framework

Cited by: 26
Authors
Xu, Weiyu [1 ]
Hassibi, Babak [1 ]
Affiliation
[1] CALTECH, Dept Elect Engn, Pasadena, CA 91125 USA
Source
2008 46TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING, VOLS 1-3 | 2008
Keywords
compressed sensing; basis pursuit; l(1)-optimization; k-balancedness; Grassmann manifold; Grassmann angle; high-dimensional integral geometry; geometric probability; convex polytopes; random linear subspaces; neighborly polytopes;
DOI
10.1109/ALLERTON.2008.4797608
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
It is well known that compressed sensing problems reduce to finding the sparse solutions of large under-determined systems of equations. Although finding sparse solutions is in general computationally difficult, starting with the seminal work of [2] it has been shown that linear programming techniques, obtained from an l(1)-norm relaxation of the original non-convex problem, can provably find the unknown vector in certain instances. In particular, using a certain restricted isometry property, [2] shows that for measurement matrices chosen from a random Gaussian ensemble, l(1) optimization can find the correct solution with overwhelming probability even when the support size of the unknown vector is proportional to its dimension. The paper [1] uses results on neighborly polytopes from [6] to give a "sharp" bound on what this proportionality should be in the Gaussian measurement ensemble. In this paper we focus on finding sharp bounds on the recovery of "approximately sparse" signals (possibly under noisy measurements). While the restricted isometry property can be used to study the recovery of approximately sparse signals (and also in the presence of noisy measurements), the resulting bounds can be quite loose. On the other hand, the neighborly polytopes technique, which yields sharp bounds for ideally sparse signals, does not generalize to approximately sparse signals. In this paper, starting from a necessary and sufficient condition for achieving a certain signal recovery accuracy, and using high-dimensional geometry, we give a unified null-space Grassmann angle-based analytical framework for compressed sensing. This new framework gives sharp quantitative tradeoffs between the signal sparsity and the recovery accuracy of l(1) optimization for approximately sparse signals. As it turns out, the neighborly polytopes result of [1] for ideally sparse signals can be viewed as a special case of ours. Our result concerns fundamental properties of linear subspaces and so may be of independent mathematical interest.
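The l(1)-norm relaxation discussed in the abstract can be illustrated with a small numerical sketch. The following is not taken from the paper or its Grassmann-angle framework; the problem sizes, random seed, and the use of SciPy's linprog are assumptions for illustration only. It poses basis pursuit, minimize ||x||_1 subject to A x = y, as a linear program over a toy random Gaussian measurement ensemble and recovers an ideally sparse vector:

# Illustrative sketch: basis pursuit (l1 minimization) as a linear program.
# All sizes and the use of scipy.optimize.linprog are assumptions, not from the paper.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10          # ambient dimension, measurements, support size (toy values)

# Random Gaussian measurement matrix, as in the ensemble discussed in the abstract.
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Ideally k-sparse ground truth and its noiseless measurements y = A x.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true

# LP reformulation: variables z = [x, t]; minimize sum(t)
# subject to A x = y and -t <= x <= t (so t bounds |x| elementwise).
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros((m, n))])
I = np.eye(n)
A_ub = np.vstack([np.hstack([I, -I]), np.hstack([-I, -I])])
b_ub = np.zeros(2 * n)
bounds = [(None, None)] * n + [(0, None)] * n   # x free, t >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=bounds, method="highs")
x_hat = res.x[:n]
print("l2 recovery error:", np.linalg.norm(x_hat - x_true))

For sizes in this regime the l(1) solution typically coincides with the sparse ground truth; the paper characterizes exactly when such recovery succeeds, and with what accuracy for approximately sparse signals, via Grassmann angles of random linear subspaces.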
Pages: 562-567
Number of pages: 6
References
27 in total
[1] Affentranger, F.; Schneider, R. Random projections of regular simplices. Discrete & Computational Geometry, 1992, 7(3): 219-226.
[2] [Anonymous]. Graduate Texts in Mathematics, 2003.
[3] Baraniuk, R. Constructive Approximation, in press.
[4] Boothby, W.M. An Introduction to Differentiable Manifolds and Riemannian Geometry, 1986.
[5] Böröczky, K. Archiv der Mathematik, 1999, 73: 465.
[6] Candès, E.J. Proceedings of the International Congress of Mathematicians, 2006, 17: 1433. DOI: 10.4171/022.
[7] Candès, E.J.; Romberg, J.; Tao, T. Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Transactions on Information Theory, 2006, 52(2): 489-509.
[8] Candès, E.J.; Tao, T. Decoding by linear programming. IEEE Transactions on Information Theory, 2005, 51(12): 4203-4215.
[9] Cohen, A. Compressed sensing and best k-term approximation, 2006.
[10] Donoho, D.L. High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension. Discrete & Computational Geometry, 2006, 35(4): 617-652.