Random Projections of Smooth Manifolds

Cited by: 256
Authors
Baraniuk, Richard G. [1 ]
Wakin, Michael B. [2 ]
Affiliations
[1] Rice Univ, Dept Elect & Comp Engn, Houston, TX 77005 USA
[2] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
Funding
National Science Foundation (USA);
Keywords
Manifolds; dimensionality reduction; random projections; compressed sensing; sparsity; manifold learning; Johnson-Lindenstrauss lemma; signal reconstruction; image manifolds; neighborliness; eigenmaps; recovery;
DOI
10.1007/s10208-007-9011-z
Chinese Library Classification
TP301 [Theory and Methods];
Discipline code
081202;
Abstract
We propose a new approach for nonadaptive dimensionality reduction of manifold-modeled data, demonstrating that a small number of random linear projections can preserve key information about a manifold-modeled signal. We center our analysis on the effect of a random linear projection operator Phi : R^N -> R^M, M < N, on a smooth, well-conditioned K-dimensional submanifold M of R^N. As our main theoretical contribution, we establish a sufficient number M of random projections to guarantee that, with high probability, all pairwise Euclidean and geodesic distances between points on M are well preserved under the mapping Phi. Our results bear strong resemblance to the emerging theory of Compressed Sensing (CS), in which sparse signals can be recovered from small numbers of random linear measurements. As in CS, the random measurements we propose can be used to recover the original data in R^N. Moreover, like the fundamental bound in CS, our requisite M is linear in the "information level" K and logarithmic in the ambient dimension N; we also identify a logarithmic dependence on the volume and conditioning of the manifold. In addition to recovering faithful approximations to manifold-modeled signals, the random projections we propose can also be used to discern key properties of the manifold. We discuss connections and contrasts with existing techniques in manifold learning, a setting where dimensionality-reducing mappings are typically nonlinear and constructed adaptively from a set of sampled training data.
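The abstract's central claim can be illustrated numerically. The following is a minimal sketch (not the authors' code; dimensions, sample sizes, and the choice of a circle as the test manifold are illustrative assumptions): points sampled from a smooth low-dimensional manifold embedded in R^N are mapped through a random Gaussian projection Phi into R^M, and the ratios of pairwise Euclidean distances before and after projection concentrate near 1 even though M is much smaller than N.

```python
# Minimal numerical sketch of the paper's claim: a random linear map
# Phi : R^N -> R^M approximately preserves pairwise Euclidean distances
# between points on a low-dimensional manifold. Parameter choices here
# (N, M, number of samples) are illustrative assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(0)
N, M = 500, 40  # ambient and projected dimensions

# Sample 100 points on a smooth 1-D manifold in R^N: a circle lying
# in a random 2-D plane of the ambient space.
t = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
plane, _ = np.linalg.qr(rng.standard_normal((N, 2)))        # orthonormal 2-D basis
points = np.column_stack([np.cos(t), np.sin(t)]) @ plane.T  # shape (100, N)

# Random Gaussian projection, scaled so squared lengths are preserved on average.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
projected = points @ Phi.T  # shape (100, M)

def pairwise_distances(x):
    """All pairwise Euclidean distances (upper triangle, no diagonal)."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return d[np.triu_indices(len(x), k=1)]

# Ratios of projected to original distances; concentration near 1 is the
# distance-preservation property the paper quantifies.
ratio = pairwise_distances(projected) / pairwise_distances(points)
print(ratio.min(), ratio.max())
```

Note the isometry here is only approximate: the spread of the ratios shrinks as M grows, and the paper's result bounds how large M must be (linear in K, logarithmic in N and in the manifold's volume and conditioning) for a prescribed distortion to hold uniformly over the whole manifold.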
Pages: 51-77 (27 pages)