Sparse Bayesian learning for basis selection

Cited by: 1123
Authors:
Wipf, DP [1]
Rao, BD [1]
Affiliation:
[1] Univ Calif San Diego, Dept Elect & Comp Engn, La Jolla, CA 92092 USA
Keywords:
basis selection; diversity measures; linear inverse problems; sparse Bayesian learning; sparse representations
DOI:
10.1109/TSP.2004.831016
Chinese Library Classification (CLC):
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes:
0808; 0809
Abstract:
Sparse Bayesian learning (SBL) and specifically relevance vector machines have received much attention in the machine learning literature as a means of achieving parsimonious representations in the context of regression and classification. The methodology relies on a parameterized prior that encourages models with few nonzero weights. In this paper, we adapt SBL to the signal processing problem of basis selection from overcomplete dictionaries, proving several results about the SBL cost function that elucidate its general behavior and provide solid theoretical justification for this application. Specifically, we show that SBL retains a desirable property of the l(0)-norm diversity measure (i.e., the global minimum is achieved at the maximally sparse solution) while often possessing a more limited constellation of local minima. We also demonstrate that the local minima that do exist are achieved at sparse solutions. We then provide a novel interpretation of SBL that gives us valuable insight into why it is successful in producing sparse representations. Finally, we include simulation studies comparing sparse Bayesian learning with Basis Pursuit and the more recent FOCal Underdetermined System Solver (FOCUSS) class of basis selection algorithms. These results indicate that our theoretical insights translate directly into improved performance.
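As a rough illustration of the approach the abstract describes, the following is a minimal sketch of EM-style sparse Bayesian learning for basis selection. It is not the paper's implementation: the prior variances `gamma`, the noise level `sigma2`, the pruning threshold, and the iteration count are all illustrative assumptions. Each weight gets a zero-mean Gaussian prior whose variance is learned; variances driven toward zero effectively remove the corresponding dictionary columns, yielding a sparse representation.

```python
import numpy as np

def sbl_basis_selection(Phi, y, sigma2=1e-3, n_iters=200, prune_tol=1e-6):
    """Sketch of EM updates for sparse Bayesian learning on y ~ Phi @ w.

    Phi : (n, m) overcomplete dictionary (m > n).
    Returns the posterior mean weights and learned prior variances gamma.
    """
    n, m = Phi.shape
    gamma = np.ones(m)          # hyperparameters: prior variance of each weight
    mu = np.zeros(m)
    for _ in range(n_iters):
        keep = gamma > prune_tol            # prune effectively-zero weights
        Phi_k = Phi[:, keep]
        # Posterior covariance and mean of the surviving weights
        Sigma = np.linalg.inv(Phi_k.T @ Phi_k / sigma2
                              + np.diag(1.0 / gamma[keep]))
        mu_k = Sigma @ Phi_k.T @ y / sigma2
        # EM update: gamma_i <- E[w_i^2] under the posterior
        gamma = np.zeros(m)
        gamma[keep] = mu_k ** 2 + np.diag(Sigma)
        mu = np.zeros(m)
        mu[keep] = mu_k
    return mu, gamma
```

On a noiseless problem with a genuinely sparse generating vector, the iteration typically concentrates the nonzero `gamma` entries on the true support, which is the behavior the paper's local-minima results help explain.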
Pages: 2153-2164
Page count: 12
References:
36 in total
[11]   Proportionate normalized least-mean-squares adaptation in echo cancelers [J].
Duttweiler, DL .
IEEE TRANSACTIONS ON SPEECH AND AUDIO PROCESSING, 2000, 8 (05) :508-518
[12]  
Faul AC, 2002, ADV NEUR IN, V14, P383
[13]   Reduced complexity decision feedback equalization for multipath channels with large delay spreads [J].
Fevrier, IJ ;
Gelfand, SB ;
Fitz, MP .
IEEE TRANSACTIONS ON COMMUNICATIONS, 1999, 47 (06) :927-937
[14]   A variational method for learning sparse and overcomplete representations [J].
Girolami, M .
NEURAL COMPUTATION, 2001, 13 (11) :2517-2532
[15]   NEUROMAGNETIC SOURCE IMAGING WITH FOCUSS - A RECURSIVE WEIGHTED MINIMUM NORM ALGORITHM [J].
GORODNITSKY, IF ;
GEORGE, JS ;
RAO, BD .
ELECTROENCEPHALOGRAPHY AND CLINICAL NEUROPHYSIOLOGY, 1995, 95 (04) :231-251
[16]   Sparse signal reconstruction from limited data using FOCUSS: A re-weighted minimum norm algorithm [J].
Gorodnitsky, IF ;
Rao, BD .
IEEE TRANSACTIONS ON SIGNAL PROCESSING, 1997, 45 (03) :600-616
[17]  
Herbrich R., 2002, LEARNING KERNEL CLAS
[18]  
Horn R., 1985, Matrix Analysis, DOI [DOI 10.1017/CBO9780511810817, 10.1017/CBO9780511810817]
[19]  
Jeffs BD, 1998, INT CONF ACOUST SPEE, P1885, DOI 10.1109/ICASSP.1998.681832
[20]   Restoration of blurred star field images by maximally sparse optimization [J].
Jeffs, BD ;
Gunsay, M .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 1993, 2 (02) :202-211