A variational method for learning sparse and overcomplete representations

Cited by: 95
Authors
Girolami, M [1]
Affiliation
[1] Aalto Univ, Lab Comp & Informat Sci, FIN-02150 Espoo, Finland
DOI
10.1162/089976601753196003
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
An expectation-maximization algorithm for learning sparse and overcomplete data representations is presented. The proposed algorithm exploits a variational approximation to a range of heavy-tailed distributions whose limit is the Laplacian. A rigorous lower bound on the sparse prior distribution is derived, which enables the analytic marginalization of a lower bound on the data likelihood. This bound yields an expectation-maximization algorithm for learning the overcomplete basis vectors and inferring the most probable basis coefficients.
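The variational step sketched in the abstract can be illustrated with the standard Gaussian lower bound on the Laplacian density: since |s| <= s^2/(2*xi) + xi/2 for any xi > 0 (arithmetic-geometric mean inequality, tight at xi = |s|), one has exp(-|s|) >= exp(-s^2/(2*xi) - xi/2). Bounds of this family make the prior effectively Gaussian for fixed xi, which is what permits the analytic marginalization mentioned above. The Python sketch below (an illustration of this bound family, not the paper's exact derivation; the function names are ours) verifies the bound numerically.

    import numpy as np

    def laplace_density(s):
        # Standard Laplacian density p(s) = 0.5 * exp(-|s|).
        return 0.5 * np.exp(-np.abs(s))

    def gaussian_lower_bound(s, xi):
        # Gaussian variational bound: |s| <= s**2/(2*xi) + xi/2 for xi > 0,
        # hence 0.5 * exp(-s**2/(2*xi) - xi/2) <= p(s), equality at xi = |s|.
        return 0.5 * np.exp(-s**2 / (2.0 * xi) - xi / 2.0)

    s = np.linspace(-4.0, 4.0, 201)
    for xi in (0.5, 1.0, 2.0):
        assert np.all(gaussian_lower_bound(s, xi) <= laplace_density(s) + 1e-12)

    # The bound touches the density where the variational parameter equals |s|.
    s0 = 1.7
    print(laplace_density(s0), gaussian_lower_bound(s0, abs(s0)))  # identical values

In an EM scheme built on such a bound, each coefficient's xi acts as a variational parameter updated in the E-step, while the Gaussian form of the bound keeps the basis-vector updates in the M-step in closed form.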
Pages: 2517-2532
Number of pages: 16