Sparse kernel density construction using orthogonal forward regression with leave-one-out test score and local regularization

Cited: 46
Authors
Chen, S [1]
Hong, X
Harris, CJ
Affiliations
[1] Univ Southampton, Sch Elect & Comp Sci, Southampton SO17 1BJ, Hants, England
[2] Univ Reading, Dept Cybernet, Reading RG6 6AY, Berks, England
Source
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS | 2004, Vol. 34, No. 4
Keywords
cross validation; leave-one-out test score; orthogonal least squares; Parzen window estimate; probability density function; regularization; sparse kernel modeling;
DOI
10.1109/TSMCB.2004.828199
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline Code
0812 [Computer science and technology];
Abstract
This paper presents an efficient construction algorithm for obtaining sparse kernel density estimates based on a regression approach that directly optimizes model generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. A local regularization method is incorporated naturally into the density construction process to further enforce sparsity. An additional advantage of the proposed algorithm is that it is fully automatic: the user is not required to specify any criterion to terminate the density construction procedure. This is in contrast to an existing state-of-the-art kernel density estimation method using the support vector machine (SVM), where the user is required to specify some critical algorithm parameter. Several examples are included to demonstrate the ability of the proposed algorithm to effectively construct a very sparse kernel density estimate with accuracy comparable to that of the full-sample optimized Parzen window density estimate. Our experimental results also demonstrate that the proposed algorithm compares favorably with the SVM method, in terms of both test accuracy and sparsity, for constructing kernel density estimates.
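The general idea the abstract describes — replacing the full Parzen window (one kernel per data point) with a small, greedily selected subset of kernel centres, scored by a leave-one-out criterion that also serves as an automatic stopping rule — can be illustrated with a simplified sketch. This is not the paper's orthogonal forward regression with PRESS statistics and local regularization; it is a hypothetical, plain greedy forward selection driven by a leave-one-out log-likelihood, shown only to make the sparse-versus-full trade-off concrete. All function names (`gauss_kernel`, `loo_ll`, `sparse_kde`) are illustrative, not from the paper.

```python
# Simplified sketch (NOT the paper's algorithm): greedy forward selection
# of kernel centres for a sparse 1-D density estimate, scored by a
# leave-one-out (LOO) log-likelihood. Selection stops automatically when
# adding a centre no longer improves the LOO score.
import numpy as np

def gauss_kernel(x, centres, h):
    """Gaussian kernels of width h, evaluated at 1-D points x (N x m)."""
    d = x[:, None] - centres[None, :]
    return np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

def loo_ll(data, sel, h):
    """Mean LOO log-likelihood of a uniform mixture over selected centres."""
    idx = np.where(sel)[0]
    m = len(idx)
    K = gauss_kernel(data, data[idx], h)          # N x m kernel matrix
    for j, i in enumerate(idx):                   # a centre point must not
        K[i, j] = 0.0                             # score its own kernel
    denom = np.full(len(data), float(m))
    denom[idx] = max(m - 1, 1)                    # centre points average over m-1
    dens = np.clip(K.sum(axis=1) / denom, 1e-300, None)
    return float(np.log(dens).mean())

def sparse_kde(data, h, max_centres=10):
    """Greedily pick kernel centres; stop when the LOO score stops improving."""
    sel = np.zeros(len(data), dtype=bool)
    best = -np.inf
    while sel.sum() < max_centres:
        gains = np.full(len(data), -np.inf)
        for i in np.where(~sel)[0]:               # try each unused point
            trial = sel.copy()
            trial[i] = True
            gains[i] = loo_ll(data, trial, h)
        i_best = int(np.argmax(gains))
        if gains[i_best] <= best:                 # automatic termination
            break
        best = gains[i_best]
        sel[i_best] = True
    return data[sel]

rng = np.random.default_rng(0)
data = rng.normal(size=200)
centres = sparse_kde(data, h=0.5, max_centres=10)
print(f"{len(centres)} of {len(data)} points kept as kernel centres")
```

In this toy version the LOO score plays the same two roles the abstract attributes to the leave-one-out test score: it ranks candidate centres and it terminates the construction without a user-supplied threshold. The paper achieves this far more efficiently via orthogonal decomposition, which avoids re-fitting from scratch at every candidate step.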
Pages: 1708-1717
Page count: 10