An orthogonal forward regression technique for sparse kernel density estimation

Cited by: 40
Authors
Chen, S. [1 ]
Hong, X. [2 ]
Harris, C. J. [1 ]
Affiliations
[1] Univ Southampton, Sch Elect & Comp Sci, Southampton SO17 1BJ, Hants, England
[2] Univ Reading, Sch Syst Engn, Reading RG6 6AY, Berks, England
Keywords
probability density function; Parzen window estimate; sparse kernel modelling; orthogonal forward regression; cross validation; regularisation; multiplicative nonnegative quadratic programming;
DOI
10.1016/j.neucom.2007.02.008
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104 [Pattern Recognition and Intelligent Systems]; 0812 [Computer Science and Technology]; 0835 [Software Engineering]; 1405 [Intelligent Science and Technology]
Abstract
Using the classical Parzen window (PW) estimate as the desired response, kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density (SKD) estimates. The proposed algorithm incrementally minimises a leave-one-out test score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights of the selected sparse model are finally updated using the multiplicative nonnegative quadratic programming algorithm, which enforces the nonnegativity and unity constraints on the kernel weights and can further reduce the model size. Apart from the kernel width, the proposed method has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Several examples demonstrate the ability of this simple regression-based approach to construct an SKD estimate with accuracy comparable to that of the full-sample optimised PW density estimate. (c) 2007 Elsevier B.V. All rights reserved.
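The abstract describes a two-stage construction: a regression fit against the Parzen window target, followed by a multiplicative nonnegative quadratic programming (MNQP) update of the kernel weights under nonnegativity and sum-to-one constraints. Below is a minimal Python sketch of that idea, assuming Gaussian kernels centred on the data samples; it omits the orthogonal forward selection with the leave-one-out score and local regularisation, and the function names, kernel width, and pruning tolerance are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def gaussian_design(x, centres, width):
    # Normalised Gaussian kernels evaluated at each point for each centre.
    diff = x[:, None, :] - centres[None, :, :]          # shape (N, M, dim)
    dim = x.shape[1]
    norm = (2.0 * np.pi * width ** 2) ** (dim / 2.0)
    return np.exp(-np.sum(diff ** 2, axis=2) / (2.0 * width ** 2)) / norm

def sparse_kde_mnqp(x, width, n_iter=200, prune_tol=1e-6):
    # Target: the classical Parzen window estimate evaluated at the sample points.
    phi = gaussian_design(x, x, width)                  # one candidate kernel per sample
    d = phi.mean(axis=1)                                # Parzen window "desired response"
    B = phi.T @ phi                                     # regression Gram matrix (nonnegative)
    v = phi.T @ d
    beta = np.full(x.shape[0], 1.0 / x.shape[0])        # uniform starting weights
    for _ in range(n_iter):
        c = beta / (B @ beta + 1e-12)                   # multiplicative factor
        h = (1.0 - c @ v) / c.sum()                     # multiplier enforcing sum-to-one
        beta = np.clip(c * (v + h), 0.0, None)          # MNQP update, clipped as a safeguard
        beta /= beta.sum()
    keep = beta > prune_tol                             # many weights are driven to (near) zero
    return x[keep], beta[keep] / beta[keep].sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(size=(200, 1))
    centres, weights = sparse_kde_mnqp(sample, width=0.4)
    print(f"retained {len(centres)} of {len(sample)} candidate kernels")
```

In the paper's full procedure the candidate kernel set is first thinned by orthogonal forward regression, so the MNQP step operates on far fewer kernels than the N candidates used in this sketch.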
Pages: 931-943
Page count: 13
Related Papers
25 in total
[1]
Akaike, H. A new look at the statistical model identification. IEEE Transactions on Automatic Control, 1974, AC-19(6): 716-723.
[2]
Bishop, C. M. Neural Networks for Pattern Recognition. 1995.
[3]
Chen, S.; Billings, S. A.; Luo, W. Orthogonal least squares methods and their application to non-linear system identification. International Journal of Control, 1989, 50(5): 1873-1896.
[4]
Chen, S. Local regularization assisted orthogonal least squares regression. Neurocomputing, 2006, 69(4-6): 559-585.
[5]
Chen, S.; Hong, X.; Harris, C. J. Sparse kernel density construction using orthogonal forward regression with leave-one-out test score and local regularization. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2004, 34(4): 1708-1717.
[6]
Chen, S.; Hong, X.; Harris, C. J. Sparse kernel regression modeling using combined locally regularized orthogonal least squares and D-optimality experimental design. IEEE Transactions on Automatic Control, 2003, 48(6): 1029-1036.
[7]
Chen, S.; Hong, X.; Harris, C. J.; Sharkey, P. M. Sparse modeling using orthogonal forward regression with PRESS statistic and regularization. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2004, 34(2): 898-911.
[8]
Choudhury, A. Thesis, University of Southampton, 2002.
[9]
Girolami, M.; He, C. Probability density estimation from optimally condensed data samples. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003, 25(10): 1253-1264.
[10]
Hansen, L. K.; Larsen, J. Linear unlearning for cross-validation. Advances in Computational Mathematics, 1996, 5(2-3): 269-280.