Bayesian regression filters and the issue of priors

Cited: 11
Authors
Zhu, HY
Rohwer, R
Affiliation
[1] Neural Computing Research Group, Dept. of Comp. Sci. and Appl. Math., Aston University, Birmingham
Keywords
approximation; Bayesian method; Kalman filter; online learning; prior selection; radial basis functions; regression
DOI
10.1007/BF01414873
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We propose a Bayesian framework for regression problems, which covers areas usually dealt with by function approximation. An online learning algorithm is derived which solves regression problems with a Kalman filter. Its solution always improves with increasing model complexity, without the risk of over-fitting. In the infinite dimension limit it approaches the true Bayesian posterior. The issues of prior selection and over-fitting are also discussed, showing that some of the commonly held beliefs are misleading. The practical implementation is summarised. Simulations using 13 popular publicly available data sets are used to demonstrate the method and highlight important issues concerning the choice of priors.
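The abstract's core idea — online Bayesian regression via a Kalman filter — can be illustrated with a standard recursive update for a linear-in-weights model with radial basis functions. This is a minimal sketch of the generic technique, not the authors' exact algorithm; the function names, the RBF centers/width, and the prior and noise variances are all illustrative assumptions.

```python
import numpy as np

def rbf_features(x, centers, width):
    # Gaussian radial basis functions evaluated at a scalar input x.
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

def kalman_regression(xs, ys, centers, width, prior_var=10.0, noise_var=0.1):
    """Recursive Bayesian linear regression (a static-state Kalman filter).

    Maintains a Gaussian posterior N(mean, cov) over the RBF weights and
    updates it one observation at a time; prior_var encodes the prior
    whose choice the paper discusses.
    """
    d = len(centers)
    mean = np.zeros(d)            # prior mean over weights
    cov = prior_var * np.eye(d)   # prior covariance over weights
    for x, y in zip(xs, ys):
        h = rbf_features(x, centers, width)   # observation vector
        s = h @ cov @ h + noise_var           # innovation variance
        k = cov @ h / s                       # Kalman gain
        mean = mean + k * (y - h @ mean)      # posterior mean update
        cov = cov - np.outer(k, h @ cov)      # posterior covariance update
    return mean, cov

# Fit noisy samples of sin(x) online, then predict at x = pi/2.
rng = np.random.default_rng(0)
xs = rng.uniform(0.0, 2 * np.pi, 200)
ys = np.sin(xs) + 0.05 * rng.normal(size=200)
centers = np.linspace(0.0, 2 * np.pi, 20)
mean, cov = kalman_regression(xs, ys, centers, width=0.5)
pred = rbf_features(np.pi / 2, centers, 0.5) @ mean
```

Because the model is linear in its weights with Gaussian noise, this recursive pass yields the same posterior as batch Bayesian linear regression, which is what makes the Kalman-filter view of online regression exact in this setting.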
Pages: 130-142
Page count: 13