On the construction and training of reformulated radial basis function neural networks

Cited by: 93
Authors
Karayiannis, NB [1]
Randolph-Gips, MM [2]
Affiliations
[1] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77204 USA
[2] Univ Houston Clear Lake, Dept Comp Engn, Houston, TX 77058 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2003, Vol. 14, No. 4
Keywords
absolute sensitivity; active region; blind spot; cosine radial basis function; generator function; gradient descent learning; radial basis function (RBF) neural network; reformulation;
DOI
10.1109/TNN.2003.813841
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper presents a systematic approach for constructing reformulated radial basis function (RBF) neural networks, developed to facilitate their training by supervised learning algorithms based on gradient descent. This approach reduces the construction of RBF models to the selection of admissible generator functions. The selection of generator functions relies on the concept of the blind spot, which is introduced in this paper. The paper also introduces a new family of reformulated RBF neural networks, referred to as cosine radial basis functions. Cosine radial basis functions are constructed from linear generator functions of a special form, and their use as similarity measures in RBF models is justified by their geometric interpretation. Experiments on a variety of datasets indicate that cosine radial basis functions considerably outperform conventional RBF neural networks with Gaussian radial basis functions. Cosine radial basis functions are also strong competitors to existing reformulated RBF models trained by gradient descent and to feedforward neural networks with sigmoid hidden units.
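As a hedged illustration of the abstract's central construction: the cosine radial basis function associated with prototype v_j and width a_j is reported to take the form phi_j(x) = a_j / sqrt(||x - v_j||^2 + a_j^2), obtained from a linear generator function. The Python sketch below compares this response with a conventional Gaussian RBF; the function names, prototypes, and data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cosine_rbf(x, v, a):
    """Assumed cosine RBF response a / sqrt(||x - v||^2 + a^2).

    x: input vector, v: hidden-unit prototype, a: positive width
    parameter. The response lies in (0, 1] and equals 1 iff x == v.
    """
    return a / np.sqrt(np.sum((x - v) ** 2) + a ** 2)

def gaussian_rbf(x, v, sigma):
    """Conventional Gaussian RBF exp(-||x - v||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - v) ** 2) / (2.0 * sigma ** 2))

# Illustrative comparison on made-up data (not from the paper).
rng = np.random.default_rng(0)
x = rng.normal(size=4)   # input vector
v = rng.normal(size=4)   # hidden-unit center
print("cosine RBF response:  ", cosine_rbf(x, v, a=1.0))
print("Gaussian RBF response:", gaussian_rbf(x, v, sigma=1.0))
```

One way to read the geometric interpretation the abstract appeals to: the cosine RBF response equals the cosine of the angle between the augmented vectors (x - v_j, a_j) and (0, a_j), since their inner product is a_j^2 and their norms are sqrt(||x - v_j||^2 + a_j^2) and a_j.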
Pages: 835-846
Page count: 12