On the construction and training of reformulated radial basis function neural networks

Cited by: 93
Authors
Karayiannis, NB [1 ]
Randolph-Gips, MM
Affiliations
[1] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77204 USA
[2] Univ Houston Clear Lake, Dept Comp Engn, Houston, TX 77058 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2003, Vol. 14, No. 4
Keywords
absolute sensitivity; active region; blind spot; cosine radial basis function; generator function; gradient descent learning; radial basis function (RBF) neural network; reformulation;
DOI
10.1109/TNN.2003.813841
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a systematic approach for constructing reformulated radial basis function (RBF) neural networks, developed to facilitate their training by supervised learning algorithms based on gradient descent. The approach reduces the construction of RBF models to the selection of admissible generator functions. The selection of generator functions relies on the concept of the blind spot, which is introduced in this paper. The paper also introduces a new family of reformulated RBF neural networks, referred to as cosine radial basis functions. Cosine radial basis functions are constructed from linear generator functions of a special form, and their use as similarity measures in RBF models is justified by their geometric interpretation. Experiments on a variety of data sets indicate that cosine radial basis functions considerably outperform conventional RBF neural networks with Gaussian radial basis functions. Cosine radial basis functions are also strong competitors to existing reformulated RBF models trained by gradient descent and to feedforward neural networks with sigmoid hidden units.
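The abstract describes the architecture and training scheme without formulas. As a rough illustration only, the sketch below implements a small network with cosine RBF hidden units trained by batch gradient descent on a hypothetical toy problem. The unit form phi_j(x) = a_j / sqrt(||x - v_j||^2 + a_j^2) and its gradients are assumptions drawn from the authors' related publications on reformulated RBF networks, not quoted from this paper; all data and parameter choices are illustrative.

```python
import numpy as np

# Sketch of an RBF network with cosine radial basis function hidden units,
# trained end to end by batch gradient descent. The unit form
#     phi_j(x) = a_j / sqrt(||x - v_j||^2 + a_j^2)
# is an ASSUMPTION based on the authors' related work; phi_j equals the
# cosine of the angle between the augmented vectors (x - v_j, a_j) and
# (0, a_j), which motivates the name "cosine RBF".

rng = np.random.default_rng(0)

# Hypothetical toy problem: 2-D XOR-like two-class data.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

J = 8                                          # number of hidden units
V = rng.uniform(-1.0, 1.0, size=(J, 2))        # prototypes v_j
a = np.ones(J)                                 # reference distances a_j
w = rng.normal(scale=0.1, size=J)              # output weights
b = 0.0                                        # output bias
lr = 0.2

for epoch in range(5000):
    diff = X[:, None, :] - V[None, :, :]       # x - v_j, shape (N, J, 2)
    d2 = (diff ** 2).sum(axis=2)               # squared distances, (N, J)
    Phi = a / np.sqrt(d2 + a ** 2)             # cosine RBF responses in (0, 1]
    p = 1.0 / (1.0 + np.exp(-(Phi @ w + b)))   # sigmoid output unit
    err = p - y                                # dE/d(output) for cross-entropy

    # Chain rule through the cosine RBF layer:
    #     d(phi)/d(v_j) = (phi^3 / a_j^2) * (x - v_j)
    #     d(phi)/d(a_j) =  phi^3 * ||x - v_j||^2 / a_j^3
    delta = err[:, None] * w[None, :]          # (N, J)
    grad_w = Phi.T @ err / len(X)
    grad_b = err.mean()
    grad_V = ((delta * Phi ** 3 / a ** 2)[:, :, None] * diff).mean(axis=0)
    grad_a = (delta * Phi ** 3 * d2 / a ** 3).mean(axis=0)

    w -= lr * grad_w
    b -= lr * grad_b
    V -= lr * grad_V
    a = np.maximum(a - lr * grad_a, 1e-3)      # safeguard: keep a_j > 0

print(f"training accuracy: {((p > 0.5) == (y > 0.5)).mean():.2f}")
```

Note that gradient descent adapts the prototypes v_j, the reference distances a_j, and the output weights jointly, which matches the supervised training scheme the abstract describes; the positivity clamp on a_j is a safeguard added here, not a step taken from the paper.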
Pages: 835-846
Page count: 12