A Growing and Pruning Method for Radial Basis Function Networks

Cited by: 69
Authors
Bortman, M. [1]
Aladjem, M. [1]
Affiliations
[1] Ben Gurion Univ Negev, Dept Elect & Comp Engn, IL-84105 Beer Sheva, Israel
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2009, Vol. 20, No. 6
Keywords
Gaussian mixture model (GMM); growing and pruning algorithms; radial basis function (RBF) neural networks; resource-allocating network (RAN); sequential function approximation; SELF-ORGANIZATION; REGULARIZATION; ALGORITHM; MIXTURES; CODES;
DOI
10.1109/TNN.2009.2019270
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
A recently published generalized growing and pruning (GGAP) training algorithm for radial basis function (RBF) neural networks is studied and modified. GGAP is a resource-allocating network (RAN) algorithm, which means that a created network unit that consistently makes little contribution to the network's performance can be removed during training. GGAP provides a formula for computing the significance of the network units, which requires a d-fold numerical integration for an arbitrary probability density function p(x) of the input data x (x ∈ R^d). In this work, the GGAP formula is approximated using a Gaussian mixture model (GMM) for p(x), and an analytical solution of the approximated unit significance is derived. This makes it possible to apply the modified GGAP to input data having a complex, high-dimensional p(x), which was not possible with the original GGAP. The results of an extensive experimental study show that the modified algorithm outperforms the original GGAP, achieving both a lower prediction error and a reduced complexity of the trained network.
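The key enabler described in the abstract is that the expectation of a Gaussian RBF activation under a GMM density has a closed form, via the standard Gaussian integral identity, so no d-fold numerical integration is needed. The sketch below illustrates that building block only; the function name, variable names, and test values are illustrative assumptions, not the paper's exact significance formula or notation.

```python
import numpy as np

def expected_activation(c, sigma, weights, mus, covs):
    # E_p[phi(x)] for phi(x) = exp(-||x - c||^2 / (2 sigma^2)) when p(x) is a GMM.
    # Uses the Gaussian integral identity:
    #   int phi(x) N(x; mu, S) dx = sigma^d |sigma^2 I + S|^{-1/2}
    #                               * exp(-0.5 (c-mu)^T (sigma^2 I + S)^{-1} (c-mu))
    d = c.shape[0]
    total = 0.0
    for w, mu, S in zip(weights, mus, covs):
        A = sigma**2 * np.eye(d) + S          # combined covariance
        diff = c - mu
        total += w * sigma**d / np.sqrt(np.linalg.det(A)) \
                 * np.exp(-0.5 * diff @ np.linalg.solve(A, diff))
    return total

# Illustrative 2-D setup (arbitrary values, not from the paper)
rng = np.random.default_rng(0)
c, sigma = np.array([0.5, -0.3]), 0.8
weights = [0.4, 0.6]
mus = [np.zeros(2), np.ones(2)]
covs = [0.5 * np.eye(2), np.diag([0.3, 0.7])]

closed = expected_activation(c, sigma, weights, mus, covs)

# Monte Carlo cross-check of the closed form
X = np.vstack([rng.multivariate_normal(m, S, size=int(2e5 * w))
               for w, m, S in zip(weights, mus, covs)])
mc = np.exp(-np.sum((X - c) ** 2, axis=1) / (2 * sigma**2)).mean()
```

In GGAP-style pruning, a unit's significance scales with a quantity of this kind weighted by the unit's output coefficient, so replacing the Monte Carlo/numerical estimate with the closed form is what removes the cost of d-fold integration for high-dimensional inputs.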
Pages: 1039-1045
Page count: 7
References (35 in total)
[1]   Projection pursuit mixture density estimation [J].
Aladjem, M.
IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2005, 53 (11) :4376-4383
[2]  
Bishop C. M., 2009, Pattern Recognition and Machine Learning
[3]  
BISHOP CM, 2001, Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence, p. 46
[4]   Advances in mixture models [J].
Boehning, Dankmar ;
Seidel, Wilfried ;
Alfo, Marco ;
Garel, Bernard ;
Patilea, Valentin ;
Walther, Gunther .
COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2007, 51 (11) :5205-5210
[5]   A MASSIVELY PARALLEL ARCHITECTURE FOR A SELF-ORGANIZING NEURAL PATTERN-RECOGNITION MACHINE [J].
CARPENTER, GA ;
GROSSBERG, S .
COMPUTER VISION GRAPHICS AND IMAGE PROCESSING, 1987, 37 (01) :54-115
[6]   ART-2 - SELF-ORGANIZATION OF STABLE CATEGORY RECOGNITION CODES FOR ANALOG INPUT PATTERNS [J].
CARPENTER, GA ;
GROSSBERG, S .
APPLIED OPTICS, 1987, 26 (23) :4919-4930
[7]   Local regularization assisted orthogonal least squares regression [J].
Chen, S .
NEUROCOMPUTING, 2006, 69 (4-6) :559-585
[8]   Sparse modeling using orthogonal forward regression with PRESS statistic and regularization [J].
Chen, S ;
Hong, X ;
Harris, CJ ;
Sharkey, PM .
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2004, 34 (02) :898-911
[9]   Unsupervised learning of Gaussian mixtures based on variational component splitting [J].
Constantinopoulos, Constantinos ;
Likas, Aristidis .
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18 (03) :745-755
[10]   An incremental training method for the probabilistic RBF network [J].
Constantinopoulos, Constantinos ;
Likas, Aristidis .
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2006, 17 (04) :966-974