MELM-GRBF: A modified version of the extreme learning machine for generalized radial basis function neural networks

Cited by: 60
Authors
Fernandez-Navarro, Francisco [1 ]
Hervas-Martinez, Cesar [1 ]
Sanchez-Monedero, Javier [1 ]
Antonio Gutierrez, Pedro [1 ]
Affiliations
[1] Univ Cordoba, Dept Comp Sci & Numer Anal, Cordoba 14074, Spain
Keywords
Generalized radial basis function neural networks; Extreme learning machine; Multi-classification; Generalized Gaussian distribution; APPROXIMATION; CLASSIFICATION;
DOI
10.1016/j.neucom.2010.11.032
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification code
140502 [Artificial intelligence];
Abstract
In this paper, we propose a methodology for training a new model of artificial neural network called the generalized radial basis function (GRBF) neural network. This model is based on the generalized Gaussian distribution, which parametrizes the Gaussian distribution by adding a new parameter tau. The generalized radial basis function allows different radial basis functions to be represented by updating the new parameter tau. For example, when the GRBF takes a value of tau = 2, it represents the standard Gaussian radial basis function. The model parameters are optimized through a modified version of the extreme learning machine (ELM) algorithm. In the proposed methodology (MELM-GRBF), the centers of each GRBF were taken randomly from the patterns of the training set, and the radius and tau values were determined analytically, taking into account that the model must fulfil two constraints: locality and coverage. A thorough experimental study is presented to test its overall performance. Fifteen datasets were considered, including binary and multi-class problems, all of them taken from the UCI repository. The MELM-GRBF was compared to ELM with sigmoidal, hard-limit, triangular basis and radial basis functions in the hidden layer, and to the ELM-RBF methodology proposed by Huang et al. (2004) [1]. The MELM-GRBF obtained better accuracy than the corresponding sigmoidal, hard-limit, triangular basis and radial basis functions for almost all datasets, producing the highest mean accuracy rank when compared with these other basis functions across all datasets. (C) 2011 Elsevier B.V. All rights reserved.
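To make the abstract's description concrete, the following is a minimal sketch of the GRBF idea combined with ELM-style training. It assumes the GRBF activation has the generalized Gaussian form exp(-(||x - c|| / r)**tau), so that tau = 2 recovers the standard Gaussian RBF as the abstract states; the radius heuristic and all function names here are illustrative assumptions, not the paper's analytic locality/coverage rule.

```python
import numpy as np

def grbf(X, centers, radii, tau):
    """Generalized RBF hidden-layer activations.

    Assumed form: exp(-(||x - c|| / r) ** tau); tau = 2 gives the
    standard Gaussian RBF, as described in the abstract.
    """
    # Pairwise distances between patterns and centers: (n_samples, n_hidden).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-((d / radii) ** tau))

def melm_grbf_fit(X, Y, n_hidden, tau=2.0, seed=None):
    """ELM-style training sketch: random centers drawn from the training
    patterns, output weights solved by least squares (pseudo-inverse).

    The uniform mean-distance radius below is a simple placeholder for
    the paper's analytic determination of radius and tau.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_hidden, replace=False)
    centers = X[idx]
    # Illustrative radius: mean pattern-to-center distance, shared by all units.
    radii = np.full(n_hidden,
                    np.mean(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)))
    H = grbf(X, centers, radii, tau)          # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y              # output weights via Moore-Penrose inverse
    return centers, radii, beta

def melm_grbf_predict(X, centers, radii, tau, beta):
    return grbf(X, centers, radii, tau) @ beta
```

Because only the output weights are solved (in closed form), training avoids gradient-based iteration entirely; the choice of tau then controls how sharply each basis function localizes around its center.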
Pages: 2502-2510
Number of pages: 9
References
37 records in total
[1]
[Anonymous], 1987, MULTIPLE COMP PROCED, DOI DOI 10.1002/9780470316672
[2]
[Anonymous], 2005, NEURAL NETWORKS PATT
[3]
[Anonymous], 2007, Uci machine learning repository
[4]
Improving the Generalization Properties of Radial Basis Function Neural Networks [J].
Bishop, Chris .
NEURAL COMPUTATION, 1991, 3 (04) :579-588
[5]
Normalized Gaussian radial basis function networks [J].
Bugmann, G .
NEUROCOMPUTING, 1998, 20 (1-3) :97-110
[6]
Composite function wavelet neural networks with extreme learning machine [J].
Cao, Jiuwen ;
Lin, Zhiping ;
Huang, Guang-bin .
NEUROCOMPUTING, 2010, 73 (7-9) :1405-1416
[7]
Chauvin Y., 1995, Backpropagation: Theory, architectures, and applications
[8]
MULTIPLE COMPARISONS AMONG MEANS [J].
DUNN, OJ .
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1961, 56 (293) :52-&
[9]
Error Minimized Extreme Learning Machine With Growth of Hidden Nodes and Incremental Learning [J].
Feng, Guorui ;
Huang, Guang-Bin ;
Lin, Qingping ;
Gay, Robert .
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (08) :1352-1357
[10]
A dynamic over-sampling procedure based on sensitivity for multi-class problems [J].
Fernandez-Navarro, Francisco ;
Hervas-Martinez, Cesar ;
Antonio Gutierrez, Pedro .
PATTERN RECOGNITION, 2011, 44 (08) :1821-1833