Parallel Chaos Search Based Incremental Extreme Learning Machine

Cited by: 24
Authors
Yang, Yimin [1]
Wang, Yaonan [1]
Yuan, Xiaofang [1]
Affiliations
[1] Hunan Univ, Coll Elect & Informat Engn, Changsha 410082, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Extreme learning machine; Convergence rate; Chaos optimization algorithm; Random hidden nodes; UNIVERSAL APPROXIMATION; FEEDFORWARD NETWORKS;
DOI
10.1007/s11063-012-9246-9
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, Huang et al. proposed a simple and efficient learning algorithm referred to as the extreme learning machine (ELM), which has been shown to reduce the training time of neural networks by up to thousands of times compared with some conventional methods. However, recent studies have shown that some of the random hidden nodes may play a very minor role in the network output and thus eventually increase the network complexity. This paper proposes a parallel chaos search based incremental extreme learning machine (PC-ELM) with additional steps to obtain a more compact network architecture. At each learning step, the hidden-node parameters selected as optimal by a parallel chaos optimization algorithm are added to the existing network in order to minimize the residual error between the target function and the network output. We prove the convergence of PC-ELM for both growing and fixed network architectures, and then apply the approach to several regression and classification problems. Experiments on 19 benchmark data sets are used to evaluate the performance of PC-ELM. Simulation results demonstrate that the proposed method provides better generalization performance and a more compact network architecture.
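The incremental scheme described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it simplifies PC-ELM to a single chaotic search stream (a logistic map) that proposes several candidate hidden nodes per step, keeps the candidate that most reduces the residual, and computes each output weight in closed form, as in incremental ELM variants. The function names, the sigmoid activation, and all parameter values are illustrative assumptions.

```python
import numpy as np

def logistic_map(x, n):
    """Generate n chaotic values in (0, 1) via the logistic map x <- 4x(1-x).
    Stands in for the paper's parallel chaos search (assumption)."""
    seq = np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        seq[i] = x
    return seq

def pc_elm_sketch(X, y, max_nodes=20, candidates=10, seed=0.37):
    """Grow a single-hidden-layer network one node at a time.
    Each step: draw `candidates` chaotic (w, b) proposals, pick the one
    whose optimal output weight beta most shrinks the residual."""
    n, d = X.shape
    residual = y.astype(float).copy()
    nodes = []  # list of (input weights w, bias b, output weight beta)
    chaos = logistic_map(seed, max_nodes * candidates * (d + 1))
    k = 0
    for _ in range(max_nodes):
        best = None
        for _ in range(candidates):
            # map chaotic values from (0, 1) to (-1, 1) for weights and bias
            w = 2.0 * chaos[k:k + d] - 1.0; k += d
            b = 2.0 * chaos[k] - 1.0;       k += 1
            h = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid hidden output
            beta = (h @ residual) / (h @ h)          # closed-form output weight
            err = np.linalg.norm(residual - beta * h)
            if best is None or err < best[0]:
                best = (err, w, b, beta, h)
        err, w, b, beta, h = best
        nodes.append((w, b, beta))
        residual = residual - beta * h  # residual shrinks monotonically
    return nodes, residual
```

Because each beta is the least-squares optimum for its node, the residual norm is non-increasing at every step; the chaos search only changes which candidate node is added, which is what yields the more compact architecture claimed in the abstract.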
Pages: 277-301
Number of pages: 25
Cited References
13 records in total
[1]   Universal approximation and QoS violation application of extreme learning machine [J].
Chen, Lei ;
Zhou, LiFeng ;
Pung, Hung Keng .
NEURAL PROCESSING LETTERS, 2008, 28 (02) :81-95
[2]  
Cheng B, 2006, LECT NOTES COMPUT SC, V4304, P224
[3]   Error Minimized Extreme Learning Machine With Growth of Hidden Nodes and Incremental Learning [J].
Feng, Guorui ;
Huang, Guang-Bin ;
Lin, Qingping ;
Gay, Robert .
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (08) :1352-1357
[4]  
Hornik K, 1991, NEURAL NETWORKS, V4, P251
[5]   Enhanced random search based incremental extreme learning machine [J].
Huang, Guang-Bin ;
Chen, Lei .
NEUROCOMPUTING, 2008, 71 (16-18) :3460-3468
[6]   Convex incremental extreme learning machine [J].
Huang, Guang-Bin ;
Chen, Lei .
NEUROCOMPUTING, 2007, 70 (16-18) :3056-3062
[7]   Extreme learning machine: Theory and applications [J].
Huang, Guang-Bin ;
Zhu, Qin-Yu ;
Siew, Chee-Kheong .
NEUROCOMPUTING, 2006, 70 (1-3) :489-501
[8]   Universal approximation using incremental constructive feedforward networks with random hidden nodes [J].
Huang, Guang-Bin ;
Chen, Lei ;
Siew, Chee-Kheong .
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2006, 17 (04) :879-892
[9]  
Li B, 1998, CYBERNET SYST, V29, P409, DOI 10.1080/019697298125678
[10]   An optimization method inspired by "chaotic" ant behavior [J].
Li, Lixiang ;
Yang, Yixian ;
Peng, Haipeng ;
Wang, Xiangdong .
INTERNATIONAL JOURNAL OF BIFURCATION AND CHAOS, 2006, 16 (08) :2351-2364