Optimization approximation solution for regression problem based on extreme learning machine

Cited by: 58
Authors
Yuan, Yubo [1 ]
Wang, Yuguang [1 ]
Cao, Feilong [1 ]
Affiliations
[1] China Jiliang Univ, Inst Metrol & Computat Sci, Hangzhou 310018, Peoples R China
Keywords
Extreme learning machine; Regression; Optimization; Matrix theory; Feedforward networks
DOI
10.1016/j.neucom.2010.12.037
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Code
140502 [Artificial Intelligence]
Abstract
Extreme learning machine (ELM) is one of the most popular and important learning algorithms; it originates from single-hidden-layer feedforward neural networks. It has been shown that ELM can achieve better performance than the support vector machine (SVM) in regression and classification. In this paper, step 3 of ELM is studied mathematically for the regression problem. First, the equation H beta = T is reformulated as an optimization model, and the necessary conditions for an optimal solution are presented; the equation H beta = T is replaced by H^T H beta = H^T T, which is proved to have at least one solution. Second, the optimal approximation solution is discussed for the cases where H has full column rank, full row rank, or neither full column nor full row rank. In the last case, rank-1 and rank-2 methods are used to obtain the optimal approximation solution. In theory, this paper presents a better algorithm for ELM. (C) 2011 Elsevier B.V. All rights reserved.
Pages: 2475-2482
Page count: 8
Related papers
23 records in total
[1] Blake, C.L., 1998. UCI Repository of Machine Learning Databases.
[2] Cao, Jiuwen; Lin, Zhiping; Huang, Guang-Bin. Composite function wavelet neural networks with extreme learning machine. NEUROCOMPUTING, 2010, 73(7-9): 1405-1416.
[3] Feng, Guorui; Huang, Guang-Bin; Lin, Qingping; Gay, Robert. Error minimized extreme learning machine with growth of hidden nodes and incremental learning. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20(8): 1352-1357.
[4] Freund, Y., 1996. ICML '96, p. 148.
[5] Huang, G.B., 2004. I C CONT AUTOMAT ROB, p. 1029.
[6] Huang, G.B., 2004. IEEE IJCNN, p. 985.
[7] Huang, G.B. Learning capability and storage capacity of two-hidden-layer feedforward networks. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14(2): 274-281.
[8] Huang, G.B.; Babri, H.A. Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions. IEEE TRANSACTIONS ON NEURAL NETWORKS, 1998, 9(1): 224-229.
[9] Huang, G.B.; Chen, Y.Q.; Babri, H.A. Classification ability of single hidden layer feedforward neural networks. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2000, 11(3): 799-801.
[10] Huang, Guang-Bin; Chen, Lei. Enhanced random search based incremental extreme learning machine. NEUROCOMPUTING, 2008, 71(16-18): 3460-3468.