An incremental extreme learning machine for online sequential learning problems

Cited by: 50
Authors
Guo, Lu [1 ,2 ]
Hao, Jing-hua [1 ,2 ]
Liu, Min [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[2] Tsinghua Natl Lab Informat Sci & Technol, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Incremental learning algorithm; Extreme learning machine (ELM); Incremental ELM (IELM); Online sequential ELM (OS-ELM); Fixed size LSSVM (FS-LSSVM);
DOI
10.1016/j.neucom.2013.03.055
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
A fast and accurate incremental learning algorithm is required to meet the demands of online applications in which data arrive one by one or chunk by chunk, so that retraining can be avoided and precious time saved. Although many interesting research results have been achieved, existing methods still face difficulties in real applications because of their unsatisfactory generalization performance or intensive computational cost. This paper presents an Incremental Extreme Learning Machine (IELM) developed on the basis of the Extreme Learning Machine (ELM), the unified framework of LS-SVM and PSVM presented by Huang et al. (2011) [15]. To suit different application demands and different trade-offs between computational cost and efficiency, three alternative solutions of IELM are derived. Detailed comparisons of the IELM algorithm with other incremental algorithms are carried out through simulations on benchmark problems and on a real critical dimension (CD) prediction problem in the lithography process of an actual semiconductor production line. The results show that the kernel-based IELM solution performs best, while the least-squares IELM solution is the fastest of the three alternative solutions when the number of training data is huge. All the results show that the presented IELM algorithms have better performance than other incremental algorithms such as the online sequential ELM (OS-ELM) presented by Liang et al. (2006) [8] and the fixed-size LSSVM presented by Espinoza et al. (2006) [11]. (C) 2013 Elsevier B.V. All rights reserved.
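The record does not reproduce the paper's IELM derivations, but the minimal Python sketch below illustrates the general idea the abstract describes: a single-hidden-layer network with randomly fixed hidden parameters whose output weights are updated chunk by chunk through a recursive least-squares (Sherman-Morrison-Woodbury) correction instead of retraining from scratch (cf. Hager, 1989, reference [7] below). The class name IncrementalELMSketch, the sigmoid activation, the regularization constant, and the single-output restriction are illustrative assumptions, not the authors' implementation.

import numpy as np

class IncrementalELMSketch:
    """Sketch of an ELM with incremental (chunk-by-chunk) output-weight updates.
    Single-output regression only, for brevity; not the paper's exact IELM."""

    def __init__(self, n_inputs, n_hidden, reg=1e-3, seed=None):
        rng = np.random.default_rng(seed)
        # Random hidden-layer parameters, fixed after initialization (ELM principle).
        self.W = rng.standard_normal((n_inputs, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        # P approximates (H_all^T H_all + reg*I)^(-1); beta holds the output weights.
        self.P = np.eye(n_hidden) / reg
        self.beta = np.zeros((n_hidden, 1))

    def _hidden(self, X):
        # Sigmoid activation of the random feature map.
        return 1.0 / (1.0 + np.exp(-(np.atleast_2d(X) @ self.W + self.b)))

    def partial_fit(self, X, y):
        # Incorporate a new chunk (X, y) without revisiting past data.
        H = self._hidden(X)
        y = np.atleast_2d(y).reshape(H.shape[0], -1)
        # Woodbury update of P for the enlarged data set.
        S = np.eye(H.shape[0]) + H @ self.P @ H.T
        K = self.P @ H.T @ np.linalg.inv(S)
        self.P -= K @ H @ self.P
        # Recursive least-squares correction of the output weights.
        self.beta += self.P @ H.T @ (y - H @ self.beta)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

Calling partial_fit repeatedly on successive chunks keeps the per-update cost bounded by the hidden-layer size rather than the total number of samples seen, which is the property the abstract emphasizes for online applications.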
Pages: 50-58
Number of pages: 9
Cited References (18 in total)
[1] [Anonymous]. Proceedings of the 18th European Symposium on Artificial Neural Networks, 2010.
[2] [Anonymous]. IEEE Transactions on Systems, Man, and Cybernetics, Part B.
[3] Cortes C., Vapnik V. Support-vector networks. Machine Learning, 1995, 20(3): 273-297.
[4] Espinoza M., Suykens J.A.K., De Moor B. Fixed-size least squares support vector machines: a large scale application in electrical load forecasting. Computational Management Science, 2006, 3(2): 113-129.
[5] Fung G. KDD-2001: Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2001, p. 77. DOI 10.1145/502512.502527.
[6] Golub G.H. Matrix Computations, 1996.
[7] Hager W.W. Updating the inverse of a matrix. SIAM Review, 1989, 31(2): 221-239.
[8] Haykin S. Neural Networks: A Comprehensive Foundation, 1999. DOI 10.1017/S0269888998214044.
[9] He Q., Du C., Wang Q., Zhuang F., Shi Z. A parallel incremental extreme SVM classifier. Neurocomputing, 2011, 74(16): 2532-2540.
[10] Huang G.B., Saratchandran P., Sundararajan N. A generalized growing and pruning RBF (GGAP-RBF) neural network for function approximation. IEEE Transactions on Neural Networks, 2005, 16(1): 57-67.