A new boosting algorithm for improved time-series forecasting with recurrent neural networks

Cited by: 95
Authors
Assaad, Mohammad [1 ]
Bone, Romuald [1 ]
Cardot, Hubert [1 ]
Affiliations
[1] Univ Tours, Lab Informat, F-37200 Tours, France
Keywords
learning algorithm; boosting; recurrent neural networks; time series forecasting; multi-step-ahead prediction;
DOI
10.1016/j.inffus.2006.10.009
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Numbers
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Ensemble methods for classification and regression have attracted a great deal of attention in recent years. They have been shown, both theoretically and empirically, to perform substantially better than single models on a wide range of tasks. We have adapted an ensemble method to the problem of predicting future values of time series, using recurrent neural networks (RNNs) as base learners. The improvement is achieved by combining a large number of RNNs, each of which is generated by training on a different set of examples. The algorithm is based on boosting, in which the difficult points of the time series are concentrated on during the learning process; unlike the original algorithm, however, we introduce a new parameter for tuning the boosting influence on the available examples. We test our boosting algorithm for RNNs on single-step-ahead and multi-step-ahead prediction problems. The results are compared with those of other regression methods, including several local approaches. On these datasets, the results obtained with our ensemble method are more accurate than those obtained with the standard method, backpropagation through time, and remain significantly better even when long-range dependencies play an important role. (C) 2006 Elsevier B.V. All rights reserved.
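The boosting scheme the abstract describes can be sketched in outline. The following is a minimal illustration, not the authors' implementation: an ordinary least-squares learner stands in for the RNN base learner, the loss normalisation and weight update follow a generic AdaBoost.R2-style scheme, and the parameter `k`, which blends the boosted sampling distribution with the uniform one, is only an assumed analogue of the paper's tuning parameter.

```python
import numpy as np

def boosted_forecaster(X, y, n_learners=10, k=0.5, seed=0):
    """Boosting sketch for regression on (feature, target) pairs.

    Each round trains a base learner on examples resampled from a
    distribution that emphasises points the previous learner found
    hard; `k` (an assumed analogue of the paper's tuning parameter)
    controls how strongly that emphasis is applied.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    uniform = np.full(n, 1.0 / n)
    dist = uniform.copy()                    # sampling distribution
    learners, weights = [], []
    for _ in range(n_learners):
        idx = rng.choice(n, size=n, replace=True, p=dist)
        # Least-squares fit as a stand-in for training an RNN.
        w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        err = np.abs(X @ w - y)
        loss = err / (err.max() + 1e-12)     # normalised losses in [0, 1]
        avg_loss = float(dist @ loss)
        if avg_loss >= 0.5:                  # learner too weak: stop
            break
        beta = avg_loss / (1.0 - avg_loss)
        learners.append(w)
        weights.append(np.log(1.0 / beta))
        # Down-weight easy (low-loss) points, then blend with the
        # uniform distribution so boosting's influence is tunable via k.
        dist = dist * beta ** (1.0 - loss)
        dist /= dist.sum()
        dist = (1.0 - k) * uniform + k * dist

    def predict(Xq):
        preds = np.stack([Xq @ w for w in learners])   # (rounds, m)
        wts = np.asarray(weights)
        # Weighted average of the base predictions (the paper combines
        # the learners differently; a weighted mean keeps this short).
        return (wts[:, None] * preds).sum(axis=0) / wts.sum()

    return predict
```

With `k = 1` the update reduces to pure boosted resampling, while `k = 0` gives uniform resampling (bagging-like behaviour), which is the sense in which the parameter tunes the boosting influence on the examples.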
Pages: 41-55 (15 pages)
References (63 in total)
[1] Anonymous. Workshop on Self-Organizing Maps.
[2] Assaad M. Proceedings of the 15th International Conference on Artificial Neural Networks, 2005: 169.
[3] Atiya AF, El-Shoura SM, Shaheen SI, El-Sherif MS. A comparison between neural-network forecasting techniques - Case study: River flow forecasting. IEEE Transactions on Neural Networks, 1999, 10(2): 402-409.
[4] Audrino F. Journal of Computational Finance, 2003, 6: 1.
[5] Aussem A. Sufficient conditions for error backflow convergence in dynamical recurrent neural networks. Neural Computation, 2002, 14(8): 1907-1927.
[6] Aussem A. Dynamical recurrent neural networks towards prediction and modeling of dynamical systems. Neurocomputing, 1999, 28: 207-232.
[7] Aussem A. NEURAL NETWORKS THEI, 1998: 425.
[8] Avnimelech R, Intrator N. Boosting regression estimators. Neural Computation, 1999, 11(2): 499-520.
[9] Back A. Neural Networks for Signal Processing IV: Proceedings of the 1994 IEEE Workshop, 1994: 146. DOI 10.1109/NNSP.1994.366054.
[10] Back AD. Artificial Neural Networks 2, Vols 1 and 2, 1992: 1113.