DYNAMICAL RECURRENT NEURAL NETWORKS - TOWARDS ENVIRONMENTAL TIME-SERIES PREDICTION

Cited by: 26
Authors
AUSSEM, A
MURTAGH, F
SARAZIN, M
Affiliations
[1] European Southern Observatory, Very Large Telescope Division, D-85748 Garching, Germany
[2] Université Paris 5, UFR Mathématiques & Informatique, F-75006 Paris, France
[3] European Space Agency, Space Science Department, Astrophysics Division, European Southern Observatory, D-85748 Garching, Germany
DOI: 10.1142/S0129065795000123
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Dynamical Recurrent Neural Networks (DRNN) (Aussem 1995a) are a class of fully recurrent networks obtained by modeling synapses as autoregressive filters. By virtue of their internal dynamics, these networks approximate the law underlying a time series by a system of nonlinear difference equations over internal variables. They therefore provide history-sensitive forecasts without having to be explicitly fed with external memory. The model is trained by a local and recursive error-propagation algorithm called temporal recurrent backpropagation, whose efficiency benefits from the exponential decay of the gradient terms backpropagated through the adjoint network. We assess the predictive ability of the DRNN model on meteorological and astronomical time series recorded around the candidate observation sites for the future VLT telescope. The hope is that reliable environmental forecasts from the model will allow modern telescopes to be preset, a few hours in advance, in the best-suited instrumental mode. With this in mind, the model is first appraised on precipitation measurements against traditional nonlinear AR and ARMA techniques using feedforward networks. We then tackle a harder problem: the prediction of astronomical seeing, known to be a very erratic time series. A fuzzy coding approach is used to reduce the complexity of the laws governing the seeing, and a fuzzy correspondence analysis is carried out to explore the internal relationships in the data. Based on a carefully selected set of meteorological variables at the same time-point, a nonlinear multiple regression, termed nowcasting (Murtagh et al. 1993, 1995), is carried out on the fuzzily coded seeing records. The DRNN is shown to outperform the fuzzy k-nearest-neighbors method.
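As a rough illustration of the core idea — synapses acting as filters with internal state, so that forecasts are history-sensitive without an external window of lagged inputs — here is a minimal sketch. It is not the authors' exact formulation: the first-order filter, the fixed feedback coefficient, and the class names are illustrative assumptions.

```python
import numpy as np

class ARSynapse:
    """Synapse modeled as a first-order autoregressive (IIR) filter.

    Internal state: s_t = a * s_{t-1} + w * x_t, so each connection
    keeps its own memory of past inputs.
    """
    def __init__(self, w, a=0.5):
        self.w = w      # input weight
        self.a = a      # feedback (autoregressive) coefficient, |a| < 1
        self.s = 0.0    # filter state, carries the input history

    def step(self, x):
        self.s = self.a * self.s + self.w * x
        return self.s

class DRNNNeuron:
    """Neuron summing its filtered inputs through a tanh nonlinearity."""
    def __init__(self, n_inputs, rng):
        self.synapses = [ARSynapse(w=rng.normal(scale=0.5))
                         for _ in range(n_inputs)]

    def step(self, xs):
        # Each synapse filters its own input stream; the neuron squashes the sum.
        return float(np.tanh(sum(syn.step(x)
                                 for syn, x in zip(self.synapses, xs))))
```

Presenting the same input vector twice yields different outputs, because the filter states evolve between presentations: the memory lives in the synapses themselves rather than in an explicit tapped-delay line fed to the network.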
Pages: 145-170 (26 pages)
References (35 in total)
[1] Almeida B., 1987, Neural Computers, p. 199
[2] Anonymous, 1987, Learning Internal Re
[3] Anonymous, 1991, Intro Theory Neural, DOI 10.1201/9780429499661
[4] Anonymous, 2002, Chaos Dynamical Syst
[5] Aussem A., 1994, Vistas in Astronomy, V38, p. 357, DOI 10.1016/0083-6656(94)90047-7
[6] Aussem A., in press, Neurocomput
[7] Aussem A., 1995, unpublished, IEEE T. Neural
[8] Aussem A., 1995, thesis, U. R. Descartes
[9] Bengio Y., Simard P., Frasconi P., 1994, Learning Long-Term Dependencies with Gradient Descent Is Difficult, IEEE Transactions on Neural Networks, 5(2), pp. 157-166
[10] Box G.E.P., 1976, Time Series Anal