Static, Dynamic, and Hybrid Neural Networks in Forecasting Inflation

Cited by: 24
Authors
Moshiri S. [1]
Cameron N.E. [1]
Scuse D. [1]
Institutions
[1] Department of Economics, University of Winnipeg, Winnipeg
Keywords
Back-propagation neural network; Dynamic neural network; Forecasting; Hybrid neural network; Inflation; Radial basis function network; Recurrent neural network; Static neural network
DOI
10.1023/A:1008752024721
Abstract
The back-propagation neural network (BPN) model has been the most popular form of artificial neural network used for forecasting, particularly in economics and finance. It is a static (feed-forward) model that has a learning process in both the hidden and output layers. In this paper we compare the performance of the BPN model with that of two other neural network models, viz., the radial basis function network (RBFN) model and the recurrent neural network (RNN) model, in the context of forecasting inflation. The RBFN model is a hybrid model whose learning process is much faster than that of the BPN model while generating almost the same results. The RNN model is a dynamic model that allows feedback from other layers to the input layer, enabling it to capture the dynamic behavior of the series. The results of the ANN models are also compared with those of econometric time series models.
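The abstract contrasts three network families. The sketch below is a minimal illustration of the forward pass of each, written in plain Python/NumPy; it is not the authors' implementation, and all layer sizes, activations, and parameter values are illustrative assumptions. The recurrent case is shown as an Elman-style context variant, one common form of the feedback the abstract describes.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 4, 6   # e.g. 4 lagged inflation values, 6 hidden units (illustrative)

# Static back-propagation network (BPN): one sigmoid hidden layer, linear output.
W1, b1 = rng.standard_normal((n_hid, n_in)), rng.standard_normal(n_hid)
w2, b2 = rng.standard_normal(n_hid), 0.0

def bpn_forward(x):
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))   # sigmoid hidden layer
    return w2 @ h + b2                          # linear output

# Radial basis function network (RBFN): Gaussian hidden units around fixed
# centers; only the linear output weights need full estimation, which is why
# its training is typically much faster than back-propagation.
centers, width = rng.standard_normal((n_hid, n_in)), 1.0
w_rbf, b_rbf = rng.standard_normal(n_hid), 0.0

def rbfn_forward(x):
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * width ** 2))
    return w_rbf @ phi + b_rbf

# Elman-style recurrent network (RNN): the previous hidden state is fed back
# as context, letting the model track the dynamics of the series.
W_in, W_rec = rng.standard_normal((n_hid, n_in)), 0.1 * rng.standard_normal((n_hid, n_hid))
b_h, w_out, b_out = rng.standard_normal(n_hid), rng.standard_normal(n_hid), 0.0

def rnn_forward(x_seq):
    h, preds = np.zeros(n_hid), []
    for x in x_seq:
        h = np.tanh(W_in @ x + W_rec @ h + b_h)
        preds.append(w_out @ h + b_out)
    return np.array(preds)

# Toy call on random "lagged inflation" inputs (illustrative data only).
x_t = rng.standard_normal(n_in)
print(bpn_forward(x_t), rbfn_forward(x_t), rnn_forward(rng.standard_normal((12, n_in)))[-1])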
Pages: 219-235
Number of pages: 16