A recurrent self-organizing neural fuzzy inference network

Cited by: 249
Authors
Juang, CF [1 ]
Lin, CT [1 ]
Affiliation
[1] Natl Chiao Tung Univ, Dept Elect & Control Engn, Hsinchu, Taiwan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1999 / Vol. 10 / No. 4
Keywords
context node; dynamic fuzzy inference; feedback term node; ordered derivative; projection-based correlation measure;
DOI
10.1109/72.774232
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A recurrent self-organizing neural fuzzy inference network (RSONFIN) is proposed in this paper. The RSONFIN is inherently a recurrent multilayered connectionist network for realizing the basic elements and functions of dynamic fuzzy inference, and may be considered to be constructed from a series of dynamic fuzzy rules. The temporal relations embedded in the network are built by adding some feedback connections, representing the memory elements, to a feedforward neural fuzzy network. Each weight as well as node in the RSONFIN has its own meaning and represents a special element in a fuzzy rule. There are no hidden nodes (i.e., no membership functions and fuzzy rules) initially in the RSONFIN; they are created on-line via concurrent structure identification (the construction of dynamic fuzzy if-then rules) and parameter identification (the tuning of the free parameters of membership functions). The structure learning together with the parameter learning forms a fast learning algorithm for building a small, yet powerful, dynamic neural fuzzy network. Two major characteristics of the RSONFIN can thus be seen: 1) the recurrent property of the RSONFIN makes it suitable for dealing with temporal problems, and 2) no predetermination, like the number of hidden nodes, must be given, since the RSONFIN can find its optimal structure and parameters automatically and quickly. Moreover, to reduce the number of fuzzy rules generated, a flexible input partition method, the aligned clustering-based algorithm, is proposed. Various simulations on temporal problems are done and performance comparisons with some existing recurrent networks are also made. The efficiency of the RSONFIN is verified from these results.
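For illustration only, a minimal sketch of the kind of recurrent neural fuzzy inference described in the abstract: Gaussian membership functions, product-based rule firing, feedback from context nodes carrying the previous firing strengths, and normalized weighted-sum defuzzification. This is not the paper's exact formulation or learning algorithm; all class names, parameter names, and initializations below are hypothetical.

```python
import numpy as np

class RecurrentNeuroFuzzySketch:
    """Illustrative recurrent neural fuzzy inference step (not the RSONFIN
    equations from the paper): Gaussian antecedents, rule firing strengths
    fed back through context nodes, single crisp output."""

    def __init__(self, n_inputs, n_rules, rng=None):
        rng = np.random.default_rng(rng)
        # Antecedent parameters: one Gaussian per (rule, input) pair.
        self.centers = rng.normal(size=(n_rules, n_inputs))
        self.widths = np.full((n_rules, n_inputs), 1.0)
        # Feedback weights from context nodes back into the rule nodes.
        self.feedback_w = rng.normal(scale=0.1, size=(n_rules, n_rules))
        # Consequent weights for a single crisp output.
        self.out_w = rng.normal(scale=0.1, size=n_rules)
        # Context nodes: memory of the previous rule firing strengths.
        self.context = np.zeros(n_rules)

    def step(self, x):
        """One inference step on input vector x of shape (n_inputs,)."""
        # Membership layer: Gaussian degrees for each (rule, input) pair.
        mu = np.exp(-((x - self.centers) ** 2) / (2.0 * self.widths ** 2))
        # Rule layer: spatial firing (product over inputs) modulated by a
        # sigmoidal temporal contribution from the context nodes.
        spatial = mu.prod(axis=1)
        temporal = 1.0 / (1.0 + np.exp(-self.feedback_w @ self.context))
        firing = spatial * temporal
        # Context nodes store the current firing strengths for the next step.
        self.context = firing
        # Output layer: normalized weighted sum (centroid-style defuzzification).
        return float(self.out_w @ firing / (firing.sum() + 1e-12))


# Usage: run the sketch over a short input sequence.
net = RecurrentNeuroFuzzySketch(n_inputs=2, n_rules=4, rng=0)
for t, x in enumerate([np.array([0.1, 0.3]), np.array([0.2, 0.1])]):
    print(t, net.step(x))
```

In the sketch the feedback path is what distinguishes a recurrent neural fuzzy network from a feedforward one: the rule firing strengths at time t depend on those at time t-1, giving the internal memory needed for temporal problems. The paper's on-line structure learning (creating rules and membership functions as data arrive) is not reproduced here.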
Pages: 828-845
Page count: 18