FUZZY ARTMAP - A NEURAL NETWORK ARCHITECTURE FOR INCREMENTAL SUPERVISED LEARNING OF ANALOG MULTIDIMENSIONAL MAPS

Cited: 1125
Authors
CARPENTER, GA
GROSSBERG, S
MARKUZON, N
REYNOLDS, JH
ROSEN, DB
Affiliations
[1] BOSTON UNIV,DEPT COGNIT & NEURAL SYST,GRAD STUDIES,BOSTON,MA 02215
[2] BOSTON UNIV,MATH,BOSTON,MA 02215
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1992 / Vol. 3 / No. 5
Keywords
DOI
10.1109/72.159059
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A new neural network architecture is introduced for incremental supervised learning of recognition categories and multidimensional maps in response to arbitrary sequences of analog or binary input vectors, which may represent fuzzy or crisp sets of features. The architecture, called fuzzy ARTMAP, achieves a synthesis of fuzzy logic and adaptive resonance theory (ART) neural networks by exploiting a close formal similarity between the computations of fuzzy subsethood and ART category choice, resonance, and learning. Fuzzy ARTMAP also realizes a new minimax learning rule that conjointly minimizes predictive error and maximizes code compression, or generalization. This is achieved by a match tracking process that increases the ART vigilance parameter by the minimum amount needed to correct a predictive error. As a result, the system automatically learns a minimal number of recognition categories, or "hidden units," to meet accuracy criteria. Category proliferation is prevented by normalizing input vectors at a preprocessing stage. A normalization procedure called complement coding leads to a symmetric theory in which the AND operator (∧) and the OR operator (∨) of fuzzy logic play complementary roles. Complement coding uses on-cells and off-cells to represent the input pattern, and preserves individual feature amplitudes while normalizing the total on-cell/off-cell vector. Learning is stable because all adaptive weights can only decrease in time. Decreasing weights correspond to increasing sizes of category "boxes." Smaller vigilance values lead to larger category boxes. Improved prediction is achieved by training the system several times using different orderings of the input set. This voting strategy can also be used to assign confidence estimates to competing predictions given small, noisy, or incomplete training sets. Four classes of simulations illustrate fuzzy ARTMAP performance in relation to benchmark back-propagation and genetic algorithm systems: (i) finding points inside versus outside a circle; (ii) learning to tell two spirals apart; (iii) incremental approximation of a piecewise-continuous function; and (iv) a letter recognition database. The fuzzy ARTMAP system is also compared with Salzberg's NGE system and with Simpson's FMMC system.
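The complement-coding preprocessing and the fuzzy ART choice rule described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names are our own, NumPy is assumed, and the choice function follows the standard fuzzy ART form T_j = |I ∧ w_j| / (α + |w_j|), where ∧ is the component-wise minimum (fuzzy AND) and |·| the L1 norm.

```python
import numpy as np

def complement_code(a):
    """Complement coding: represent an input a in [0,1]^M by its on-cells a
    and off-cells 1 - a. The coded vector I = (a, 1 - a) always has L1 norm M,
    so inputs are normalized while individual feature amplitudes are preserved."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def category_choice(I, w, alpha=0.001):
    """Fuzzy ART choice function T_j = |I ^ w_j| / (alpha + |w_j|),
    with ^ the component-wise minimum (fuzzy AND) and |.| the L1 norm."""
    return np.minimum(I, w).sum() / (alpha + np.abs(w).sum())

I = complement_code([0.2, 0.7])     # on-cells (0.2, 0.7), off-cells (0.8, 0.3); |I| = 2
T = category_choice(I, np.ones(4))  # choice value for an uncommitted category, w_j = (1, ..., 1)
```

Because |I| is constant (equal to the feature dimension M), no category can win simply by matching large-magnitude inputs, which is how complement coding prevents the category proliferation mentioned in the abstract.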
Pages: 698-713 (16 pages)
References
23 records
[1] [Anonymous], 1981, PATTERN RECOGN
[2] Carpenter G., 1991, PATTERN RECOGNITION
[3] Carpenter GA, Grossberg S, Reynolds JH. ARTMAP: Supervised real-time learning and classification of nonstationary data by a self-organizing neural network. Neural Networks, 1991, 4(5): 565-588.
[4] Carpenter GA, Grossberg S. A massively parallel architecture for a self-organizing neural pattern-recognition machine. Computer Vision, Graphics, and Image Processing, 1987, 37(1): 54-115.
[5] Carpenter GA. Proc. Int. Joint Conf. on Neural Networks, Vol. 2, p. 411.
[6] Carpenter GA, 1991, Tech. Rep. CAS/CNS-TR-91-021, Boston University.
[7] Carpenter GA, Tech. Rep. CAS/CNS-TR-91-015, Boston University.
[8] Frey PW, Slate DJ. Letter recognition using Holland-style adaptive classifiers. Machine Learning, 1991, 6(2): 161-182.
[9] Holland JH, 1986, Machine Learning: An Artificial Intelligence Approach, Vol. 2.
[10] Holland JH, 1975, Adaptation in Natural and Artificial Systems.