DIAGNOSIS USING BACKPROPAGATION NEURAL NETWORKS - ANALYSIS AND CRITICISM

Cited: 89
Authors
KRAMER, MA
LEONARD, JA
Affiliation
[1] Laboratory for Intelligent Systems in Process Engineering, Department of Chemical Engineering, Massachusetts Institute of Technology, Cambridge
DOI
10.1016/0098-1354(90)80015-4
CLC classification
TP39 [Applications of computers];
Discipline codes
081203 ; 0835 ;
Abstract
Artificial neural networks based on a feedforward architecture and trained by the backpropagation technique have recently been applied to static fault diagnosis problems. The networks are used to classify measurement vectors into a set of predefined categories that represent the various functional and malfunctional states of the process. While the networks can usually produce decision surfaces that correctly classify the training examples, regions of the input space not occupied by training data are classified arbitrarily. As a result, the networks may not accurately extrapolate from the training data. Although extrapolation is not required under ideal circumstances, in practice the network may be required to extrapolate when undersized training sets are used, when parent distributions of fault classes undergo shifts subsequent to training, and when the input data are corrupted by missing or biased sensors. These situations cause relatively high error rates for the neural classifier. A related problem is that the networks cannot detect when they lack the data for a reliable classification, a serious deficiency in many practical applications. Classifiers based on distance metrics assign regions of the input space according to their proximity to the training data, and thus extrapolation is not arbitrary but based on the most relevant data. Distance-based classifiers perform better under nonideal conditions and are to be preferred to neural network classifiers in diagnostic applications.
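The abstract's central point, that a distance-based classifier can both extrapolate from the nearest relevant data and refuse to classify inputs far from all training examples, can be sketched with a minimal 1-nearest-neighbor classifier. This is a hypothetical illustration, not code from the paper; the threshold value and the toy two-class data are assumptions chosen for clarity.

```python
import numpy as np

def nn_classify(x, train_X, train_y, reject_dist=1.0):
    """Classify x by its nearest training example, or return None
    (novelty detected) if x lies farther than reject_dist from
    every training point -- the self-check a backprop network lacks."""
    dists = np.linalg.norm(train_X - x, axis=1)  # Euclidean distance to each example
    nearest = int(np.argmin(dists))
    if dists[nearest] > reject_dist:
        return None  # too far from any training data: no reliable classification
    return train_y[nearest]

# Toy 2-D "measurement vectors" for two process states (assumed data)
train_X = np.array([[0.0, 0.0], [0.1, 0.0], [2.0, 2.0], [2.1, 2.0]])
train_y = ["normal", "normal", "fault", "fault"]

print(nn_classify(np.array([0.05, 0.05]), train_X, train_y))  # near training data: "normal"
print(nn_classify(np.array([10.0, 10.0]), train_X, train_y))  # far from all data: None
```

Unlike a trained feedforward network, which assigns some class to every point in the input space, the rejection threshold makes extrapolation explicit: queries outside the neighborhood of the training data are flagged rather than classified arbitrarily.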
Pages: 1323-1338
Page count: 16
References
23 items in total
[1]   What Size Net Gives Valid Generalization? [J].
Baum, Eric B. ;
Haussler, David .
NEURAL COMPUTATION, 1989, 1 (01) :151-160
[2]  
BERENBLUT BJ, 1977, CHEM ENG-LONDON, V318, P175
[3]   NEAREST NEIGHBOR PATTERN CLASSIFICATION [J].
COVER, TM ;
HART, PE .
IEEE TRANSACTIONS ON INFORMATION THEORY, 1967, 13 (01) :21-+
[4]  
Cybenko G., 1989, Mathematics of Control, Signals, and Systems, V2, P303, DOI 10.1007/BF02551274
[5]  
Duda R. O., 1973, PATTERN CLASSIFICATI, V3
[6]  
FERRADA JJ, 1989, AICHE NATL MTG SAN F
[7]   ANALYSIS OF HIDDEN UNITS IN A LAYERED NETWORK TRAINED TO CLASSIFY SONAR TARGETS [J].
GORMAN, RP ;
SEJNOWSKI, TJ .
NEURAL NETWORKS, 1988, 1 (01) :75-89
[8]   ARTIFICIAL NEURAL NETWORK MODELS OF KNOWLEDGE REPRESENTATION IN CHEMICAL-ENGINEERING [J].
HOSKINS, JC ;
HIMMELBLAU, DM .
COMPUTERS & CHEMICAL ENGINEERING, 1988, 12 (9-10) :881-890
[9]  
HOSKINS JC, 1988, SPR AICHE M HOUST
[10]  
HUANG W, 1987, 1ST IEE INT C NEUR N