Statistical independence and novelty detection with information preserving nonlinear maps

Cited: 85
Authors
Parra, L
Deco, G
Miesbach, S
Institution
[1] Siemens AG, Corporate Research and Development, 81739 Munich
Keywords
DOI
10.1162/neco.1996.8.2.260
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 [Pattern Recognition and Intelligent Systems]; 0812 [Computer Science and Technology]; 0835 [Software Engineering]; 1405 [Intelligent Science and Technology];
Abstract
According to Barlow (1989), feature extraction can be understood as finding a statistically independent representation of the probability distribution underlying the measured signals. The search for a statistically independent representation can be formulated by the criterion of minimal mutual information, which reduces to decorrelation in the case of Gaussian distributions. For non-Gaussian distributions, minimal mutual information is the appropriate generalization of the decorrelation used in linear Principal Component Analysis (PCA). We also generalize to nonlinear transformations by demanding only perfect transmission of information. This leads to a general class of nonlinear transformations, namely symplectic maps. Conservation of information allows us to consider only the statistics of single coordinates. The resulting factorial representation of the joint probability distribution yields a density estimate. We apply this concept to the real-world problem of electrical motor fault detection, treated as a novelty detection task.
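The abstract's claim that minimal mutual information reduces to decorrelation for Gaussian variables can be sketched numerically: for a Gaussian, the mutual information between the coordinates equals -0.5 log det(R), where R is the correlation matrix, so a linear PCA rotation drives it to zero. The sample data, sizes, and the `gaussian_mutual_info` helper below are illustrative assumptions, not the paper's motor data or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D Gaussian toy data (illustrative; correlation 0.8).
C = np.array([[1.0, 0.8],
              [0.8, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=100_000)

def gaussian_mutual_info(data):
    """For Gaussian data, I(x1; ...; xn) = -0.5 * log det(correlation matrix)."""
    R = np.corrcoef(data, rowvar=False)
    return -0.5 * np.log(np.linalg.det(R))

# Before decorrelation: positive mutual information
# (theoretically -0.5 * log(1 - 0.8**2) ~= 0.51 nats).
mi_before = gaussian_mutual_info(X)

# Linear PCA decorrelation: rotate onto the eigenvectors of the covariance.
_, V = np.linalg.eigh(np.cov(X, rowvar=False))
Y = X @ V

# After decorrelation the correlation matrix is (numerically) the identity,
# so the Gaussian mutual information vanishes.
mi_after = gaussian_mutual_info(Y)
print(mi_before, mi_after)
```

For non-Gaussian data this Gaussian formula is only a lower-order proxy; the paper's point is that minimizing the full mutual information is the proper generalization, and that restricting to symplectic (information-preserving) maps lets the joint density be estimated from single-coordinate statistics.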
Pages: 260-269
Page count: 10
Related papers
19 total
[1] Abraham, R., 1978, F MECHANICS.
[2] [Anonymous], LECT NOTES MATH.
[3] Barlow, H. B. Unsupervised learning. Neural Computation, 1989, 1(3): 295-311.
[4] Bell, A. J.; Sejnowski, T. J. An information maximization approach to blind separation and blind deconvolution. Neural Computation, 1995, 7(6): 1129-1159.
[5] Comon, P. Independent component analysis, a new concept. Signal Processing, 1994, 36(3): 287-314.
[6] Deco, G.; Schurmann, B. Learning time-series evolution by unsupervised extraction of correlations. Physical Review E, 1995, 51(3): 1780-1790.
[7] Deco, G., 1996, IN PRESS NEURAL NETW.
[8] Deco, G., 1995, IN PRESS NEURAL NETW.
[9] Duda, R. O., 1973, PATTERN CLASSIFICATI, V3.
[10] Hornik, K.; Stinchcombe, M.; White, H. Multilayer feedforward networks are universal approximators. Neural Networks, 1989, 2(5): 359-366.