REDUCING DATA DIMENSIONALITY THROUGH OPTIMIZING NEURAL-NETWORK INPUTS

Cited by: 134
Authors
TAN, SF [1 ]
MAVROVOUNIOTIS, ML [1 ]
Affiliation
[1] Northwestern University, Department of Chemical Engineering, Evanston, IL 60208
DOI
10.1002/aic.690410612
Chinese Library Classification
TQ [Chemical Industry]
Subject Classification Code
0817
Abstract
A neural network method for reducing data dimensionality based on the concept of input training is presented: each input pattern is not fixed but is adjusted, along with the internal network parameters, to reproduce its corresponding output pattern. With input adjustment, a properly configured network can be trained to reproduce a given data set with minimum distortion; the trained network inputs provide the reduced data. A three-layer network with input training can perform all the functions of a five-layer autoassociative network, essentially capturing nonlinear correlations among data. In addition, simultaneous training of a network and its inputs is shown to be significantly more efficient at reducing data dimensionality than training an autoassociative network. The concept of input training is closely related to principal component analysis (PCA) and to the principal curve method, a nonlinear extension of PCA.
Pages: 1471-1480
Number of pages: 10
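
The input-training scheme described in the abstract can be sketched in a few lines of code. The example below is a minimal illustration, not the authors' implementation: it assumes a single tanh hidden layer, a squared-error reconstruction loss, and plain gradient descent, with synthetic data and all hyperparameters (reduced dimension r, hidden width h, learning rate, epoch count) chosen purely for demonstration. The defining step is that the per-sample reduced inputs Z are updated by gradient descent together with the network weights.

```python
# Minimal sketch of input training: jointly optimize the reduced inputs Z
# and the weights of a three-layer network so that the network reproduces
# the data. All names and settings here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data lying near a 1-D curve embedded in 3-D (illustrative only).
t = rng.uniform(-1.0, 1.0, size=(200, 1))
X = np.hstack([t, t**2, np.sin(np.pi * t)]) + 0.01 * rng.normal(size=(200, 3))

n, d = X.shape
r, h = 1, 10              # reduced dimension and hidden-layer width (assumed)
lr, epochs = 0.05, 5000   # plain gradient descent settings (assumed)

# Trainable quantities: the reduced inputs Z and the network weights.
Z = 0.1 * rng.normal(size=(n, r))
W1 = 0.5 * rng.normal(size=(r, h)); b1 = np.zeros(h)
W2 = 0.5 * rng.normal(size=(h, d)); b2 = np.zeros(d)

for epoch in range(epochs):
    # Forward pass: map each reduced input through the tanh hidden layer.
    H = np.tanh(Z @ W1 + b1)            # (n, h)
    Xhat = H @ W2 + b2                  # (n, d)
    err = Xhat - X                      # reconstruction residual

    # Backward pass for the squared-error loss, averaged over samples.
    gXhat = 2.0 * err / n
    gW2 = H.T @ gXhat; gb2 = gXhat.sum(0)
    gH = gXhat @ W2.T
    gA = gH * (1.0 - H**2)              # derivative of tanh
    gW1 = Z.T @ gA; gb1 = gA.sum(0)
    gZ = gA @ W1.T                      # gradient w.r.t. the inputs themselves

    # Key step of input training: update the inputs along with the weights.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    Z  -= lr * gZ

mse = np.mean((np.tanh(Z @ W1 + b1) @ W2 + b2 - X) ** 2)
print("reconstruction MSE:", mse)
# Each row of Z now holds the reduced (here 1-D) representation of X.
```

Loosely speaking, the trained Z plays the role of nonlinear principal components: each row is the reduced representation of the corresponding data point, while the three-layer network serves as the reconstruction mapping that a five-layer autoassociative network would otherwise have to learn end to end.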