GENERALIZATIONS OF PRINCIPAL COMPONENT ANALYSIS, OPTIMIZATION PROBLEMS, AND NEURAL NETWORKS

Cited by: 163
Authors
KARHUNEN, J
JOUTSENSALO, J
Institution
Keywords
PRINCIPAL COMPONENTS; OPTIMIZATION; NEURAL NETWORK; UNSUPERVISED LEARNING; NONLINEARITY; ROBUST STATISTICS; GENERALIZED HEBBIAN ALGORITHM; OJA'S RULE
DOI
10.1016/0893-6080(94)00098-7
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We derive and discuss various generalizations of neural PCA (Principal Component Analysis)-type learning algorithms containing nonlinearities, using an optimization-based approach. Standard PCA arises as an optimal solution to several different information representation problems. We argue that this is essentially due to the fact that the solution is based on second-order statistics only. If the respective optimization problems are generalized to nonquadratic criteria, so that higher-order statistics are taken into account, their solutions will in general differ. These solutions define in a natural way several meaningful extensions of PCA and give them a solid foundation. Within this framework, we study more closely generalizations of the problems of variance maximization and mean-square error minimization. For these problems, we derive gradient-type neural learning algorithms for both symmetric and hierarchic PCA-type networks. As an important special case, Sanger's well-known generalized Hebbian algorithm (GHA) is shown to emerge from natural optimization problems.
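The abstract mentions gradient-type learning rules for hierarchic PCA-type networks, with Sanger's GHA emerging as a special case when the criterion is quadratic. The following is a minimal NumPy sketch of such a rule, assuming a Sanger-style (lower-triangular) hierarchic feedback term and a tanh output nonlinearity as an illustrative nonquadratic choice; the function name nonlinear_gha_step, the learning rate, and the data are hypothetical and do not reproduce the paper's exact update equations.

import numpy as np

def nonlinear_gha_step(W, x, lr=0.01, g=np.tanh):
    """One gradient-type update of a hierarchic (GHA-style) PCA network.

    Illustrative sketch only: with g(t) = t this reduces to Sanger's
    generalized Hebbian algorithm; a nonquadratic g such as tanh brings
    higher-order statistics into the update, in the spirit of the
    nonlinear/robust PCA generalizations discussed in the paper.
    """
    y = W @ x                      # outputs, one per extracted component
    gy = g(y)                      # nonlinearity applied to the outputs
    # Hierarchic (Sanger-type) feedback: each unit subtracts only the
    # contributions of the preceding units (lower-triangular masking).
    fb = np.tril(np.outer(gy, y)) @ W
    W += lr * (np.outer(gy, x) - fb)
    return W

# Usage sketch: extract 3 leading components of 10-dimensional data.
rng = np.random.default_rng(0)
data = rng.standard_normal((5000, 10)) @ rng.standard_normal((10, 10))
W = 0.1 * rng.standard_normal((3, 10))
for x in data:
    W = nonlinear_gha_step(W, x, lr=0.002)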
Pages: 549-562
Number of pages: 14