Singularities affect dynamics of learning in neuromanifolds

Cited by: 79
Authors
Amari, Shun-ichi [1]
Park, Hyeyoung [1]
Ozeki, Tomoko [1]
Affiliations
[1] Kyungpook Natl Univ, Kyungpook, South Korea
DOI
10.1162/neco.2006.18.5.1007
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The parameter spaces of hierarchical systems such as multilayer perceptrons include singularities due to the symmetry and degeneration of hidden units. A parameter space forms a geometrical manifold, called the neuromanifold in the case of neural networks. Such a model is identified with a statistical model, and a Riemannian metric is given by the Fisher information matrix. However, the matrix degenerates at singularities. Such a singular structure is ubiquitous not only in multilayer perceptrons but also in Gaussian mixture probability densities, ARMA time-series models, and many other cases. The standard statistical paradigm of the Cramér-Rao theorem does not hold, and the singularity gives rise to strange behaviors in parameter estimation, hypothesis testing, Bayesian inference, model selection, and, in particular, the dynamics of learning from examples. Prevailing theories have so far paid little attention to the problems caused by singularities, relying instead on ordinary statistical theories developed for regular (nonsingular) models. Only recently have researchers remarked on the effects of singularity, and theories are now being developed. This article gives an overview of the phenomena caused by the singularities of statistical manifolds related to multilayer perceptrons and Gaussian mixtures, and demonstrates our recent results on these problems. Simple toy models are used to exhibit explicit solutions. We explain that the maximum likelihood estimator is no longer asymptotically Gaussian, because the Fisher information matrix degenerates; that model selection criteria such as AIC, BIC, and MDL fail in these models; that a smooth Bayesian prior becomes singular in such models; and that the trajectories of learning dynamics are strongly affected by the singularity, causing plateaus or slow manifolds in the parameter space. The natural gradient method is shown to perform well because it takes the singular geometrical structure into account. The generalization error and the training error are studied in several examples.
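A minimal worked illustration of the degeneracy described in the abstract (a standard toy model in this literature, sketched here in our own notation rather than taken from the paper itself): consider the two-component Gaussian mixture

\[
p(x; w, \mu) = (1 - w)\,\varphi(x) + w\,\varphi(x - \mu),
\qquad
\varphi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}.
\]

On the subset \(\{w = 0\} \cup \{\mu = 0\}\) the density collapses to \(\varphi(x)\), so the remaining parameter is unidentifiable. For example, at \(\mu = 0\),

\[
\frac{\partial}{\partial w} \log p(x; w, \mu)\Big|_{\mu = 0}
= \frac{\varphi(x - \mu) - \varphi(x)}{p(x; w, \mu)}\Big|_{\mu = 0} = 0,
\]

so the corresponding row and column of the Fisher information matrix \(G = \mathbb{E}\big[\nabla \log p \,(\nabla \log p)^{\top}\big]\) vanish and \(\det G = 0\). The Cramér-Rao bound is then undefined, and the usual Gaussian asymptotics of the maximum likelihood estimator break down, which is precisely the situation the article analyzes. Note also that the natural gradient update \(\theta \leftarrow \theta - \eta\, G^{-1} \nabla \ell(\theta)\) involves \(G^{-1}\), which does not exist on this singular set; the behavior of learning trajectories near such sets is the subject of the paper.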
Pages: 1007-1065
Number of pages: 59
References
56 entries in total
[51] Watanabe, S. Algebraic analysis for nonidentifiable learning machines [J]. Neural Computation, 2001, 13(4): 899-933.
[52] Weyl, H. On the volume of tubes [J]. American Journal of Mathematics, 1939, 61: 461-472.
[53] Wu, S.; Amari, S.; Nakahara, H. Population coding and decoding in a neural field: A computational study [J]. Neural Computation, 2002, 14(5): 999-1026.
[54] Wu, S.; Nakahara, H.; Amari, S. Population coding with correlation and an unfaithful model [J]. Neural Computation, 2001, 13(4): 775-797.
[55] Yamazaki, K. Transactions of the Institute of Electronics, Information and Communication Engineers D-II, 2002, J85-D-II: 363.
[56] Yamazaki, K.; Watanabe, S. Singularities in mixture models and upper bounds of stochastic complexity [J]. Neural Networks, 2003, 16(7): 1029-1038.