Efficient minimisation of the KL distance for the approximation of posterior conditional probabilities

Cited by: 4
Authors
Battisti, M. [1]
Burrascano, P. [1]
Pirollo, D. [1]
Affiliations
[1] Univ Perugia, Ist Elettron, I-06143 Perugia, Italy
Keywords
conditional probabilities estimate; Kullback-Leibler distance; MLP classifier training
DOI
10.1023/A:1009605310499
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The minimisation of a least-mean-squares cost function yields poor approximations in those ranges of the input variable where the quantity to be approximated takes on relatively low values. This is a problem whenever an accurate approximation is required over a wide dynamic range. The present paper addresses this problem for multilayer perceptrons trained to approximate the posterior conditional probabilities in a multi-category classification problem. A cost function derived from the Kullback-Leibler information distance measure is proposed, and a computationally light algorithm is derived for its minimisation. The effectiveness of the procedure is verified experimentally.
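As an illustration only (the paper's own derivation is not reproduced in this record), the sketch below trains a small multilayer perceptron with softmax outputs under a Kullback-Leibler / cross-entropy cost so that the outputs estimate class posterior probabilities. All layer sizes, variable names and the synthetic data are assumptions made for the example; the point it shows is that with this cost the output-layer error term reduces to (output - target), which is what keeps each gradient update computationally light.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kl_cost(targets, outputs, eps=1e-12):
    # KL(targets || outputs); with one-hot targets this is the average cross-entropy
    return np.sum(targets * (np.log(targets + eps) - np.log(outputs + eps))) / len(targets)

# Synthetic data, for illustration only: N samples, D features, K classes
N, D, H, K = 200, 4, 8, 3
X = rng.normal(size=(N, D))
labels = rng.integers(0, K, size=N)
T = np.eye(K)[labels]                       # one-hot target posteriors

# One-hidden-layer perceptron with tanh hidden units
W1 = rng.normal(scale=0.1, size=(D, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, K)); b2 = np.zeros(K)

lr = 0.1
for epoch in range(200):
    Hid = np.tanh(X @ W1 + b1)
    Y = softmax(Hid @ W2 + b2)              # network estimate of the posterior probabilities

    # With the KL / cross-entropy cost and softmax outputs, the output-layer error
    # term is simply (Y - T): no output-unit derivative is needed in the update.
    dZ2 = (Y - T) / N
    dW2 = Hid.T @ dZ2; db2 = dZ2.sum(axis=0)
    dHid = dZ2 @ W2.T * (1.0 - Hid**2)      # back-propagate through the tanh layer
    dW1 = X.T @ dHid; db1 = dHid.sum(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final KL cost:", kl_cost(T, Y))

Under a squared-error cost the same output-layer correction would additionally be scaled by the output-unit derivative, which shrinks the update precisely where the target probabilities are small; this is one common reading of why a KL-based cost behaves better over a wide dynamic range, in line with the abstract above.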
Pages: 47-55 (9 pages)
References (9)
[1] Amari, S. Backpropagation and stochastic gradient descent method. Neurocomputing, 1993, 5(4-5): 185-196.
[2] Burrascano, P. A norm selection criterion for the generalized delta rule. IEEE Transactions on Neural Networks, 1991, 2(1): 125-130.
[3] Burrascano, P. Neurocomputing, 1996, 13.
[4] Kanaya, F.; Miyake, S. Bayes statistical behavior and valid generalization of pattern classifying neural networks. IEEE Transactions on Neural Networks, 1991, 2(4): 471-475.
[5] Kullback, S. Information Theory and Statistics. 1959.
[6] Miyake, S.; Kanaya, F. A neural network approach to a Bayesian statistical decision problem. IEEE Transactions on Neural Networks, 1991, 2(5): 538-540.
[7] Ruck, D. W. IEEE Transactions on Neural Networks, 1990, 1: 296-. DOI: 10.1109/72.80266.
[8] Rumelhart, D. E. Parallel Distributed Processing: Explorations in the Microstructure of Cognition. 1987: 318.
[9] Wan, E. A. IEEE Transactions on Neural Networks, 1990, 1: 303-. DOI: 10.1109/72.80269.