Connectionist-based Dempster-Shafer evidential reasoning for data fusion

Cited by: 46
Authors
Basir, O. [1]
Karray, F. [1]
Zhu, H. W. [1]
Affiliation
[1] Univ Waterloo, Pattern Anal & Machine Intelligence Res Grp, Dept Elect & Comp Engn, Waterloo, ON N2L 3G1, Canada
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2005, Vol. 16, No. 6
Keywords
data fusion; Dempster-Shafer evidence theory (DSET); DSET-based neural network (DSETNN); neural network
DOI
10.1109/TNN.2005.853337
CLC classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Dempster-Shafer evidence theory (DSET) is a popular paradigm for dealing with uncertainty and imprecision. Its corresponding evidential reasoning framework is theoretically attractive. However, there are outstanding issues that hinder its use in real-life applications. Two prominent issues in this regard are 1) the issue of basic probability assignments (masses) and 2) the issue of dependence among information sources. This paper attempts to deal with these issues by utilizing neural networks in the context of pattern classification applications. First, a multilayer perceptron neural network with the mean squared error as a cost function is implemented to estimate, for each information source, a posteriori probabilities for all classes. Second, an evidence structure construction scheme is developed for transforming the estimated a posteriori probabilities into a set of masses along with the corresponding focal elements, from a Bayesian decision point of view. Third, a network realization of the Dempster-Shafer evidential reasoning is designed and analyzed, and it is further extended to a DSET-based neural network, referred to as DSETNN, to manipulate the evidence structures. In order to tackle the issue of dependence between sources, DSETNN is tuned for optimal performance through a supervised learning process. To demonstrate the effectiveness of the proposed approach, we apply it to three benchmark pattern classification problems. Experiments reveal that DSETNN outperforms DSET and provides encouraging results in terms of classification accuracy and the speed of learning convergence.
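For illustration only, the Python sketch below shows the two basic ingredients the abstract refers to: turning one source's class posteriors (e.g., MLP outputs) into a basic probability assignment, and fusing two sources with Dempster's rule of combination. The mass-construction step here is a simple placeholder scheme, not the paper's Bayesian-decision-based evidence structure construction, and the function names are assumptions.

# Minimal sketch (assumed, not from the paper): fusing two classifiers'
# posterior probabilities with Dempster's rule of combination.

def posteriors_to_masses(posteriors, classes, discount=0.9):
    """Turn one source's per-class posteriors into a basic probability
    assignment. Placeholder scheme: discounted singleton masses, with the
    residual mass assigned to the whole frame of discernment."""
    masses = {frozenset([c]): discount * p for c, p in zip(classes, posteriors)}
    masses[frozenset(classes)] = 1.0 - discount
    return masses

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions on the same frame."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb   # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {focal: w / (1.0 - conflict) for focal, w in combined.items()}

# Toy usage: fuse two sources over a three-class frame and report the
# focal element carrying the largest combined mass.
classes = ("c1", "c2", "c3")
fused = dempster_combine(posteriors_to_masses([0.7, 0.2, 0.1], classes),
                         posteriors_to_masses([0.6, 0.3, 0.1], classes))
print(max(fused, key=fused.get), fused)

The normalization by 1 - conflict in Dempster's rule assumes independent bodies of evidence, which is precisely the dependence issue the abstract says the supervised tuning of DSETNN is meant to address.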
Pages: 1513-1530
Number of pages: 18
Related papers
46 in total
[21] Hecht-Nielsen, R. Proceedings of the International Joint Conference on Neural Networks, 1989, Vol. 3, p. 11.
[22] Islam, M.; Yao, X.; Murase, K. A constructive algorithm for training cooperative neural network ensembles. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14(4): 820-834.
[23] Klir, G. J.; Parviz, B. Probability-possibility transformations: a comparison. INTERNATIONAL JOURNAL OF GENERAL SYSTEMS, 1992, 21(3): 291-310.
[24] Le Hegarat-Mascle, S.; Bloch, I.; Vidal-Madjar, D. Introduction of neighborhood information in evidence theory and application to data fusion of radar and optical images with partial cloud cover. PATTERN RECOGNITION, 1998, 31(11): 1811-1823.
[25] Le Hegarat-Mascle, S.; Bloch, I.; Vidal-Madjar, D. Application of Dempster-Shafer evidence theory to unsupervised classification in multisource remote sensing. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 1997, 35(4): 1018-1031.
[26] Poggio, T. Rep. 1140, MIT, 1988.
[27] Poirazi, P.; Neocleous, C.; Pattichis, C. S.; Schizas, C. N. Classification capacity of a modular neural network implementing neurally inspired architecture and training rules. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2004, 15(3): 597-612.
[28] Richard, M. D.; Lippmann, R. P. Neural network classifiers estimate Bayesian a posteriori probabilities. NEURAL COMPUTATION, 1991, 3(4): 461-483.
[29] Rombaut, M.; Zhu, Y. M. Study of Dempster-Shafer theory for image segmentation applications. IMAGE AND VISION COMPUTING, 2002, 20(1): 15-23.
[30] Rumelhart, D. E. Parallel Distributed Processing, Vol. 1, 1986. DOI: 10.7551/MITPRESS/5236.001.0001.