Divergence based feature selection for multimodal class densities

Cited by: 67
Authors
Novovicova, J [1 ]
Pudil, P [1 ]
Kittler, J [1 ]
Affiliations
[1] University of Surrey, Department of Electronic and Electrical Engineering, Guildford GU2 5XH, Surrey, England
Keywords
feature selection; feature ordering; mixture distribution; maximum likelihood; EM algorithm; Kullback J-divergence
DOI
10.1109/34.481557
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
A new feature selection procedure is presented, based on the Kullback J-divergence between two class-conditional density functions approximated by a finite mixture of parameterized densities of a special type. The procedure is especially suited to multimodal data. Besides finding a feature subset of any cardinality without involving any search procedure, it simultaneously yields a pseudo-Bayes decision rule. Its performance is tested on real data.
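The core idea can be sketched in a few lines. The snippet below is an illustrative simplification, not the paper's method: it models each feature by a single univariate Gaussian per class (where the paper uses finite mixtures of special parameterized densities), and ranks features by their individual Kullback J-divergence, i.e. the symmetrized Kullback-Leibler divergence between the two class-conditional densities. All function names are assumptions introduced for this sketch.

```python
import math

def gaussian_kl(m1, s1, m2, s2):
    # KL divergence KL(p || q) between univariate Gaussians
    # p = N(m1, s1^2) and q = N(m2, s2^2), in closed form.
    return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def j_divergence(m1, s1, m2, s2):
    # Kullback J-divergence: the symmetrized KL divergence
    # J(p, q) = KL(p || q) + KL(q || p).
    return gaussian_kl(m1, s1, m2, s2) + gaussian_kl(m2, s2, m1, s1)

def rank_features(class0, class1):
    # class0, class1: lists of samples, each sample a list of feature values.
    # Fit a univariate Gaussian per feature and per class, then order the
    # features by decreasing J-divergence (most discriminative first).
    def stats(data, j):
        vals = [x[j] for x in data]
        m = sum(vals) / len(vals)
        v = sum((x - m) ** 2 for x in vals) / len(vals)
        return m, (math.sqrt(v) or 1e-12)  # guard against zero variance

    n_features = len(class0[0])
    scores = []
    for j in range(n_features):
        m0, s0 = stats(class0, j)
        m1_, s1_ = stats(class1, j)
        scores.append((j_divergence(m0, s0, m1_, s1_), j))
    return sorted(scores, reverse=True)
```

Because the per-feature scores are computed independently and then sorted, a subset of any cardinality k is obtained by taking the top k features of the ordering, with no combinatorial search; this mirrors the search-free property the abstract claims, though the paper achieves it with mixture densities rather than single Gaussians.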
Pages: 218-223 (6 pages)
References
11 total
[1] [Anonymous], 1982, Pattern Recognition: A Statistical Approach.
[2] Boekee D. E., Van der Lubbe J. C. A., "Some aspects of error bounds in feature selection," Pattern Recognition, 1979, 11(5-6): 353-360.
[3] Dempster A. P., Laird N. M., Rubin D. B., "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, Series B (Methodological), 1977, 39(1): 1-38.
[4] Duda R. O., 1973, Pattern Classification and Scene Analysis.
[5] Grim J., 1986, Kybernetika, 22: 142.
[6] Grim J., 1982, Kybernetika, 18: 173.
[7] Jain A. K., 1987, Pattern Recognition, p. 1.
[8] Pudil P., Novovicova J., Kittler J., "Floating search methods in feature selection," Pattern Recognition Letters, 1994, 15(11): 1119-1125.
[9] Pudil P., Novovicova J., Kittler J., "Simultaneous learning of decision rules and important attributes for classification problems in image analysis," Image and Vision Computing, 1994, 12(3): 193-198.
[10] Pudil P., 1993, Proc. BMVC '93, Vol. 1, p. 15.