Feature selection using Joint Mutual Information Maximisation

Cited by: 526
Authors
Bennasar, Mohamed [1 ]
Hicks, Yulia [1 ]
Setchi, Rossitza [1 ]
Affiliations
[1] Cardiff Univ, Sch Engn, Cardiff CF24 3AA, S Glam, Wales
Keywords
Feature selection; Mutual information; Joint mutual information; Conditional mutual information; Subset feature selection; Classification; Dimensionality reduction; Feature selection stability; Relevance; Algorithm
DOI
10.1016/j.eswa.2015.07.007
Chinese Library Classification
TP18 (Theory of artificial intelligence)
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Feature selection is used in many application areas relevant to expert and intelligent systems, such as data mining and machine learning, image processing, anomaly detection, bioinformatics and natural language processing. Feature selection based on information theory is a popular approach due to its computational efficiency, scalability with respect to dataset dimensionality, and independence from the classifier. Common drawbacks of this approach are the lack of information about the interaction between the features and the classifier, and the selection of redundant and irrelevant features. The latter is due to limitations of the employed goal functions, which lead to overestimation of feature significance. To address this problem, this article introduces two new nonlinear feature selection methods, namely Joint Mutual Information Maximisation (JMIM) and Normalised Joint Mutual Information Maximisation (NJMIM); both methods use mutual information and the 'maximum of the minimum' criterion, which alleviates the problem of overestimating feature significance, as demonstrated both theoretically and experimentally. The proposed methods are compared with five competing methods on eleven publicly available datasets. The results demonstrate that JMIM outperforms the other methods on most of the tested datasets, reducing the relative average classification error by almost 6% compared with the next best performing method. The statistical significance of the results is confirmed by an ANOVA test. Moreover, the method offers the best trade-off between accuracy and stability. (C) 2015 The Authors. Published by Elsevier Ltd.
Pages: 8520-8532 (13 pages)
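To make the selection rule described in the abstract concrete, below is a minimal Python sketch of the JMIM criterion: greedy forward selection that seeds with the individually most informative feature and then adds the candidate maximising the minimum joint mutual information I(f_i, f_s; C) over the already-selected features f_s. This is an illustration under my own assumptions, not the authors' implementation: features and the class are assumed discrete (continuous features would need discretisation first), entropies use the plug-in estimator, and the names entropy, mi, joint_mi, and jmim are hypothetical.

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Empirical Shannon entropy (bits) of the joint distribution
    of the given discrete columns (plug-in estimator)."""
    counts = np.array(list(Counter(zip(*cols)).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mi(f, c):
    """Mutual information I(f; C) = H(f) + H(C) - H(f, C)."""
    return entropy(f) + entropy(c) - entropy(f, c)

def joint_mi(fi, fs, c):
    """Joint mutual information of a feature pair with the class:
    I(f_i, f_s; C) = H(f_i, f_s) + H(C) - H(f_i, f_s, C)."""
    return entropy(fi, fs) + entropy(c) - entropy(fi, fs, c)

def jmim(X, y, k):
    """Greedy JMIM: seed with the feature of maximal I(f; C), then
    repeatedly add the candidate whose minimum joint MI with the
    already-selected features is largest ('maximum of the minimum')."""
    candidates = set(range(X.shape[1]))
    selected = [max(candidates, key=lambda j: mi(X[:, j], y))]
    candidates.remove(selected[0])
    while candidates and len(selected) < k:
        best = max(candidates,
                   key=lambda j: min(joint_mi(X[:, j], X[:, s], y)
                                     for s in selected))
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy usage on a small discretised dataset (values are arbitrary labels).
X = np.array([[0, 1, 0],
              [1, 1, 0],
              [0, 0, 1],
              [1, 0, 1],
              [0, 1, 1],
              [1, 0, 0]])
y = np.array([0, 1, 0, 1, 0, 1])
print(jmim(X, y, k=2))  # indices of the two selected features
```

The 'maximum of the minimum' step is what distinguishes JMIM from methods that sum or average pairwise terms: a candidate is scored by its weakest pairing with the selected set, so a feature cannot look good merely by being strongly complementary to one already-chosen feature while being redundant with the rest.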