Variable selection with neural networks

Cited by: 56
Authors
Cibas, T
Soulie, FF
Gallinari, P
Raudys, S
Affiliations
[1] UNIV PARIS 06, IBP, LAFORIA, F-75252 PARIS 05, FRANCE
[2] UNIV PARIS 11, LRI, F-91405 ORSAY, FRANCE
[3] SLIGOS, F-92142 CLAMART, FRANCE
[4] INST MATH INFORMAT, DEPT DATA ANAL, LT-2600 VILNIUS, LITHUANIA
Keywords
variable selection; regularization; neural network pruning; dimensionality reduction;
DOI
10.1016/0925-2312(95)00121-2
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we present three neural network-based methods for variable selection. OCD (Optimal Cell Damage) is a pruning method that evaluates the usefulness of each input variable and prunes the least useful ones; it is related to the Optimal Brain Damage method of Le Cun et al. Regularization theory proposes to constrain estimators by adding a term to the cost function used to train a neural network. In the Bayesian framework, this additional term can be interpreted as the log of the prior distribution over the weights. We propose two priors (a Gaussian and a Gaussian mixture) and show that this regularization approach allows efficient subsets of variables to be selected. Our methods are compared with conventional statistical selection procedures and are shown to improve on them significantly.
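The abstract's two ideas can be illustrated together: a Gaussian prior over the weights corresponds to an L2 (weight-decay) penalty on the training cost, and the usefulness of an input variable can be scored from the trained weights fanning out of it. The sketch below is illustrative only, assuming a toy single-hidden-layer network and a simple squared-fan-out saliency; it is not the authors' exact OCD procedure or their mixture prior.

```python
import numpy as np

# Illustrative sketch (not the paper's exact method): train a small network
# with an L2 penalty -- the negative log of a Gaussian weight prior, up to a
# constant -- then rank input variables by a crude saliency score.

rng = np.random.default_rng(0)

# Synthetic data: only the first 2 of 5 inputs carry signal.
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

n_in, n_hid = 5, 4
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
W2 = rng.normal(scale=0.5, size=(n_hid, 1))
lam = 1e-3  # weight-decay strength (Gaussian prior precision, assumed value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    H = np.tanh(X @ W1)           # hidden activations
    p = sigmoid(H @ W2)[:, 0]     # predicted probability
    err = (p - y)[:, None]        # d(cross-entropy)/d(output logit)
    gW2 = H.T @ err / len(X) + lam * W2   # decay term = gradient of L2 penalty
    gH = err @ W2.T * (1 - H**2)          # backprop through tanh
    gW1 = X.T @ gH / len(X) + lam * W1
    W1 -= 0.5 * gW1
    W2 -= 0.5 * gW2

# Crude variable saliency: total squared weight leaving each input.
# The least salient inputs would be candidates for pruning.
saliency = (W1 ** 2).sum(axis=1)
ranking = np.argsort(saliency)[::-1]
print(ranking)  # the informative inputs are expected to rank near the top
```

Under the Gaussian prior, weights on uninformative inputs are driven toward zero by the decay term, so their saliency shrinks; the paper's contribution is making this selection principled (OCD saliencies, mixture priors) rather than relying on such an ad hoc score.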
Pages: 223-248
Page count: 26