Attribute reduction in decision-theoretic rough set models

Cited by: 549
Authors
Yao, Yiyu [1 ]
Zhao, Yan [1 ]
Affiliations
[1] Univ Regina, Dept Comp Sci, Regina, SK S4S 0A2, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
attribute reduction; decision-theoretic rough set model; Pawlak rough set model;
DOI
10.1016/j.ins.2008.05.010
CLC number
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
Rough set theory can be applied to rule induction. There are two different types of classification rules, positive and boundary rules, leading to different decisions and consequences. They can be distinguished not only by syntactic measures such as confidence, coverage and generality, but also by semantic measures such as decision-monotonicity, cost and risk. The classification rules can be evaluated locally for each individual rule, or globally for a set of rules. Both types of classification rules can be generated from, and interpreted by, a decision-theoretic model, which is a probabilistic extension of the Pawlak rough set model. As an important concept of rough set theory, an attribute reduct is a subset of attributes that are jointly sufficient and individually necessary for preserving a particular property of the given information table. This paper addresses attribute reduction in decision-theoretic rough set models with respect to different classification properties, such as decision-monotonicity, confidence, coverage, generality and cost. It is important to note that many of these properties can be faithfully reflected by a single measure, gamma, in the Pawlak rough set model, whereas they need to be considered separately in probabilistic models; a straightforward extension of the gamma measure is unable to evaluate these properties. This study provides a new insight into the problem of attribute reduction. Crown Copyright (c) 2008 Published by Elsevier Inc. All rights reserved.
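For orientation, a minimal sketch of the measures referred to in the abstract, written in standard rough set notation as commonly found in the literature (not quoted from the paper itself): C denotes the condition attributes, D the decision, U the universe of objects, [x]_C the equivalence class of x, and alpha, beta the DTRS acceptance and rejection thresholds.

\[
  \gamma_{C}(D) \;=\; \frac{|\mathrm{POS}_{C}(D)|}{|U|},
  \qquad
  \mathrm{POS}_{C}(D) \;=\; \bigcup_{X \in U/D} \underline{\mathrm{apr}}_{C}(X),
\]
\[
  \mathrm{POS}_{(\alpha,\beta)}(D) \;=\; \bigcup_{X \in U/D}
  \{\, x \in U : \Pr(X \mid [x]_{C}) \ge \alpha \,\},
  \qquad 0 \le \beta < \alpha \le 1 .
\]

Under this reading, an attribute reduct R of C is a subset that preserves the chosen property (for example, the positive region, or one of the rule measures named in the abstract) while no proper subset of R does, which is the sense in which the paper speaks of "jointly sufficient and individually necessary."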
Pages: 3356-3373
Number of pages: 18