IMPROVING GENERALIZATION WITH ACTIVE LEARNING

Cited by: 455
Authors
COHN, D
ATLAS, L
LADNER, R
Affiliations
[1] UNIV WASHINGTON, DEPT ELECT ENGN, SEATTLE, WA 98195
[2] UNIV WASHINGTON, DEPT COMP SCI & ENGN, SEATTLE, WA 98195
Keywords
QUERIES; ACTIVE LEARNING; GENERALIZATION; VERSION SPACE; NEURAL NETWORKS
DOI
10.1023/A:1022673506211
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Active learning differs from "learning from examples" in that the learning algorithm assumes at least some control over what part of the input domain it receives information about. In some situations, active learning is provably more powerful than learning from examples alone, giving better generalization for a fixed number of training examples. In this article, we consider the problem of learning a binary concept in the absence of noise. We describe a formalism for active concept learning called selective sampling and show how it may be approximately implemented by a neural network. In selective sampling, a learner receives distribution information from the environment and queries an oracle on parts of the domain it considers "useful." We test our implementation, called an SG-network, on three domains and observe significant improvement in generalization.
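The selective-sampling loop the abstract describes can be made concrete with a short sketch. The Python below is a minimal illustration, not the authors' SG-network: where the paper bounds the region of uncertainty with a "most specific" and a "most general" network, this sketch approximates that region with a small committee of perceptrons whose disagreement marks candidate queries. The target concept (a half-plane), the pool sizes, and helper names such as train_perceptron and oracle are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def with_bias(X):
    # Append a constant bias feature.
    return np.hstack([X, np.ones((len(X), 1))])

def train_perceptron(X, y, epochs=50):
    # Plain perceptron on {0,1} labels; the hyperplane it settles on
    # depends on the random initial weights and the example order.
    w = rng.normal(size=X.shape[1] + 1)
    Xb = with_bias(X)
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            w += (yi - float(xi @ w > 0)) * xi
    return w

def predict(w, X):
    return (with_bias(X) @ w > 0).astype(float)

# Noise-free binary target concept: a half-plane (a hypothetical
# stand-in for the paper's test domains).
true_w = np.array([1.0, -2.0, 0.3])
oracle = lambda X: (with_bias(X) @ true_w > 0).astype(float)

# "Distribution information": an unlabelled pool from the input domain.
X_pool = rng.uniform(-1, 1, size=(2000, 2))
X_train = rng.uniform(-1, 1, size=(5, 2))   # small random seed set
y_train = oracle(X_train)

for _ in range(20):                          # 20 queries to the oracle
    # Approximate the region of uncertainty with a 5-member committee;
    # members differ only in initialization and example order.
    committee = []
    for _ in range(5):
        idx = rng.permutation(len(X_train))
        committee.append(train_perceptron(X_train[idx], y_train[idx]))
    votes = np.stack([predict(w, X_pool) for w in committee])
    # Query the pool point the committee disagrees on most (the
    # "useful" part of the domain).
    q = int(np.argmax(votes.std(axis=0)))
    X_train = np.vstack([X_train, X_pool[q:q + 1]])
    y_train = np.append(y_train, oracle(X_pool[q:q + 1]))

w = train_perceptron(X_train, y_train)
X_test = rng.uniform(-1, 1, size=(5000, 2))
print("test error:", np.mean(predict(w, X_test) != oracle(X_test)))

Because queries concentrate where the current hypotheses disagree, labels cluster near the decision boundary rather than being spread uniformly over the domain, which is the intuition behind the generalization gains the paper reports.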
Pages: 201-221
Page count: 21