A RESULT OF VAPNIK WITH APPLICATIONS

Cited by: 36
Authors
ANTHONY, M [1 ]
SHAWETAYLOR, J [1 ]
Affiliations
[1] UNIV LONDON ROYAL HOLLOWAY & BEDFORD NEW COLL, DEPT COMP SCI, EGHAM TW20 0EX, SURREY, ENGLAND
DOI
10.1016/0166-218X(93)90126-9
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
A new proof of a result due to Vapnik is given. Its implications for the theory of PAC learnability are discussed, with particular reference to the learnability of functions taking values in a countable set. An application to the theory of artificial neural networks is then given.
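For context, the result of Vapnik in question is presumably his bound on the relative uniform deviation of empirical error frequencies from true error probabilities; the following is only a sketch of the kind of inequality involved, in notation (er_P, \hat{er}_x, \Pi_H, \alpha, m) introduced here for illustration rather than taken from the record:

\[
\Pr\Bigl\{\,\exists\, h \in H :\; \frac{\mathrm{er}_P(h) - \widehat{\mathrm{er}}_{\mathbf{x}}(h)}{\sqrt{\mathrm{er}_P(h)}} > \alpha \Bigr\}
\;\le\; 4\,\Pi_H(2m)\,\exp\!\left(-\frac{\alpha^2 m}{4}\right),
\]

where m is the sample size, \widehat{\mathrm{er}}_{\mathbf{x}}(h) the empirical error of hypothesis h on the sample \mathbf{x}, \mathrm{er}_P(h) its true error under the distribution P, and \Pi_H the growth function of the hypothesis space H. Setting \widehat{\mathrm{er}}_{\mathbf{x}}(h) = 0 and \alpha = \sqrt{\epsilon} in a bound of this form controls the probability that a consistent hypothesis has error above \epsilon, which is how such inequalities feed into PAC sample-size estimates of order (1/\epsilon)(d \log(1/\epsilon) + \log(1/\delta)) for classes of VC dimension d; the precise statement, constants, and conditions should be taken from the paper itself.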
Pages: 207-217
Page count: 11
References (16 in total)
[1] Anthony, M. (1990). COLT 90, p. 246.
[2] Baum, E.B. and Haussler, D. (1989). What size net gives valid generalization? Neural Computation, 1(1), 151-160.
[3] Biggs, N. (in press). DISCRETE MA
[4] Blumer, A., Ehrenfeucht, A., Haussler, D. and Warmuth, M.K. (1989). Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM, 36(4), 929-965.
[5] Haussler, D. (1992). Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation, 100(1), 78-150.
[6] Haussler, D. (1988). Quantifying inductive bias: AI learning algorithms and Valiant's learning framework. Artificial Intelligence, 36(2), 177-221.
[7] Haussler, D. (1989). UCSC-CRL-89-30, U CAL CO
[8] McClelland, J.L. (1986). PARALLEL DISTRIBUTED, Vol. 1.
[9] Natarajan, B.K. (1989). Machine Learning, 4, 67. DOI 10.1023/A:1022605311895
[10] Pollard, D. (1984). CONVERGENCE STOCHAST