Computational sample complexity

Cited: 13
Authors
Decatur, SE [1]
Goldreich, O
Ron, D
Affiliations
[1] Rutgers State Univ, DIMACS Ctr, Piscataway, NJ 08855 USA
[2] MIT, Comp Sci Lab, Cambridge, MA 02139 USA
Keywords
computational learning theory; information vs. efficient computation; pseudorandom functions; error correcting codes; wire-tap channel;
DOI
10.1137/S0097539797325648
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
In a variety of PAC learning models, a trade-off between time and information seems to exist: with unlimited time, a small amount of information suffices, but with time restrictions, more information sometimes seems to be required. In addition, it has long been known that there are concept classes that can be learned in the absence of computational restrictions, but (under standard cryptographic assumptions) cannot be learned in polynomial time (regardless of sample size). Yet, these results do not answer the question of whether there are classes for which learning from a small set of examples is computationally infeasible, but becomes feasible when the learner has access to (polynomially) more examples. To address this question, we introduce a new measure of learning complexity called computational sample complexity that represents the number of examples sufficient for polynomial time learning with respect to a fixed distribution. We then show concept classes that (under similar cryptographic assumptions) possess arbitrarily sized gaps between their standard (information-theoretic) sample complexity and their computational sample complexity. We also demonstrate such gaps for learning from membership queries and learning from noisy examples.
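A rough formalization of the measure introduced in the abstract, for illustration only (the symbols m, epsilon, delta, the class/distribution pair (C_n, D_n), and the shorthand names SC and CSC are notation introduced here, not quoted from the paper):

% Information-theoretic sample complexity SC: the least number m of examples,
% drawn i.i.d. from the fixed distribution D_n and labeled by a target concept
% in C_n, from which SOME learner (with no bound on running time) outputs,
% with probability at least 1 - \delta, a hypothesis of error at most \epsilon.
% Computational sample complexity CSC: the least m for which a learner running
% in time polynomial in n, 1/\epsilon, 1/\delta achieves the same guarantee.
% Every polynomial-time learner is in particular a learner, so
\[
  \mathrm{SC}(\mathcal{C}_n, \mathcal{D}_n, \epsilon, \delta)
  \;\le\;
  \mathrm{CSC}(\mathcal{C}_n, \mathcal{D}_n, \epsilon, \delta).
\]
% The results summarized above (under cryptographic assumptions) exhibit
% concept classes for which the gap between these two quantities can be made
% arbitrarily large, i.e., learning becomes feasible only with polynomially
% more examples than are information-theoretically necessary.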
Pages: 854-879
Page count: 26