Algorithmic stability and sanity-check bounds for leave-one-out cross-validation

Cited by: 325
Authors
Kearns, M. [1]
Ron, D. [2]
Affiliations
[1] AT&T Labs Res, Florham Park, NJ 07932 USA
[2] Tel Aviv Univ, Dept EE Syst, IL-69978 Ramat Aviv, Israel
DOI: 10.1162/089976699300016304
Chinese Library Classification (CLC): TP18 [Theory of Artificial Intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
In this article we prove sanity-check bounds for the error of the leave-one-out cross-validation estimate of the generalization error: that is, bounds showing that the worst-case error of this estimate is not much worse than that of the training error estimate. The name sanity check refers to the fact that although we often expect the leave-one-out estimate to perform considerably better than the training error estimate, we are here only seeking assurance that its performance will not be considerably worse. Perhaps surprisingly, such assurance has been given only for limited cases in the prior literature on cross-validation. Any nontrivial bound on the error of leave-one-out must rely on some notion of algorithmic stability. Previous bounds relied on the rather strong notion of hypothesis stability, whose application was primarily limited to nearest-neighbor and other local algorithms. Here we introduce the new and weaker notion of error stability and apply it to obtain sanity-check bounds for leave-one-out for other classes of learning algorithms, including training error minimization procedures and Bayesian algorithms. We also provide lower bounds demonstrating the necessity of some form of error stability for proving bounds on the error of the leave-one-out estimate, and the fact that for training error minimization algorithms, in the worst case such bounds must still depend on the Vapnik-Chervonenkis dimension of the hypothesis class.
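To make the contrast between the two estimators concrete, the sketch below computes both for a 1-nearest-neighbor rule, one of the local algorithms for which hypothesis-stability bounds were previously known. This is a minimal illustration, not code from the paper; the helper names (nn_predict, training_error, leave_one_out_error) and the synthetic data are ours.

    import numpy as np

    def nn_predict(train_x, train_y, query_x):
        # 1-nearest-neighbor rule: label each query point with the label
        # of its closest training point (squared Euclidean distance).
        d = ((query_x[:, None, :] - train_x[None, :, :]) ** 2).sum(axis=2)
        return train_y[d.argmin(axis=1)]

    def training_error(x, y):
        # Resubstitution (training error) estimate: train and evaluate
        # on the same sample.
        return float(np.mean(nn_predict(x, y, x) != y))

    def leave_one_out_error(x, y):
        # Leave-one-out estimate: hold out each point in turn, retrain
        # on the remaining n-1 points, and test on the held-out point.
        n = len(y)
        mistakes = 0
        for i in range(n):
            mask = np.arange(n) != i
            pred = nn_predict(x[mask], y[mask], x[i:i + 1])[0]
            mistakes += int(pred != y[i])
        return mistakes / n

    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 2))
    # Labels determined by the first coordinate plus noise, so the
    # Bayes error is nonzero and the LOO estimate is informative.
    y = (x[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

    print("training error   :", training_error(x, y))   # 0.0 for 1-NN
    print("leave-one-out est:", leave_one_out_error(x, y))

For 1-NN the training error is identically zero (each point is its own nearest neighbor), while the leave-one-out estimate lands near the label-noise rate; this is exactly the setting in which a sanity-check bound for leave-one-out, rather than the training error alone, is worth having.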
Published in: Neural Computation, 1999, 11(6)
Pages: 1427-1453 (27 pages)