The Perceptron Algorithm is Fast for Nonmalicious Distributions

Cited by: 34
Author
Baum, Eric B. [1]
Affiliation
[1] NEC Res Inst, 4 Independence Way, Princeton, NJ 08540 USA
DOI
10.1162/neco.1990.2.2.248
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Within the context of Valiant's protocol for learning, the perceptron algorithm is shown to learn an arbitrary half-space in time O(n^2/ε^3) if D, the probability distribution of examples, is taken uniform over the unit sphere S^n. Here ε is the accuracy parameter. This is surprisingly fast, as "standard" approaches involve solving a linear programming problem with Ω(n/ε) constraints in n dimensions. A modification of Valiant's distribution-independent protocol for learning is proposed in which the distribution and the function to be learned may be chosen by adversaries; however, these adversaries may not communicate. It is argued that this definition is more reasonable and more applicable to real-world learning than Valiant's. Under this definition, the perceptron algorithm is shown to be a distribution-independent learning algorithm. In an appendix we show that, for uniform distributions, some classes of infinite VC dimension, including convex sets and a class of nested differences of convex sets, are learnable.
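As a rough illustration of the setting the abstract describes, the following sketch runs the classic perceptron mistake-driven update on examples drawn uniformly from the unit sphere, labeled by a hidden half-space. This is not the paper's analysis or code; the dimension, sample size, pass count, and variable names are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5      # dimension (hypothetical choice)
m = 200    # number of training examples (hypothetical choice)

# Draw examples uniformly on the unit sphere S^n by normalizing Gaussians.
X = rng.normal(size=(m, n))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Hidden target half-space {x : <w_star, x> >= 0}, chosen at random.
w_star = rng.normal(size=n)
y = np.sign(X @ w_star)

# Perceptron rule: on each misclassified example, add y_i * x_i to w.
w = np.zeros(n)
for _ in range(500):                 # passes over the data
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:
            w += yi * xi
            mistakes += 1
    if mistakes == 0:                # converged: all examples separated
        break

accuracy = np.mean(np.sign(X @ w) == y)
```

Because the labels come from a true half-space, the data are linearly separable and the perceptron's training accuracy approaches 1 as the passes proceed; the paper's contribution is bounding the *time* this takes under benign distributions such as the uniform one sampled here.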
Pages: 248-260
Page count: 13