Universal learning curves of support vector machines

Cited by: 30
Authors
Opper, M [1]
Urbanczik, R
Affiliations
[1] Aston Univ, Dept Comp Sci & Appl Math, Birmingham B4 7ET, W Midlands, England
[2] Univ Wurzburg, Inst Theoret Phys, D-97074 Wurzburg, Germany
Keywords
Computational complexity - Error analysis - Gaussian noise (electronic) - Learning systems - Neural networks - Oscillations
DOI
10.1103/PhysRevLett.86.4410
Chinese Library Classification (CLC)
O4 [Physics]
Subject classification code
0702
Abstract
Using methods of statistical physics, we investigate the role of model complexity in learning with support vector machines (SVMs), which are an important alternative to neural networks. We show the advantages of using SVMs with kernels of infinite complexity on noisy target rules, which, in contrast to common theoretical beliefs, are found to achieve optimal generalization error even though the training error does not converge to the generalization error. Moreover, we find a universal asymptotic form of the learning curves that depends only on the target rule and not on the SVM kernel.
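The central objects of the abstract are SVM learning curves, i.e. how the generalization error falls as the number of training examples grows on a noisy target rule. As a concrete illustration (not the paper's analysis), the following minimal sketch estimates such a curve empirically with scikit-learn: an SVM with an RBF kernel is trained on data from a random teacher perceptron whose labels are flipped with a fixed noise probability, and its generalization error is measured on a held-out set for increasing training-set sizes. All names and parameter values are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): empirical SVM learning curve on a
# noisy target rule. Teacher = random perceptron; labels flipped with prob. `noise`.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
dim = 20            # input dimension (assumed)
noise = 0.1         # label-flip probability; best achievable error equals this
teacher = rng.standard_normal(dim)

def sample(n):
    """Draw n Gaussian inputs and noisy teacher labels."""
    x = rng.standard_normal((n, dim))
    y = np.sign(x @ teacher)
    flip = rng.random(n) < noise
    y[flip] *= -1
    return x, y

# Large held-out set to estimate the generalization error.
x_test, y_test = sample(5000)

for n_train in (50, 100, 200, 400, 800):
    errs = []
    for _ in range(10):  # average over independent training sets
        x_tr, y_tr = sample(n_train)
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(x_tr, y_tr)
        errs.append(np.mean(clf.predict(x_test) != y_test))
    print(f"n = {n_train:4d}  generalization error ~ {np.mean(errs):.3f}")
```

As the training-set size grows, the estimated generalization error should approach the noise level of the target rule, giving a rough empirical picture of the learning-curve behavior the paper analyzes with statistical physics methods.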
Pages: 4410-4413
Number of pages: 4