Incremental online learning in high dimensions

Cited by: 343
Authors
Vijayakumar, S. [1]
D'Souza, A.
Schaal, S.
Affiliations
[1] Univ Edinburgh, Sch Informat, Edinburgh EH9 3JZ, Midlothian, Scotland
[2] Univ So Calif, Dept Comp Sci, Los Angeles, CA 90089 USA
DOI
10.1162/089976605774320557
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Locally weighted projection regression (LWPR) is a new algorithm for incremental nonlinear function approximation in high-dimensional spaces with redundant and irrelevant input dimensions. At its core, it employs nonparametric regression with locally linear models. To stay computationally efficient and numerically robust, each local model performs the regression analysis with a small number of univariate regressions in selected directions in input space, in the spirit of partial least squares regression. We discuss when and how local learning techniques can successfully work in high-dimensional spaces and review the various techniques for local dimensionality reduction before finally deriving the LWPR algorithm. The properties of LWPR are that it (1) learns rapidly with second-order learning methods based on incremental training, (2) uses statistically sound stochastic leave-one-out cross-validation for learning without the need to memorize training data, (3) adjusts its weighting kernels based only on local information in order to minimize the danger of negative interference of incremental learning, (4) has a computational complexity that is linear in the number of inputs, and (5) can deal with a large number of possibly redundant inputs, as shown in various empirical evaluations with up to 90-dimensional data sets. For a probabilistic interpretation, predictive variance and confidence intervals are derived. To our knowledge, LWPR is the first truly incremental, spatially localized learning method that can successfully and efficiently operate in very high-dimensional spaces.
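
The prediction scheme described in the abstract, locally linear models blended by weighting kernels, can be illustrated with a minimal Python sketch. This is a hedged illustration only: the function `lwpr_style_predict` and its parameters (`centers`, `metrics`, `slopes`, `offsets`) are assumed names for the sketch, not the authors' LWPR implementation, and the PLS-style univariate projection regressions inside each local model are omitted for brevity.

```python
import numpy as np

def lwpr_style_predict(x, centers, metrics, slopes, offsets):
    """Blend local linear predictions with Gaussian receptive-field weights.

    Hypothetical sketch: each receptive field k holds a center c_k, a
    distance metric D_k, and a locally linear model (beta_k, b_k); the
    prediction is the kernel-weighted average of the local models.
    """
    num, den = 0.0, 0.0
    for c_k, D_k, beta_k, b_k in zip(centers, metrics, slopes, offsets):
        diff = x - c_k
        w_k = np.exp(-0.5 * diff @ D_k @ diff)  # receptive-field activation
        y_k = b_k + beta_k @ diff               # local linear model around c_k
        num += w_k * y_k
        den += w_k
    return num / max(den, 1e-12)                # guard against zero total weight

# Toy usage: two receptive fields on a one-dimensional input.
if __name__ == "__main__":
    centers = [np.array([0.0]), np.array([1.0])]
    metrics = [np.eye(1) * 4.0, np.eye(1) * 4.0]
    slopes = [np.array([1.0]), np.array([-1.0])]
    offsets = [0.0, 1.0]
    print(lwpr_style_predict(np.array([0.5]), centers, metrics, slopes, offsets))
```

In the full algorithm the metrics D_k are themselves adapted online from purely local information (property 3 above), which is what this fixed-metric sketch leaves out.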
Pages: 2602-2634
Number of pages: 33