Incremental least squares methods and the extended Kalman filter

Cited by: 82
Author
Bertsekas, DP
Affiliation
[1] Dept. of Elec. Eng. and Comp. Sci., Massachusetts Inst. of Technology, Cambridge
Keywords
optimization; least squares; Kalman filter;
DOI
10.1137/S1052623494268522
CLC classification number
O29 [Applied Mathematics];
Subject classification number
070104 ;
Abstract
In this paper we propose and analyze nonlinear least squares methods that process the data incrementally, one data block at a time. Such methods are well suited to large data sets and real-time operation and have received much attention in the context of neural network training problems. We focus on the extended Kalman filter, which may be viewed as an incremental version of the Gauss-Newton method. We provide a nonstochastic analysis of its convergence properties, and we discuss variants aimed at accelerating its convergence.
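The abstract describes the extended Kalman filter as an incremental Gauss-Newton method for nonlinear least squares: each data block is linearized at the current estimate and folded into the running solution. A minimal sketch of that idea (not the paper's exact algorithm or notation; the exponential model `y = a*exp(b*t)`, the weight `r`, and the initial covariance `P0` are illustrative assumptions):

```python
import numpy as np

def ekf_least_squares(ts, ys, x0, P0, r=1e-2):
    """EKF-style incremental least squares for the illustrative model
    y = x[0] * exp(x[1] * t), processing one data point (block) at a time."""
    x, P = x0.copy(), P0.copy()
    for t, y in zip(ts, ys):
        pred = x[0] * np.exp(x[1] * t)                   # model prediction h(x)
        H = np.array([[np.exp(x[1] * t),                 # dh/dx[0]
                       x[0] * t * np.exp(x[1] * t)]])    # dh/dx[1]
        S = H @ P @ H.T + r                              # innovation covariance
        K = (P @ H.T) / S                                # Kalman gain (2x1)
        x = x + (K * (y - pred)).ravel()                 # incremental Gauss-Newton step
        P = P - K @ H @ P                                # covariance update
    return x, P

rng = np.random.default_rng(0)
true = np.array([2.0, -0.5])                             # assumed "ground truth"
ts = np.linspace(0.0, 4.0, 200)
ys = true[0] * np.exp(true[1] * ts) + 0.01 * rng.standard_normal(ts.size)
x_hat, _ = ekf_least_squares(ts, ys, x0=np.array([1.0, 0.0]), P0=np.eye(2))
```

A single pass over the data yields an estimate near the generating parameters; the paper's analysis concerns exactly when and how fast such single-pass incremental iterations converge.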
Pages: 807-822
Page count: 16
Cited references
34 records in total
[1]  
Anderson B. D. O., 1979, OPTIMAL FILTERING
[2]  
[Anonymous], 1960, 1960 IRE WESCON CONV
[3]   SUBOPTIMAL STATE ESTIMATION FOR CONTINUOUS-TIME NONLINEAR SYSTEMS FOR DISCRETE NOISY MEASUREMENTS [J].
ATHANS, M ;
WISHNER, RP ;
BERTOLINI, A .
IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 1968, AC-13 (5) :504+
[4]   THE ITERATED KALMAN SMOOTHER AS A GAUSS-NEWTON METHOD [J].
BELL, BM .
SIAM JOURNAL ON OPTIMIZATION, 1994, 4 (03) :626-636
[5]  
Bertsekas D., 1996, NEURO DYNAMIC PROGRAMMING, V1st
[6]  
Bertsekas D. P., 2019, Reinforcement learning and optimal control
[7]  
Bertsekas Dimitri P., 1989, PARALLEL DISTRIBUTED
[9]  
BERTSEKAS DP, 1995, NONLINEAR PROGRAMMING
[10]   NEW LEAST-SQUARE ALGORITHMS [J].
DAVIDON, WC .
JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1976, 18 (02) :187-197