Abstract
In this paper we propose and analyze nonlinear least squares methods that process the data incrementally, one data block at a time. Such methods are well suited to large data sets and real-time operation and have received much attention in the context of neural network training problems. We focus on the extended Kalman filter, which may be viewed as an incremental version of the Gauss-Newton method. We provide a nonstochastic analysis of its convergence properties, and we discuss variants aimed at accelerating its convergence.
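The abstract's view of the extended Kalman filter as an incremental Gauss-Newton method can be sketched roughly as follows. This is an illustrative sketch, not the paper's algorithm: the function name `ekf_least_squares`, the `(residual, jacobian)` block interface, and the small initialization constant `delta` are all assumptions for this example. Each data block contributes a residual; the method linearizes one block at a time and folds it into a running Gauss-Newton Hessian approximation.

```python
import numpy as np

def ekf_least_squares(blocks, x0, delta=1e-3):
    """Incremental Gauss-Newton (extended-Kalman-filter form) for
    nonlinear least squares: minimize sum_i ||g_i(x)||^2, visiting
    one data block at a time.  `blocks` yields (g, jac) pairs, where
    g(x) is the block residual and jac(x) its Jacobian.  Names and
    interface are illustrative, not taken from the paper."""
    x = np.asarray(x0, dtype=float)
    H = delta * np.eye(x.size)     # information matrix (approx. Hessian)
    for g, jac in blocks:
        J = jac(x)                 # linearize this block at the current estimate
        H = H + J.T @ J            # rank update of the Gauss-Newton Hessian
        x = x - np.linalg.solve(H, J.T @ g(x))   # one incremental step
    return x

# Example: fit y = a*t + b from 50 noiseless samples, one sample per block.
t = np.linspace(0.0, 2.0, 50)
y = 2.0 * t - 0.5
blocks = [
    (lambda x, tk=tk, yk=yk: np.array([x[0] * tk + x[1] - yk]),
     lambda x, tk=tk: np.array([[tk, 1.0]]))
    for tk, yk in zip(t, y)
]
x_hat = ekf_least_squares(blocks, x0=np.zeros(2))   # close to [2.0, -0.5]
```

For a linear model the recursion reduces to recursive least squares, so a single pass essentially recovers the exact solution; nonlinear blocks are handled the same way through their Jacobians, which is the incremental Gauss-Newton interpretation the abstract refers to.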
Original language | English (US)
---|---
Pages (from-to) | 807-822
Number of pages | 16
Journal | SIAM Journal on Optimization
Volume | 6
Issue number | 3
DOIs |
State | Published - Aug 1996
Externally published | Yes
Keywords
- Kalman filter
- Least squares
- Optimization
ASJC Scopus subject areas
- Software
- Theoretical Computer Science