Abstract
We consider the gradient method xₜ₊₁ = xₜ + γₜ(sₜ + wₜ), where sₜ is a descent direction of a function f : ℝⁿ → ℝ and wₜ is a deterministic or stochastic error. We assume that ∇f is Lipschitz continuous, that the stepsize γₜ diminishes to 0, and that sₜ and wₜ satisfy standard conditions. We show that either f(xₜ) → -∞ or f(xₜ) converges to a finite value and ∇f(xₜ) → 0 (with probability 1 in the stochastic case), and in doing so, we remove various boundedness conditions that are assumed in existing results, such as boundedness from below of f, boundedness of ∇f(xₜ), or boundedness of xₜ.
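The iteration in the abstract can be illustrated numerically. The sketch below is not from the paper: the objective f(x) = x², the stepsize schedule γₜ = 1/(t+1), and the noise model are all assumed for illustration, with sₜ = -∇f(xₜ) as the descent direction and an error wₜ that shrinks with γₜ (one of the "standard conditions" on the error).

```python
import random

# Illustrative sketch of the scheme x_{t+1} = x_t + gamma_t * (s_t + w_t).
# f(x) = x^2 is an assumed toy objective; the paper's result is far more general
# and does not require f bounded below or the iterates bounded.

def grad_f(x):
    return 2.0 * x  # gradient of f(x) = x^2 (Lipschitz continuous)

def noisy_gradient_method(x0, steps=10000, seed=0):
    rng = random.Random(seed)
    x = x0
    for t in range(steps):
        gamma = 1.0 / (t + 1)            # diminishing stepsize, sum gamma_t = infinity
        s = -grad_f(x)                   # descent direction s_t
        w = gamma * rng.uniform(-1, 1)   # error w_t shrinking with gamma_t (assumed model)
        x = x + gamma * (s + w)
    return x

x_final = noisy_gradient_method(x0=5.0)
print(abs(grad_f(x_final)))  # small: consistent with the conclusion that grad f(x_t) -> 0
```

Despite the persistent noise wₜ, the diminishing stepsize drives the gradient at the iterates toward zero, which is the behavior the theorem guarantees under its hypotheses.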
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 627-642 |
| Number of pages | 16 |
| Journal | SIAM Journal on Optimization |
| Volume | 10 |
| Issue number | 3 |
| DOIs | |
| State | Published - 2000 |
| Externally published | Yes |
Keywords
- Gradient convergence
- Gradient methods
- Incremental gradient methods
- Stochastic approximation
ASJC Scopus subject areas
- Software
- Theoretical Computer Science