Abstract
We consider the minimization of a sum ∑_{i=1}^{m} f_i(x) consisting of a large number of convex component functions f_i. For this problem, incremental methods consisting of gradient or subgradient iterations applied to single components have proved very effective. We propose new incremental methods, consisting of proximal iterations applied to single components, as well as combinations of gradient, subgradient, and proximal iterations. We provide a convergence and rate of convergence analysis of a variety of such methods, including some that involve randomization in the selection of components. We also discuss applications in a few contexts, including signal processing and inference/machine learning.
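To illustrate the class of methods the abstract describes, here is a minimal sketch of an incremental proximal iteration with randomized component selection, applied to the special case f_i(x) = ½(aᵢᵀx − bᵢ)². For these quadratic components the proximal step has a closed form, so no inner solver is needed. All names (`incremental_proximal`, the step size `alpha`, the epoch count) are illustrative choices, not notation from the paper, and the constant step size is one simple option among those the paper analyzes.

```python
import numpy as np

def incremental_proximal(A, b, alpha=0.5, epochs=200, seed=0):
    """Incremental proximal method for min_x sum_i 0.5*(a_i^T x - b_i)^2.

    Each iteration applies the proximal operator of a SINGLE component:
        x_{k+1} = argmin_x 0.5*(a_i^T x - b_i)^2 + (1/(2*alpha))*||x - x_k||^2.
    For a quadratic component this minimizer is available in closed form
    (via the Sherman-Morrison identity), used below. Components are visited
    in a fresh random order each epoch (randomized selection).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(m):
            a = A[i]
            residual = a @ x - b[i]
            # Closed-form proximal step for the i-th component only
            x = x - alpha * residual / (1.0 + alpha * (a @ a)) * a
    return x

# Usage: a consistent overdetermined least-squares instance
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
x_hat = incremental_proximal(A, b, alpha=0.5, epochs=200)
```

Because the proximal step is an implicit (backward) update, it tolerates larger step sizes than an explicit incremental gradient step on the same component; on this consistent system the iterates converge to the exact solution.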
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 163-195 |
| Number of pages | 33 |
| Journal | Mathematical Programming |
| Volume | 129 |
| Issue number | 2 |
| DOIs | |
| State | Published - Oct 2011 |
| Externally published | Yes |
Keywords
- Convex
- Gradient method
- Incremental method
- Proximal algorithm
ASJC Scopus subject areas
- Software
- Mathematics (all)