Incremental stochastic subgradient algorithms for convex optimization

S. Sundhar Ram, A. Nedić, V. V. Veeravalli

Research output: Contribution to journal › Article › peer-review

175 Scopus citations

Abstract

This paper studies the effect of stochastic errors on two constrained incremental subgradient algorithms. The incremental subgradient algorithms are viewed as decentralized network optimization algorithms applied to minimize a sum of functions, when each component function is known only to a particular agent of a distributed network. First, the standard cyclic incremental subgradient algorithm is studied. In this algorithm, the agents form a ring structure and pass the iterate around the cycle. When there are stochastic errors in the subgradient evaluations, sufficient conditions on the moments of the stochastic errors are obtained that guarantee almost sure convergence when a diminishing step-size is used. In addition, almost sure bounds on the algorithm's performance with a constant step-size are obtained. Next, the Markov randomized incremental subgradient method is studied. This is a noncyclic version of the incremental algorithm in which the sequence of computing agents is modeled as a time-nonhomogeneous Markov chain. Such a model is appropriate for mobile networks, where the network topology changes over time. Convergence results and error bounds for the Markov randomized method in the presence of stochastic errors are obtained for both diminishing and constant step-sizes.
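To make the cyclic scheme concrete, the following is a minimal Python sketch of a cyclic incremental subgradient iteration with additive stochastic errors and projection onto a constraint set. It is not the paper's notation or code: the function names, the Gaussian noise model, and the toy L1 example below are illustrative assumptions only.

```python
import numpy as np

def cyclic_incremental_subgradient(subgrads, project, x0, step,
                                   num_cycles, noise_std=0.0, seed=0):
    """Sketch of a cyclic incremental subgradient method.

    The iterate is passed around the ring of agents; each agent takes a
    (possibly noisy) subgradient step on its own component function and
    the result is projected back onto the constraint set.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = len(subgrads)
    for k in range(num_cycles):
        alpha = step(k)                        # diminishing or constant step-size
        for i in range(m):                     # one pass around the ring
            g = subgrads[i](x)                 # agent i's subgradient of f_i at x
            noise = noise_std * rng.standard_normal(x.shape)  # stochastic error (assumed Gaussian here)
            x = project(x - alpha * (g + noise))
    return x

# Toy usage (hypothetical problem): minimize sum_i ||x - c_i||_1 over the box [-1, 1]^2.
if __name__ == "__main__":
    centers = [np.array([0.5, 0.2]), np.array([-0.3, 0.4]), np.array([0.1, -0.6])]
    subgrads = [lambda x, c=c: np.sign(x - c) for c in centers]   # subgradient of ||x - c||_1
    project = lambda x: np.clip(x, -1.0, 1.0)                     # projection onto the box
    x_final = cyclic_incremental_subgradient(
        subgrads, project, x0=np.zeros(2),
        step=lambda k: 1.0 / (k + 1),          # diminishing step-size
        num_cycles=200, noise_std=0.1)
    print(x_final)
```

In the Markov randomized variant studied in the paper, the inner loop over agents would instead select the next computing agent according to a (time-nonhomogeneous) Markov chain rather than a fixed ring order.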

Original language: English (US)
Pages (from-to): 691-717
Number of pages: 27
Journal: SIAM Journal on Optimization
Volume: 20
Issue number: 2
DOIs
State: Published - 2009
Externally published: Yes

Keywords

  • Convex optimization
  • Incremental optimization
  • Network optimization
  • Random networks
  • Stochastic approximation
  • Subgradient

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Applied Mathematics

