Push-Pull Gradient Methods for Distributed Optimization in Networks

Shi Pu, Wei Shi, Jinming Xu, Angelia Nedic

Research output: Contribution to journal › Article › peer-review



In this article, we focus on solving a distributed convex optimization problem in a network, where each agent has its own convex cost function and the goal is to minimize the sum of the agents' cost functions while obeying the network connectivity structure. In order to minimize the sum of the cost functions, we consider new distributed gradient-based methods where each node maintains two estimates, namely an estimate of the optimal decision variable and an estimate of the gradient for the average of the agents' objective functions. From the viewpoint of an agent, the information about the gradients is pushed to the neighbors, whereas the information about the decision variable is pulled from the neighbors, hence giving the name 'push-pull gradient methods.' The methods utilize two different graphs for the information exchange among agents and, as such, unify algorithms with different types of distributed architecture, including decentralized (peer-to-peer), centralized (master-slave), and semicentralized (leader-follower) architectures. We show that the proposed algorithms and their many variants converge linearly for strongly convex and smooth objective functions over a network (possibly with unidirectional data links) in both synchronous and asynchronous random-gossip settings. In particular, push-pull is the first class of random-gossip algorithms for distributed optimization over directed graphs. Moreover, we numerically evaluate our proposed algorithms in both scenarios, and show that they outperform other existing linearly convergent schemes, especially for ill-conditioned problems and networks that are not well balanced.
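The update described in the abstract can be sketched numerically. The following is a minimal, illustrative NumPy sketch (not the authors' reference implementation): each agent holds a decision estimate x_i and a gradient-tracking estimate y_i; decision variables are mixed through a row-stochastic matrix R built on the "pull" graph, while gradient information is mixed through a column-stochastic matrix C built on the "push" graph. The scalar quadratic costs f_i(x) = (x - b_i)^2 / 2, the directed-ring topology with one extra link (so the graph is not balanced), and the step size are all assumptions chosen for the example.

```python
import numpy as np

n = 5
rng = np.random.default_rng(0)
b = rng.normal(size=n)             # agent i's target; the global optimum is b.mean()

# f_i(x) = 0.5 * (x - b_i)^2, so the local gradients are simply x_i - b_i.
grad = lambda x: x - b

# Directed ring with self-loops: agent i receives from agent (i-1) mod n,
# plus one extra directed link so the graph is deliberately unbalanced.
A = np.eye(n) + np.roll(np.eye(n), 1, axis=0)
A[0, 2] = 1.0
R = A / A.sum(axis=1, keepdims=True)   # row-stochastic: "pull" decision variables
C = A / A.sum(axis=0, keepdims=True)   # column-stochastic: "push" gradients

x = np.zeros(n)        # decision-variable estimates
y = grad(x)            # gradient-tracking estimates, initialized to local gradients
alpha = 0.05           # constant step size (assumed small enough for convergence)

for _ in range(3000):
    x_new = R @ (x - alpha * y)              # pull step on the decision variables
    y = C @ y + grad(x_new) - grad(x)        # push step: track the average gradient
    x = x_new

print("max error:", np.abs(x - b.mean()).max())
```

Because C is column-stochastic, the sum of the y_i exactly tracks the sum of the local gradients at every iteration, which is what drives all agents to the common minimizer b.mean().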

Original language: English (US)
Article number: 8988200
Pages (from-to): 1-16
Number of pages: 16
Journal: IEEE Transactions on Automatic Control
Issue number: 1
State: Published - Jan 2021


Keywords

  • Convex optimization
  • directed graph
  • distributed optimization
  • linear convergence
  • network structure
  • random-gossip algorithm
  • spanning tree

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering


