Decentralized Gradient Methods With Time-Varying Uncoordinated Stepsizes: Convergence Analysis and Privacy Design

Yongqiang Wang, Angelia Nedic

Research output: Contribution to journal › Article › peer-review

Abstract

Decentralized optimization enables a network of agents to cooperatively optimize an overall objective function without a central coordinator and is gaining increased attention in domains as diverse as control, sensor networks, data mining, and robotics. However, the information sharing among agents in decentralized optimization also discloses agents' information, which is undesirable or even unacceptable when the involved data are sensitive. This paper proposes two gradient-based decentralized optimization algorithms that can protect participating agents' privacy without compromising optimization accuracy or incurring heavy communication/computational overhead. Both algorithms leverage a judiciously designed mixing matrix and time-varying uncoordinated stepsizes to enable privacy, one using diminishing stepsizes and the other non-diminishing stepsizes. In both algorithms, when interacting with any one of its neighbors, a participating agent only needs to share one message in each iteration, in contrast to most gradient-tracking-based algorithms, which require every agent to share two messages (an optimization variable and a gradient-tracking variable) under non-diminishing stepsizes. Furthermore, both algorithms can guarantee the privacy of a participating agent even when all information shared by the agent is accessible to an adversary, a scenario in which most existing accuracy-maintaining privacy approaches fail to protect privacy. Simulation results confirm the effectiveness of the proposed algorithms.
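
For orientation, the sketch below illustrates the generic setting the abstract refers to: each agent mixes its neighbors' variables through a doubly stochastic matrix and then takes a local gradient step with its own time-varying stepsize. The ring network, quadratic objectives, and stepsize schedule are illustrative assumptions, and this is not the paper's privacy-preserving algorithm: it contains no privacy mechanism, and plain decentralized gradient descent with uncoordinated stepsizes does not by itself guarantee the exact accuracy the paper establishes.

```python
import numpy as np

# Illustrative setup (not from the paper): 4 agents on a ring, each holding a
# private quadratic objective f_i(x) = 0.5 * (x - b_i)^2. The minimizer of
# sum_i f_i is the mean of the b_i, used only as a reference point below.
rng = np.random.default_rng(0)
n_agents = 4
b = rng.normal(size=n_agents)   # each agent's private data
x = np.zeros(n_agents)          # each agent's local estimate

# Doubly stochastic mixing matrix for a ring graph (Metropolis-style weights).
W = np.array([
    [0.50, 0.25, 0.00, 0.25],
    [0.25, 0.50, 0.25, 0.00],
    [0.00, 0.25, 0.50, 0.25],
    [0.25, 0.00, 0.25, 0.50],
])

for k in range(500):
    # Time-varying, uncoordinated stepsizes: a common diminishing sequence
    # scaled by a per-agent factor (an arbitrary illustrative choice).
    alpha = (1.0 / (k + 10)) * (0.5 + rng.random(n_agents))
    grad = x - b                 # gradient of each agent's local quadratic
    # Consensus on neighbors' variables, then a local gradient step.
    x = W @ x - alpha * grad

# With uncoordinated stepsizes the limit is in general only close to the
# unweighted optimum; the paper's algorithms are designed to avoid this bias.
print("local estimates:", x)
print("reference (mean of b):", b.mean())
```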

Original language: English (US)
Pages (from-to): 1-16
Number of pages: 16
Journal: IEEE Transactions on Automatic Control
State: Accepted/In press - 2023
Externally published: Yes

Keywords

  • Convergence
  • Gradient methods
  • Linear programming
  • Machine learning
  • Optimization
  • Privacy
  • Training data

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering
