Tailoring Gradient Methods for Differentially Private Distributed Optimization

Yongqiang Wang, Angelia Nedic

Research output: Contribution to journal › Article › peer-review


Abstract

Decentralized optimization is gaining increased traction due to its widespread applications in large-scale machine learning and multiagent systems. However, the same mechanism that enables its success, i.e., information sharing among participating agents, also leads to the disclosure of individual agents' private information, which is unacceptable when sensitive data are involved. As differential privacy is becoming a de facto standard for privacy preservation, results have recently emerged that integrate differential privacy with distributed optimization. However, directly incorporating differential-privacy design into existing distributed optimization approaches significantly compromises optimization accuracy. In this article, we propose to redesign and tailor gradient methods for differentially private distributed optimization, and we propose two differential-privacy-oriented gradient methods that can ensure both rigorous $\epsilon$-differential privacy and optimality. The first algorithm builds on static-consensus-based gradient methods; the second builds on dynamic-consensus (gradient-tracking) based distributed optimization methods and is, hence, applicable to general directed interaction graph topologies. Both algorithms can simultaneously ensure almost sure convergence to an optimal solution and a finite privacy budget, even when the number of iterations goes to infinity. To the best of the authors' knowledge, this is the first time that both goals are achieved simultaneously. Numerical simulations on a distributed estimation problem and experimental results on a benchmark dataset confirm the effectiveness of the proposed approaches.
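The abstract does not spell out the update rules, so the following is only a minimal sketch of the general recipe such differential-privacy-oriented gradient methods follow: inject Laplace noise into every transmitted state while decaying the inter-agent coupling and the stepsize over time, so that the cumulative privacy loss remains finite without destroying convergence. The ring topology, loss functions, and all parameter schedules below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Minimal sketch (not the paper's exact algorithm): differentially private
# decentralized gradient descent on a ring of N agents, each holding a
# private quadratic loss f_i(x) = 0.5 * (x - a_i)^2. All schedules and
# noise parameters are illustrative assumptions.

rng = np.random.default_rng(0)
N = 5
a = rng.normal(size=N)   # private local data; the optimum of sum_i f_i is mean(a)
x = np.zeros(N)          # each agent's local estimate

for k in range(1, 5001):
    gamma = 1.0 / k**0.9                             # decaying inter-agent coupling
    lam = 1.0 / k                                    # decaying stepsize
    noise = rng.laplace(scale=0.1 * k**0.3, size=N)  # DP noise on shared states
    x_shared = x + noise                             # only noisy states are transmitted
    # consensus step on noisy neighbor states (ring graph), then local gradient step
    neighbor_avg = 0.5 * (np.roll(x_shared, 1) + np.roll(x_shared, -1))
    x = x + gamma * (neighbor_avg - x) - lam * (x - a)

print("estimates:", np.round(x, 3), " true optimum:", round(a.mean(), 3))
```

Note the design tension the sketch illustrates: the coupling weight gamma multiplies the injected noise, so letting gamma decay faster than the noise scale grows keeps the accumulated perturbation summable in mean square, while the diminishing stepsize still drives the local gradient terms toward the optimum.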

Original language: English (US)
Pages (from-to): 872-887
Number of pages: 16
Journal: IEEE Transactions on Automatic Control
Volume: 69
Issue number: 2
DOIs
State: Published - Feb 1 2024

Keywords

  • Decentralized learning
  • decentralized optimization
  • differential privacy
  • gradient methods

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Computer Science Applications
