Delay-Aware Hierarchical Federated Learning

Frank Po Chen Lin, Seyyedali Hosseinalipour, Nicolo Michelusi, Christopher G. Brinton

Research output: Contribution to journal › Article › peer-review

Abstract

Federated learning has gained popularity as a means of training models distributed across the wireless edge. This paper introduces delay-aware hierarchical federated learning (DFL), which improves the efficiency of distributed machine learning (ML) model training by accounting for communication delays between the edge and the cloud. Unlike traditional federated learning, DFL performs multiple stochastic gradient descent iterations on local datasets within each global aggregation period and intermittently aggregates model parameters through edge servers in local subnetworks. During global synchronization, the cloud server consolidates the local models with the outdated global model using a local-global combiner, preserving crucial elements of both and thereby enhancing learning efficiency in the presence of delay. A set of conditions is obtained under which DFL achieves a sub-linear convergence rate of O(1/k) for strongly convex and smooth loss functions. Based on these findings, an adaptive control algorithm is developed for DFL that implements policies to mitigate energy consumption and communication latency while targeting sublinear convergence. Numerical evaluations show DFL's superior performance in terms of faster global model convergence, reduced resource consumption, and robustness against communication delays compared to existing FL algorithms. In summary, the proposed method offers improved efficiency and results for both convex and nonconvex loss functions.
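
To make the hierarchical update pattern described in the abstract concrete, below is a minimal sketch of the training loop it implies: local SGD within each aggregation period, intermittent averaging at edge servers, and a cloud-side local-global combiner that mixes the freshly aggregated edge models with the outdated global model. The combiner weight `gamma`, the toy quadratic losses, and all network sizes are illustrative assumptions, not the paper's actual formulation or parameter choices.

```python
# Hypothetical sketch of a DFL-style update loop (assumed details, not the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)
dim, n_edges, devices_per_edge = 5, 3, 4
step, gamma = 0.05, 0.7          # gamma: weight on fresh edge models vs. the stale global model
local_steps, edge_rounds = 5, 2  # SGD steps per edge round, edge rounds per global period

# Toy strongly convex objective per device: f_i(w) = 0.5 * ||w - target_i||^2
targets = [[rng.normal(size=dim) for _ in range(devices_per_edge)] for _ in range(n_edges)]

def local_sgd(w, target):
    """Run a few gradient steps on one device's local quadratic loss."""
    for _ in range(local_steps):
        w = w - step * (w - target)  # gradient of 0.5 * ||w - target||^2
    return w

global_model = np.zeros(dim)
for global_round in range(20):
    edge_models = []
    for e in range(n_edges):
        edge_model = global_model.copy()
        for _ in range(edge_rounds):
            # Devices in this subnetwork train locally; the edge server averages them.
            locals_ = [local_sgd(edge_model.copy(), t) for t in targets[e]]
            edge_model = np.mean(locals_, axis=0)
        edge_models.append(edge_model)
    # Cloud: combine the (possibly delayed) edge models with the outdated global model.
    fresh = np.mean(edge_models, axis=0)
    global_model = gamma * fresh + (1.0 - gamma) * global_model

print("final global model:", np.round(global_model, 3))
```

Setting `gamma = 1` in this sketch recovers plain hierarchical averaging, while smaller values retain more of the stale global model, which is the mechanism the abstract credits with robustness to communication delay.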

Original language: English (US)
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Transactions on Cognitive Communications and Networking
DOIs
State: Accepted/In press - 2023

Keywords

  • Adaptation models
  • Convergence
  • Data models
  • Delays
  • Federated learning
  • Servers
  • Training
  • convergence analysis
  • edge intelligence
  • hierarchical architecture
  • network optimization

ASJC Scopus subject areas

  • Hardware and Architecture
  • Computer Networks and Communications
  • Artificial Intelligence
