A Contribution-Based Device Selection Scheme in Federated Learning

Shashi Raj Pandey*, Lam D. Nguyen, Petar Popovski

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review

4 Citations (Scopus)
58 Downloads (Pure)

Abstract

In a Federated Learning (FL) setup, a number of devices contribute to the training of a common model. We present a method for selecting the devices that provide updates in order to achieve improved generalization, fast convergence, and better device-level performance. We formulate a min-max optimization problem and decompose it into a primal-dual setup, where the duality gap is used to quantify the device-level performance. Our strategy combines exploration of data freshness through random device selection with exploitation through simplified estimates of device contributions. This improves the performance of the trained model in terms of both generalization and personalization. A modified Truncated Monte-Carlo (TMC) method is applied during the exploitation phase to estimate each device's contribution and lower the communication overhead. Experimental results show that the proposed approach achieves competitive generalization and personalization performance against the baseline schemes, with lower communication overhead.
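The selection strategy described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the utility function, the epsilon-style switch between exploration and exploitation, and the truncation tolerance are all illustrative assumptions; the TMC sampler below is a generic truncated Monte-Carlo (Shapley-style) estimator standing in for the paper's modified TMC method.

```python
import random

def tmc_contributions(devices, utility, n_perms=50, tol=1e-3, seed=0):
    """Truncated Monte-Carlo contribution estimate (illustrative sketch).

    Averages each device's marginal utility over random permutations,
    truncating a permutation once the remaining utility gain falls
    below `tol` -- a generic stand-in for the paper's modified TMC.
    """
    rng = random.Random(seed)
    contrib = {d: 0.0 for d in devices}
    full = utility(devices)                      # utility of the full device set
    for _ in range(n_perms):
        perm = devices[:]
        rng.shuffle(perm)
        prev, coalition = utility([]), []
        for d in perm:
            if abs(full - prev) < tol:           # truncation: remaining marginals ~ 0
                marginal = 0.0
            else:
                coalition.append(d)
                cur = utility(coalition)
                marginal, prev = cur - prev, cur
            contrib[d] += marginal / n_perms
    return contrib

def select_devices(devices, utility, k, eps=0.2, seed=0):
    """Exploration-exploitation device selection (illustrative sketch).

    With probability `eps`, sample k devices uniformly at random
    (exploration, favoring data freshness); otherwise pick the top-k
    devices by estimated TMC contribution (exploitation).
    """
    rng = random.Random(seed)
    if rng.random() < eps:
        return rng.sample(devices, k)
    contrib = tmc_contributions(devices, utility, seed=seed)
    return sorted(devices, key=lambda d: contrib[d], reverse=True)[:k]
```

For an additive toy utility (the value of a coalition is the sum of per-device qualities), the TMC estimate recovers each device's own quality, and exploitation selects the highest-quality devices; in practice the utility would be a validation metric of the aggregated model, which is where the communication savings from truncation matter.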

Original language: English
Journal: IEEE Communications Letters
Volume: 26
Issue number: 9
Pages (from-to): 2057-2061
Number of pages: 5
ISSN: 1089-7798
DOIs
Publication status: Published - 1 Sept 2022

Keywords

  • device selection
  • exploitation
  • exploration
  • federated learning
  • generalization
  • personalization
