An Efficient Federated Learning Framework for Training Semantic Communication System

Loc X. Nguyen, Huy Q. Le, Ye Lin Tun, Pyae Sone Aung, Yan Kyaw Tun, Zhu Han, Choong Seon Hong

Publication: Contribution to journal › Journal article › Research › peer review


Semantic communication has emerged as a pillar for the next generation of communication systems due to its capability to alleviate data redundancy. Most semantic communication systems are built upon advanced deep learning models whose training performance heavily relies on data availability. Existing studies often make the unrealistic assumption of a readily accessible data source, whereas in practice data is mainly created on the client side. Due to privacy and security concerns, the transmission of data is restricted, yet it is necessary for conventional centralized training schemes. To address this challenge, we explore semantic communication in a federated learning (FL) setting that utilizes client data without leaking privacy. Additionally, we design our system to tackle the communication overhead by reducing the quantity of information delivered in each global round. In this way, we can save significant bandwidth for resource-limited devices and reduce overall network traffic. Finally, we introduce a mechanism to aggregate the global model from clients, called FedLol. Extensive simulation results demonstrate the effectiveness of our proposed technique compared to baseline schemes.
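The abstract describes aggregating a global model from clients via a mechanism called FedLol, but does not spell out the weighting rule. A minimal sketch of one plausible loss-aware aggregation is given below, assuming clients with lower local loss contribute more to the global model; the function name `aggregate_loss_weighted` and the exact weighting formula are illustrative assumptions, not the paper's definition of FedLol.

```python
import numpy as np

def aggregate_loss_weighted(client_params, client_losses):
    """Hypothetical loss-aware aggregation (sketch only).

    Each client k is weighted by the share of total loss contributed
    by the *other* clients, so low-loss clients receive higher weight.
    The actual FedLol rule in the paper may differ.
    """
    losses = np.asarray(client_losses, dtype=float)
    total = losses.sum()
    n = len(losses)
    # (total - losses) sums to (n - 1) * total, so the weights sum to 1.
    weights = (total - losses) / ((n - 1) * total)
    # Weighted average of the clients' parameter vectors.
    return sum(w * p for w, p in zip(weights, client_params))
```

For example, with three clients whose local losses are 0.1, 0.2, and 0.3, the resulting weights are 5/12, 1/3, and 1/4, so the best-performing client dominates the average without excluding the others.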
Journal: IEEE Transactions on Vehicular Technology
Number of pages: 5
Status: Accepted/In press - 2024
