TY - JOUR
T1 - Client-Level Fault-Tolerant Federated Semi-Supervised Learning for Unlabeled Clients in Internet of Vehicles
AU - Xiong, Meng
AU - Tian, Shengwei
AU - Guan, Peiyuan
AU - Pei, Xinjun
AU - Li, Yushuai
AU - Taherkordi, Amir
PY - 2025
Y1 - 2025
N2 - Federated learning (FL) has emerged as a promising solution to address the inherent limitations of centralized data management in distributed systems, such as the Internet of Vehicles (IoV). High-quality labels are crucial for developing safety-critical IoV models. However, the high cost and effort involved in labeling distributed vehicle data make mainstream FL approaches impractical in most IoV applications. While federated semi-supervised learning can offer strong supervision to establish safety-critical IoV models, pseudo-labels and noisy data can introduce errors into the training of local models, potentially leading to unreliable models, which we refer to as “damaged local models”. To address this issue, it is crucial to develop fault-tolerant mechanisms that can identify and exclude damaged local models from the aggregation process. This paper proposes a novel Client-Level Fault-Tolerant Federated Semi-Supervised Learning (CFT-FSSL) framework to ensure that global aggregation is based on the majority of reliable local models. The framework leverages knowledge transfer-based semi-supervised learning to integrate the knowledge extracted from locally private and unlabeled data into the FL model, thereby improving the model's ability to generalize when data is scarce. To address the challenge of data heterogeneity, we introduce a client-level reweighting approach that strategically enhances the contributions of individual clients based on the volume and variance of their local datasets. A fault-tolerant method then evaluates the distinctive contribution of each client and strategically aggregates only the most reliable local model updates, thus minimizing the risk associated with model bias. Extensive experiments demonstrate that the proposed CFT-FSSL framework outperforms other state-of-the-art semi-supervised FL methods and can significantly accelerate model convergence.
AB - Federated learning (FL) has emerged as a promising solution to address the inherent limitations of centralized data management in distributed systems, such as the Internet of Vehicles (IoV). High-quality labels are crucial for developing safety-critical IoV models. However, the high cost and effort involved in labeling distributed vehicle data make mainstream FL approaches impractical in most IoV applications. While federated semi-supervised learning can offer strong supervision to establish safety-critical IoV models, pseudo-labels and noisy data can introduce errors into the training of local models, potentially leading to unreliable models, which we refer to as “damaged local models”. To address this issue, it is crucial to develop fault-tolerant mechanisms that can identify and exclude damaged local models from the aggregation process. This paper proposes a novel Client-Level Fault-Tolerant Federated Semi-Supervised Learning (CFT-FSSL) framework to ensure that global aggregation is based on the majority of reliable local models. The framework leverages knowledge transfer-based semi-supervised learning to integrate the knowledge extracted from locally private and unlabeled data into the FL model, thereby improving the model's ability to generalize when data is scarce. To address the challenge of data heterogeneity, we introduce a client-level reweighting approach that strategically enhances the contributions of individual clients based on the volume and variance of their local datasets. A fault-tolerant method then evaluates the distinctive contribution of each client and strategically aggregates only the most reliable local model updates, thus minimizing the risk associated with model bias. Extensive experiments demonstrate that the proposed CFT-FSSL framework outperforms other state-of-the-art semi-supervised FL methods and can significantly accelerate model convergence.
KW - Federated Learning
KW - Internet of Vehicles
KW - Privacy-Preserving
KW - Semi-Supervised Learning
UR - http://www.scopus.com/inward/record.url?scp=105006923027&partnerID=8YFLogxK
U2 - 10.1109/tnse.2025.3574118
DO - 10.1109/tnse.2025.3574118
M3 - Journal article
SN - 2327-4697
VL - 12
SP - 4657
EP - 4670
JO - IEEE Transactions on Network Science and Engineering
JF - IEEE Transactions on Network Science and Engineering
IS - 6
ER -