PERSONALIZED FEDERATED LEARNING WITH ADAPTIVE CLIENT SELECTION AND CONCEPT DRIFT HANDLING FOR DYNAMIC DISTRIBUTED ENVIRONMENTS

Main Article Content

Senthil Kumar P
Nithya R
Ancy R
Sowmya R

Abstract

Federated Learning (FL) has emerged as a promising paradigm for training machine learning models across distributed clients while preserving data privacy. However, traditional FL frameworks assume homogeneous data distributions and stable environments, assumptions that rarely hold in real-world applications. Clients often exhibit heterogeneous data distributions, varying computational capabilities, and dynamic data patterns over time, leading to degraded model performance. Additionally, concept drift, in which data distributions evolve over time, poses a significant challenge for maintaining model accuracy in non-stationary environments. To address these limitations, this paper proposes a Personalized Federated Learning framework with Adaptive Client Selection and Drift Handling (PFL-ACSD). The proposed approach integrates dynamic client selection based on resource availability, data quality, and contribution significance, along with a drift detection mechanism to identify changes in local data distributions. Personalized local models are maintained for each client while a global model is adaptively updated using weighted aggregation. A drift-aware retraining strategy ensures continuous model adaptation in evolving environments. Experimental evaluation demonstrates that the proposed framework improves model accuracy, reduces communication overhead, and enhances robustness against data heterogeneity compared to conventional FL approaches. The results highlight the effectiveness of combining personalization, adaptive participation, and drift awareness in federated learning systems. The proposed framework is suitable for applications such as healthcare, mobile intelligence, and IoT systems, where privacy, adaptability, and scalability are critical.
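The abstract names three mechanisms: a composite client-selection score (resource availability, data quality, contribution), a drift detector on local data, and weighted aggregation into the global model. The sketch below illustrates one plausible realization of each; the equal-weight scoring, the relative-loss drift test, and the function names (`select_clients`, `detect_drift`, `aggregate`) are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def select_clients(scores, k):
    """Pick the k clients with the highest composite score.

    scores: dict mapping client_id -> (resource, quality, contribution),
    each component assumed to be normalized to [0, 1].
    """
    # Illustrative equal weighting of the three criteria; the paper's
    # actual scoring function is not specified in the abstract.
    composite = {c: sum(s) / 3.0 for c, s in scores.items()}
    return sorted(composite, key=composite.get, reverse=True)[:k]

def detect_drift(old_loss, new_loss, threshold=0.2):
    """Flag concept drift when validation loss rises by more than
    `threshold` (relative) -- a simple proxy for a distribution-change test."""
    return new_loss > old_loss * (1.0 + threshold)

def aggregate(client_weights, client_sizes):
    """FedAvg-style weighted aggregation: each client's parameter vector
    contributes in proportion to its local sample count."""
    total = sum(client_sizes)
    stacked = np.stack([w * (n / total)
                        for w, n in zip(client_weights, client_sizes)])
    return stacked.sum(axis=0)
```

In a full round, the server would call `select_clients`, collect updates only from clients whose `detect_drift` check passes (retraining the others on fresh data first), and then fold the surviving updates into the global model with `aggregate`, while each client keeps its own personalized head locally.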

Article Details

Section
Articles