
We introduce FedSiKD, which incorporates knowledge distillation (KD) within a similarity-based federated learning framework.

Primary language: Python

FedSiKD: Clients Similarity and Knowledge Distillation: Addressing Non-i.i.d. and Constraints in Federated Learning

FedSiKD incorporates knowledge distillation (KD) within a similarity-based federated learning framework. As clients join the system, they securely share relevant statistics about their data distributions, promoting intra-cluster homogeneity. This homogeneity improves optimization efficiency and accelerates learning, while transferring knowledge between teacher and student models addresses device constraints.
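The two ideas above — grouping clients by the similarity of their shared data-distribution statistics, and a teacher-student distillation objective — can be sketched as follows. This is a minimal illustration, not the repository's actual implementation: the clustering here is a plain k-means over normalized per-client label histograms, and the loss is the standard temperature-scaled KD objective; the function names (`cluster_clients`, `distillation_loss`) and all parameters are hypothetical.

```python
import numpy as np

def cluster_clients(label_hists, n_clusters=2, n_iters=20, seed=0):
    """Group clients by similarity of their normalized label histograms.

    A simple k-means stand-in for similarity-based clustering: each client
    shares only a summary statistic (its label histogram), not raw data.
    """
    rng = np.random.default_rng(seed)
    X = label_hists / label_hists.sum(axis=1, keepdims=True)  # normalize
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iters):
        # Assign each client to its nearest cluster center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Recompute centers from current assignments.
        for k in range(n_clusters):
            if np.any(assign == k):
                centers[k] = X[assign == k].mean(axis=0)
    return assign

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Standard Hinton-style KD loss (sketch): KL divergence between
    temperature-softened teacher and student distributions, scaled by T^2."""
    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=1).mean() * T * T)
```

Clients with skewed label distributions end up in the same cluster as similarly skewed peers, so a teacher trained within a cluster sees near-homogeneous data; the student on a constrained device then matches the teacher's softened outputs via `distillation_loss`.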