Federated Learning Advancements

Federated learning research is converging on three persistent challenges: catastrophic forgetting, imbalanced covariate shift, and fairness. Knowledge distillation is the common thread among the proposed remedies, appearing as discrepancy-aware multi-teacher distillation, asynchronous distillation, and feature distillation, each aimed at stabilizing training over heterogeneous client data distributions. Among the notable papers, SFedKD addresses catastrophic forgetting in sequential federated learning with discrepancy-aware multi-teacher knowledge distillation, while FedAKD introduces an asynchronous knowledge distillation strategy that balances accurate prediction with collaborative fairness under imbalanced covariate shift. FedFD argues that feature distillation is the better choice for model-heterogeneous federated learning, FedGSCA handles medical federated learning under label noise with a global sample selector and a client-adaptive adjuster, and FedGA improves fairness in horizontal federated settings with a framework based on the Gini coefficient.
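To make the multi-teacher idea concrete, the sketch below shows one plausible way to weight several teachers when distilling into a student. The weighting heuristic (per-batch teacher accuracy as a stand-in for a discrepancy score) and all function names are illustrative assumptions, not SFedKD's actual algorithm.

```python
# Minimal sketch of discrepancy-aware multi-teacher knowledge distillation.
# The discrepancy proxy and hyperparameters here are assumptions for
# illustration, not the method from the SFedKD paper.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          temperature=2.0, alpha=0.5):
    """Combine cross-entropy on hard labels with a KL term against
    several teachers, weighting each teacher by how well it agrees
    with the labels on this batch."""
    ce = F.cross_entropy(student_logits, labels)

    # Score each teacher by its batch accuracy (assumed proxy for a
    # discrepancy score), then softmax the scores into mixing weights.
    scores = torch.stack([
        (t.argmax(dim=1) == labels).float().mean()
        for t in teacher_logits_list
    ])
    weights = torch.softmax(scores / 0.1, dim=0)

    # Temperature-scaled KL divergence against each teacher's softened
    # output, mixed according to the weights above.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = sum(
        w * F.kl_div(log_p_student,
                     F.softmax(t / temperature, dim=1),
                     reduction="batchmean") * temperature ** 2
        for w, t in zip(weights, teacher_logits_list)
    )
    return alpha * ce + (1 - alpha) * kd
```

The same skeleton adapts to feature distillation (as in FedFD) by replacing the logit KL term with a distance between intermediate representations.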

Sources

SFedKD: Sequential Federated Learning with Discrepancy-Aware Multi-Teacher Knowledge Distillation

Towards Collaborative Fairness in Federated Learning Under Imbalanced Covariate Shift

Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning

FedGSCA: Medical Federated Learning with Global Sample Selector and Client Adaptive Adjuster under Label Noise

FedGA: A Fair Federated Learning Framework Based on the Gini Coefficient
