Federated Learning Advances

The field of federated learning is converging on three recurring challenges: communication overhead, data heterogeneity, and privacy preservation. Researchers are exploring strategies such as explanation-guided pruning, graph federated learning, and hierarchical federated learning to improve the efficiency and effectiveness of federated models. Noteworthy papers include FedX, which uses explanation-guided pruning to cut communication overhead in remote sensing applications, and GFed-PP, a graph federated learning approach that adapts to personalized privacy requirements while improving recommendation performance. Other notable works include HeteRo-Select, which maintains high performance and long-term training stability under extreme client heterogeneity, and SHeRL-FL, which combines representation learning with split learning and hierarchical aggregation to reduce communication overhead.
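
To make the communication-savings idea concrete, the sketch below shows a single FedAvg-style round in which each client sparsifies its update before uploading it. This is a minimal illustration, not the FedX algorithm itself: FedX ranks parameters by explanation relevance, whereas the stand-in criterion here is plain magnitude, and the helper names (`prune_update`, `federated_round`, `local_step`) are hypothetical.

```python
import numpy as np

def prune_update(update, keep_ratio=0.1):
    """Zero out all but the largest-magnitude entries of a client update.

    Magnitude is a stand-in importance score; an explanation-guided method
    would rank parameters by explanation relevance instead (assumption).
    """
    flat = np.abs(update).ravel()
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(flat, -k)[-k]          # k-th largest magnitude
    return np.where(np.abs(update) >= threshold, update, 0.0)

def federated_round(global_weights, client_data, local_step, keep_ratio=0.1):
    """One FedAvg round where each client uploads only a pruned (sparse) update."""
    deltas = []
    for data in client_data:
        local_w = local_step(global_weights.copy(), data)              # local training
        deltas.append(prune_update(local_w - global_weights, keep_ratio))
    return global_weights + np.mean(deltas, axis=0)                    # server-side averaging

# Toy usage: five least-squares clients, one gradient step per round.
def local_step(w, data, lr=0.1):
    X, y = data
    return w - lr * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
w = np.zeros(20)
clients = [(rng.normal(size=(50, 20)), rng.normal(size=50)) for _ in range(5)]
for _ in range(10):
    w = federated_round(w, clients, local_step, keep_ratio=0.2)
```

Because only a small fraction of each update survives pruning, clients transmit sparse updates, which is where the communication reduction comes from; the server-side aggregation step itself is unchanged.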

Sources

FedX: Explanation-Guided Pruning for Communication-Efficient Federated Learning in Remote Sensing

Graph Federated Learning for Personalized Privacy Recommendation

Stabilizing Federated Learning under Extreme Heterogeneity with HeteRo-Select

Federated Learning for Epileptic Seizure Prediction Across Heterogeneous EEG Datasets

SHeRL-FL: When Representation Learning Meets Split Learning in Hierarchical Federated Learning

Biased Local SGD for Efficient Deep Learning on Heterogeneous Systems

SHEFL: Resource-Aware Aggregation and Sparsification in Heterogeneous Ensemble Federated Learning

Long-Term Client Selection for Federated Learning with Non-IID Data: A Truthful Auction Approach

Proxy Model-Guided Reinforcement Learning for Client Selection in Federated Recommendation
