Federated Learning Advances

The field of federated learning is moving toward more efficient and privacy-preserving methods. Researchers are exploring new techniques to reduce communication overhead, improve model accuracy, and strengthen the security of federated learning (FL) systems. Notably, the development of adaptive methods, such as those using gradient-difference-based error modeling and second-order optimization, is gaining attention for its potential to accelerate training and improve convergence rates. There is also growing interest in multimodal federated learning, which leverages multiple data modalities to improve downstream inference performance while preserving privacy.
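
To make the adaptive idea concrete, below is a minimal sketch of FedAvg-style training in which each client stops its local steps once a gradient-difference proxy signals drift from the global model. The stopping rule, threshold, and synthetic data are illustrative assumptions, not the AMSFL algorithm itself.

```python
# Hedged sketch: adapt the number of local steps per round using a
# gradient-difference proxy for client drift. The rule and threshold
# are illustrative assumptions, not the AMSFL paper's method.
import numpy as np

rng = np.random.default_rng(0)

def grad(w, X, y):
    """Gradient of 0.5 * ||Xw - y||^2 / n for a linear model."""
    return X.T @ (X @ w - y) / len(y)

# Two clients with heterogeneous linear-regression data.
clients = []
for _ in range(2):
    X = rng.normal(size=(100, 5))
    y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)
    clients.append((X, y))

w_global = np.zeros(5)
lr, max_steps = 0.1, 10

for rnd in range(20):
    updates = []
    for X, y in clients:
        w = w_global.copy()
        g0 = grad(w, X, y)          # gradient at the global point
        for step in range(max_steps):
            g = grad(w, X, y)
            # Gradient-difference error proxy: stop local training
            # once the local gradient drifts too far from g0.
            if np.linalg.norm(g - g0) > 0.5 * np.linalg.norm(g0):
                break
            w -= lr * g
        updates.append(w - w_global)
    w_global += np.mean(updates, axis=0)   # FedAvg-style aggregation
```

Clients on harder (more heterogeneous) data hit the drift threshold sooner and take fewer local steps, which is the kind of communication/accuracy trade-off such adaptive methods aim to exploit.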

Beyond these algorithmic advances, researchers are developing cost-aware schedulers and serverless workflows to optimize resource utilization and reduce expenses in FL deployments. Novel algorithms and frameworks, such as those designed for jointcloud FaaS systems and federated split learning, target the challenges of vendor lock-in, communication overhead, and client heterogeneity.
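
As a rough illustration of cost-aware scheduling, the sketch below greedily picks the spot offer that minimizes total cost while meeting a training deadline. The instance names, prices, and throughput numbers are hypothetical, and a real scheduler such as FedCostAware must also handle spot interruptions and round synchronization.

```python
# Hedged sketch: greedy cost-aware selection of a spot instance for
# FL training. All offers and numbers below are hypothetical.
from dataclasses import dataclass

@dataclass
class SpotOffer:
    name: str
    price_per_hour: float   # current spot price (USD)
    rounds_per_hour: float  # training throughput on this instance

def pick_offer(offers, rounds_needed, deadline_hours):
    """Cheapest offer that still finishes within the deadline."""
    feasible = [o for o in offers
                if rounds_needed / o.rounds_per_hour <= deadline_hours]
    if not feasible:
        return None  # e.g. fall back to on-demand or relax the deadline
    return min(feasible,
               key=lambda o: o.price_per_hour * rounds_needed / o.rounds_per_hour)

offers = [
    SpotOffer("g4dn.xlarge", 0.16, 30.0),
    SpotOffer("p3.2xlarge", 0.92, 120.0),
]
best = pick_offer(offers, rounds_needed=600, deadline_hours=24)
print(best)  # g4dn.xlarge: slower, but cheaper and still within 24 h
```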

Some noteworthy papers in this area:

FedCostAware introduces a cost-aware scheduling algorithm that optimizes synchronous FL on cloud spot instances, reducing cloud computing costs.

Jointλ is a distributed runtime system that orchestrates serverless workflows across multiple FaaS systems without relying on a centralized orchestrator, achieving significant reductions in latency and cost.

The Panaceas for Improving Low-Rank Decomposition proposes techniques that enhance low-rank decomposition methods in communication-efficient FL, achieving faster convergence and superior accuracy (a generic low-rank compression sketch follows this list).

FSL-SAGE is a federated split learning algorithm that estimates server-side gradient feedback via auxiliary models, reducing communication costs and client memory requirements while achieving state-of-the-art convergence rates.
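
For the communication-efficiency thread, here is a minimal sketch of the generic low-rank idea: a client factors its dense weight update with a truncated SVD and uploads only the factors. This shows plain rank-r compression, not the specific improvements the Panaceas paper proposes.

```python
# Hedged sketch: rank-r SVD compression of a client's weight update,
# the generic idea behind low-rank methods in communication-efficient
# FL. This is a plain truncated SVD, not the paper's scheme.
import numpy as np

def compress(delta, rank):
    """Factor an update matrix into rank-r pieces for upload."""
    U, s, Vt = np.linalg.svd(delta, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

def decompress(U, s, Vt):
    """Server-side reconstruction of the (approximate) update."""
    return (U * s) @ Vt

rng = np.random.default_rng(0)
# Simulated client update: low-rank signal plus small noise.
delta = rng.normal(size=(256, 8)) @ rng.normal(size=(8, 128)) \
        + 0.01 * rng.normal(size=(256, 128))

U, s, Vt = compress(delta, rank=8)
sent = U.size + s.size + Vt.size             # floats actually uploaded
print(f"compression ratio: {delta.size / sent:.1f}x")
approx = decompress(U, s, Vt)
print(f"relative error: "
      f"{np.linalg.norm(delta - approx) / np.linalg.norm(delta):.3f}")
```

The rank trades reconstruction error against upload size; methods in this line of work differ mainly in how they pick and correct for that truncation.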

Sources

AMSFL: Adaptive Multi-Step Federated Learning via Gradient Difference-Based Error Modeling

FedCostAware: Enabling Cost-Aware Federated Learning on the Cloud

Multimodal Federated Learning: A Survey through the Lens of Different FL Paradigms

Jointλ: Orchestrating Serverless Workflows on Jointcloud FaaS Systems

Privacy-preserving Prompt Personalization in Federated Learning for Multimodal Large Language Models

An Empirical Study of Federated Prompt Learning for Vision Language Model

The Panaceas for Improving Low-Rank Decomposition in Communication-Efficient Federated Learning

FSL-SAGE: Accelerating Federated Split Learning via Smashed Activation Gradient Estimation

On Global Convergence Rates for Federated Policy Gradient under Heterogeneous Environment

Adaptive Federated LoRA in Heterogeneous Wireless Networks with Independent Sampling

Accelerated Training of Federated Learning via Second-Order Methods

Position: Federated Foundation Language Model Post-Training Should Focus on Open-Source Models
