Advances in Federated Learning

The field of federated learning is moving toward more efficient and decentralized solutions, with a focus on reducing communication overhead and improving model performance on non-IID data. Recent work has introduced novel frameworks and algorithms that enable faster convergence, higher accuracy, and stronger privacy preservation. Notably, the use of operator-theoretic frameworks, gravitational potential fields, and kernel machines has shown promising results in addressing the core challenges of federated learning. In addition, the integration of techniques such as contrastive learning, differential privacy, and adaptive layer-freezing has further improved the efficiency and effectiveness of federated learning models. Together, these advances could enable wider adoption of federated learning across applications ranging from healthcare to virtual reality.

Noteworthy papers include:

- Multi-Server FL with Overlapping Clients, which proposes a cloud-free multi-server FL framework that leverages overlapping clients as relays for inter-server model exchange.
- Prediction-space knowledge markets for communication-efficient federated learning on multimedia tasks, which introduces a prediction-space knowledge trading market for FL that reduces communication overhead while maintaining model performance.
- Topological Federated Clustering via Gravitational Potential Fields under Local Differential Privacy, which presents a privacy-preserving federated clustering approach that overcomes the limitations of distance-based methods under varying LDP.
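To make the communication-overhead point concrete, the sketch below contrasts weight-space exchange (clients upload full parameter vectors) with prediction-space exchange (clients upload only their outputs on a small shared public set, which the server averages into a consensus soft-label target). This is a minimal illustration of the general idea, not the actual market mechanism of the cited paper; all sizes and the random stand-in predictions are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 clients, a model with 100k parameters,
# and a shared public set of 200 examples over 10 classes.
n_clients, n_params = 3, 100_000
n_public, n_classes = 200, 10

# Weight-space exchange: each client uploads its full parameter vector.
weight_payload = n_clients * n_params  # floats sent per round

# Prediction-space exchange: each client uploads only its class
# probabilities on the shared public set.
pred_payload = n_clients * n_public * n_classes

# Random stand-in logits take the place of real client model outputs.
logits = rng.normal(size=(n_clients, n_public, n_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

# The server aggregates client predictions into a consensus soft-label
# target that clients can then distill from locally.
consensus = probs.mean(axis=0)  # shape: (n_public, n_classes)

print(f"weights per round:     {weight_payload:,} floats")
print(f"predictions per round: {pred_payload:,} floats")
print(f"compression ratio:     {weight_payload / pred_payload:.0f}x")
```

With these (illustrative) sizes, sharing predictions instead of weights cuts per-round upload volume by 50x, while the consensus distribution still carries inter-client knowledge back to each participant.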

Sources

Multi-Server FL with Overlapping Clients: A Latency-Aware Relay Framework

Prediction-space knowledge markets for communication-efficient federated learning on multimedia tasks

Topological Federated Clustering via Gravitational Potential Fields under Local Differential Privacy

Operator-Theoretic Framework for Gradient-Free Federated Learning

Delta Sum Learning: an approach for fast and global convergence in Gossip Learning

Feature-Based Semantics-Aware Scheduling for Energy-Harvesting Federated Learning

FDRMFL:Multi-modal Federated Feature Extraction Model Based on Information Maximization and Contrastive Learning

Decentralized Fairness Aware Multi Task Federated Learning for VR Network

Energy-Efficient Federated Learning via Adaptive Encoder Freezing for MRI-to-CT Conversion: A Green AI-Guided Research

Single-Round Scalable Analytic Federated Learning
