Advances in Federated Learning

The field of federated learning is moving toward more efficient and decentralized solutions, with a focus on reducing communication overhead and improving model performance on non-IID data. Recent work has introduced novel frameworks and algorithms that enable faster convergence, higher accuracy, and stronger privacy preservation. Notably, the use of operator-theoretic frameworks, gravitational potential fields, and kernel machines has shown promising results in addressing core challenges of federated learning. In addition, the integration of techniques such as contrastive learning, differential privacy, and adaptive layer-freezing has further improved the efficiency and effectiveness of federated models. Together, these advances have the potential to enable wider adoption of federated learning in applications ranging from healthcare to virtual reality.

Noteworthy papers include:
- Multi-Server FL with Overlapping Clients, which proposes a cloud-free multi-server FL framework that uses overlapping clients as relays for inter-server model exchange.
- Prediction-space knowledge markets for communication-efficient federated learning on multimedia tasks, which introduces a prediction-space knowledge trading market that reduces communication overhead while maintaining model performance (a sketch of the general prediction-space exchange idea appears below).
- Topological Federated Clustering via Gravitational Potential Fields under Local Differential Privacy, which presents a privacy-preserving federated clustering approach that overcomes the limitations of distance-based methods under varying LDP (the second sketch below illustrates why LDP noise breaks naive distance-based clustering).
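The market mechanism of the prediction-space paper is not reproduced here; the following is a minimal numpy sketch of the underlying idea that makes such schemes communication-efficient: clients upload predictions on a small shared public set instead of their model weights. All names, model shapes, and sizes (softmax, client_weights, X_pub) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical setup: each client holds a linear "model" trained on private
# data; all parties can see a shared public set X_pub. Instead of sending
# weight matrices, clients send predictions on X_pub (the "prediction space"),
# which is typically far smaller than the model itself.
rng = np.random.default_rng(0)
n_clients, dim, n_pub, n_classes = 5, 10_000, 100, 3

X_pub = rng.normal(size=(n_pub, dim))
client_weights = [rng.normal(size=(dim, n_classes)) for _ in range(n_clients)]

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Each client uploads only its soft predictions on the public set.
client_preds = [softmax(X_pub @ W) for W in client_weights]

# Server aggregates in prediction space (here a plain average; a knowledge
# "market" would instead weight contributions by their estimated value).
consensus = np.mean(client_preds, axis=0)

# Communication comparison per round: predictions vs. full weights.
pred_bytes = client_preds[0].nbytes      # 100 x 3 floats
weight_bytes = client_weights[0].nbytes  # 10,000 x 3 floats
print(f"uploaded {pred_bytes} B instead of {weight_bytes} B "
      f"({weight_bytes / pred_bytes:.0f}x smaller)")
```

Clients would then distill the consensus predictions back into their local models; the communication saving comes entirely from the fact that the public-set prediction matrix is independent of model size.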
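The gravitational-potential-field method itself is not shown here; this minimal sketch, under assumed parameters (Laplace mechanism, unit sensitivity, 2-D toy clusters), only illustrates the local-DP setting the clustering paper targets and why it degrades distance-based methods.

```python
import numpy as np

# Local DP: each client perturbs its own point on-device, so the server only
# ever sees noisy data. The Laplace scale grows as epsilon shrinks, which is
# what distorts pairwise distances and breaks naive distance-based clustering.
rng = np.random.default_rng(1)

def ldp_perturb(x, epsilon, sensitivity=1.0):
    """Laplace mechanism applied on-device to a single client's point."""
    scale = sensitivity / epsilon
    return x + rng.laplace(loc=0.0, scale=scale, size=x.shape)

# Two well-separated clusters of client points in 2-D (centers ~7.07 apart).
points = np.vstack([rng.normal(0, 0.3, size=(50, 2)),
                    rng.normal(5, 0.3, size=(50, 2))])

for eps in (5.0, 1.0, 0.1):
    noisy = np.array([ldp_perturb(p, eps) for p in points])
    displacement = np.linalg.norm(noisy - points, axis=1).mean()
    # Once the per-point displacement dwarfs the ~7.07 gap between the true
    # centers, distance-based clustering can no longer separate the clusters.
    print(f"epsilon={eps:>4}: mean per-point displacement = {displacement:.2f}")
```

At small epsilon the noise displacement exceeds the cluster separation, which motivates clustering signals other than raw distances, such as the potential fields the paper proposes.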
Sources
Prediction-space knowledge markets for communication-efficient federated learning on multimedia tasks
Topological Federated Clustering via Gravitational Potential Fields under Local Differential Privacy
FDRMFL: Multi-modal Federated Feature Extraction Model Based on Information Maximization and Contrastive Learning