Advancements in Federated Learning and Distributed Systems

Research in federated learning and distributed systems is converging on solutions that are communication-efficient, privacy-preserving, and adaptable to heterogeneous hardware. Current work focuses on reducing communication costs, improving model accuracy, and scaling systems to larger deployments, and recent papers introduce new methods for split learning, federated graph learning, and decentralized AI-driven architectures. These advances have direct implications for applications such as healthcare, smart grids, and edge computing. Notably, Checkmate, FedSkipTwin, and ACME contribute new approaches to zero-overhead model checkpointing, digital-twin-guided client skipping, and adaptive customization of large models, respectively.
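To make the communication-saving theme concrete, here is a minimal sketch of a FedAvg-style training loop in which a client skips an upload round when its local update is too small to matter. This is a generic illustration, not the actual FedSkipTwin algorithm; the synthetic least-squares task and the `SKIP_THRESHOLD` knob are assumptions made for the example.

```python
# Illustrative sketch only (not the FedSkipTwin method): FedAvg with
# norm-based client skipping to reduce the number of uploads per round.
import numpy as np

rng = np.random.default_rng(0)

DIM, CLIENTS, ROUNDS = 10, 5, 20
SKIP_THRESHOLD = 1e-3  # hypothetical knob: minimum update norm worth sending

# Synthetic local data: each client fits a shared linear model w.
true_w = rng.normal(size=DIM)
data = []
for _ in range(CLIENTS):
    X = rng.normal(size=(50, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    data.append((X, y))

def local_update(w, X, y, lr=0.01, steps=5):
    """A few steps of local gradient descent on squared error; returns the model delta."""
    w_local = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w_local - y) / len(y)
        w_local -= lr * grad
    return w_local - w

w_global = np.zeros(DIM)
for rnd in range(ROUNDS):
    deltas, uploads = [], 0
    for X, y in data:
        delta = local_update(w_global, X, y)
        # Communication saving: skip the upload when the update is negligible.
        if np.linalg.norm(delta) < SKIP_THRESHOLD:
            continue
        deltas.append(delta)
        uploads += 1
    if deltas:
        w_global += np.mean(deltas, axis=0)  # FedAvg-style aggregation
    print(f"round {rnd:2d}: uploads={uploads}/{CLIENTS}, "
          f"error={np.linalg.norm(w_global - true_w):.4f}")
```

As training converges, per-client deltas shrink below the threshold and uploads taper off, which is the intuition behind skipping-based communication reduction; real systems replace the simple norm test with a smarter predictor of whether a round is worth transmitting.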

Sources

Checkmate: Zero-Overhead Model Checkpointing via Network Gradient Replication

FedSkipTwin: Digital-Twin-Guided Client Skipping for Communication-Efficient Federated Learning

Semi-Supervised Federated Learning via Dual Contrastive Learning and Soft Labeling for Intelligent Fault Diagnosis

On Splitting Lightweight Semantic Image Segmentation for Wireless Communications

ACME: Adaptive Customization of Large Models via Distributed Systems

Optimal Batch-Size Control for Low-Latency Federated Learning with Device Heterogeneity

Federated Split Learning with Improved Communication and Storage Efficiency

Decentralized AI-driven IoT Architecture for Privacy-Preserving and Latency-Optimized Healthcare in Pandemic and Critical Care Scenarios

Multi-Agent Reinforcement Learning for Sample-Efficient Deep Neural Network Mapping

A Comprehensive Data-centric Overview of Federated Graph Learning

An Experimental Study of Split-Learning TinyML on Ultra-Low-Power Edge/IoT Nodes

FOGNITE: Federated Learning-Enhanced Fog-Cloud Architecture

P3SL: Personalized Privacy-Preserving Split Learning on Heterogeneous Edge Devices

Eco-Friendly AI: Unleashing Data Power for Green Federated Learning

Caching Techniques for Reducing the Communication Cost of Federated Learning in IoT Environments

PowerTrip: Exploiting Federated Heterogeneous Datacenter Power for Distributed ML Training

A Novel Coded Computing Approach for Distributed Multi-Task Learning

FedSA-GCL: A Semi-Asynchronous Federated Graph Learning Framework with Personalized Aggregation and Cluster-Aware Broadcasting
