Advances in Federated Learning and Graph Neural Networks

The field of federated learning and graph neural networks is evolving rapidly, with a focus on improving communication efficiency, addressing data heterogeneity, and strengthening security. Researchers are exploring novel approaches to model training and update sharing, such as training on condensed graphs, merging updates via spherical linear interpolation (illustrated in the sketch after the list below), and anchoring verification in blockchains. These advances aim to enable more efficient, secure, and robust collaborative learning across distributed clients and graphs. Noteworthy papers include:

PQS-BFL integrates post-quantum cryptography with blockchain verification to secure federated learning against quantum adversaries.

Plexus proposes a 3D-parallel approach to full-graph GNN training that scales to billion-edge graphs and reports substantial speedups.

FRAIN introduces a fast-and-reliable asynchronous federated learning method that mitigates client drift and staleness.

FedBWO improves communication efficiency by having each client transmit only a performance score rather than its local model weights.

DFPL proposes a decentralized federated prototype learning framework that improves performance across heterogeneous data distributions (a minimal prototype-exchange sketch follows the code example below).
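The spherical linear interpolation mentioned above can be illustrated generically. The sketch below is a minimal SLERP routine over flattened weight vectors, assuming the standard geometric formulation; the function name, the linear-interpolation fallback, and the mixing factor t are illustrative assumptions, not the published merging rule of any of the papers listed.

```python
import numpy as np

def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight vectors.

    Follows the great-circle arc between w_a and w_b rather than the straight
    chord used by plain averaging. Hypothetical sketch, not a paper's exact rule.
    """
    # Angle between the two weight vectors.
    cos_theta = np.clip(
        np.dot(w_a, w_b) / (np.linalg.norm(w_a) * np.linalg.norm(w_b) + eps),
        -1.0,
        1.0,
    )
    theta = np.arccos(cos_theta)
    if theta < eps:
        # Nearly parallel weights: SLERP degenerates, so fall back to LERP.
        return (1.0 - t) * w_a + t * w_b
    coef_a = np.sin((1.0 - t) * theta) / np.sin(theta)
    coef_b = np.sin(t * theta) / np.sin(theta)
    return coef_a * w_a + coef_b * w_b

# Example: fold a peer's update into the local model; t = 0.3 is an arbitrary
# mixing factor (a real system might derive it from update staleness).
local, remote = np.random.randn(10), np.random.randn(10)
merged = slerp(local, remote, t=0.3)
```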

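Similarly, the prototype exchange behind frameworks like DFPL can be sketched in generic form. The snippet below assumes mean class embeddings as prototypes and unweighted averaging across clients; both choices are illustrative assumptions, and DFPL's actual aggregation may differ.

```python
import numpy as np

def local_prototypes(embeddings: np.ndarray, labels: np.ndarray) -> dict[int, np.ndarray]:
    """One prototype per class: the mean embedding of that class's samples."""
    return {int(c): embeddings[labels == c].mean(axis=0) for c in np.unique(labels)}

def aggregate_prototypes(client_protos: list[dict[int, np.ndarray]]) -> dict[int, np.ndarray]:
    """Average each class prototype across the clients that observed that class."""
    merged: dict[int, list[np.ndarray]] = {}
    for protos in client_protos:
        for c, p in protos.items():
            merged.setdefault(c, []).append(p)
    return {c: np.mean(ps, axis=0) for c, ps in merged.items()}
```

Sharing only these low-dimensional per-class vectors, rather than full model weights, is what makes prototype-based schemes communication-efficient under heterogeneous data.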
Sources

PQS-BFL: A Post-Quantum Secure Blockchain-based Federated Learning Framework

Rethinking Federated Graph Learning: A Data Condensation Perspective

Plexus: Taming Billion-edge Graphs with 3D Parallel GNN Training

FRAIN to Train: A Fast-and-Reliable Solution for Decentralized Federated Learning

FedBWO: Enhancing Communication Efficiency in Federated Learning

DFPL: Decentralized Federated Prototype Learning Across Heterogeneous Data Distributions
