Research at the intersection of federated learning and graph neural networks is evolving rapidly, with a focus on improving communication efficiency, handling data heterogeneity, and strengthening security. Researchers are exploring novel approaches to optimizing model training and update sharing, such as condensed graphs, spherical linear interpolation of model weights, and blockchain-based verification. These advances aim to make collaborative learning across distributed clients and graphs more efficient, secure, and robust, with frameworks ranging from post-quantum secure blockchain-based federated learning to decentralized federated prototype learning reporting significant gains in performance and communication efficiency. Noteworthy papers include:

- PQS-BFL, which integrates post-quantum cryptography with blockchain verification to secure federated learning against quantum adversaries.
- Plexus, which proposes a 3D-parallel approach to full-graph GNN training that scales to billion-edge graphs with substantial speedups.
- FRAIN, a fast-and-reliable asynchronous federated learning method that mitigates client drift and staleness (a slerp-style blending sketch follows this list).
- FedBWO, which improves communication efficiency by having clients transmit only a performance score rather than local model weights (see the score-only round sketch below).
- DFPL, a decentralized federated prototype learning framework that improves performance across heterogeneous data distributions (see the prototype-aggregation sketch below).
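To make the spherical-interpolation idea concrete, here is a minimal sketch of slerp between two flattened weight vectors. The function name and the staleness-based choice of `t` are assumptions for illustration, not FRAIN's published aggregation rule.

```python
import numpy as np

def slerp(w_global, w_client, t):
    """Spherical linear interpolation between two flattened weight vectors.

    Hypothetical helper for illustration; real systems would flatten and
    restore per-layer parameters around this call.
    """
    a = np.asarray(w_global, dtype=np.float64)
    b = np.asarray(w_client, dtype=np.float64)
    # Angle between the two weight vectors.
    cos_omega = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.isclose(omega, 0.0):
        # Nearly parallel vectors: slerp degenerates to linear interpolation.
        return (1.0 - t) * a + t * b
    return (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)

rng = np.random.default_rng(0)
w_global = rng.normal(size=1000)   # current global model, flattened
w_client = rng.normal(size=1000)   # (possibly stale) client update
w_new = slerp(w_global, w_client, t=0.3)  # small t down-weights the stale update
```

Unlike plain weighted averaging, slerp follows the arc between the two weight vectors rather than the chord, which is one intuition for why it can soften the impact of stale or drifted client updates.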
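The score-only communication pattern described for FedBWO can be sketched as a black-box search in which the server proposes candidate weights and each client uploads a single scalar. Everything below (the `local_score` stand-in, the mutation step, the population size) is a hypothetical simplification of that pattern, not the paper's actual Black Widow Optimization algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, POP, ROUNDS, CLIENTS = 100, 8, 20, 5

def local_score(w, client_id):
    # Stand-in for a client's private evaluation: in a real deployment each
    # client computes its local loss/accuracy and uploads only this scalar.
    target = np.random.default_rng(client_id).normal(size=DIM)
    return -np.linalg.norm(w - target)  # higher is better

population = rng.normal(size=(POP, DIM))  # candidate weight vectors
for _ in range(ROUNDS):
    # Each round: broadcast candidates, collect one scalar per client,
    # aggregate scores on the server.
    scores = np.array([
        np.mean([local_score(w, c) for c in range(CLIENTS)])
        for w in population
    ])
    best = population[np.argmax(scores)]
    # Generic mutate-around-the-best step, standing in for the Black Widow
    # Optimization moves; only scores ever crossed the network.
    population = best + 0.1 * rng.normal(size=(POP, DIM))
    population[0] = best  # elitism: keep the best candidate
print("best score:", scores.max())
```

The key point of the pattern is the payload: each client sends one float per round instead of a full weight vector, trading optimization power for drastically lower uplink cost.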
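Prototype-based schemes such as DFPL cut communication by exchanging per-class feature means instead of full models. The sketch below shows that general pattern under assumed helper names; DFPL's actual decentralized aggregation and training objective may differ.

```python
import numpy as np

def local_prototypes(features, labels, num_classes):
    """Per-class mean embeddings computed on one client's private data."""
    return {c: features[labels == c].mean(axis=0)
            for c in range(num_classes) if np.any(labels == c)}

def merge_prototypes(peer_protos):
    """Average each class prototype over the peers that reported it.

    Only these small per-class vectors are exchanged between peers,
    never the model weights themselves.
    """
    merged = {}
    for protos in peer_protos:
        for c, p in protos.items():
            merged.setdefault(c, []).append(p)
    return {c: np.mean(ps, axis=0) for c, ps in merged.items()}

# Toy usage: two peers with disjoint (heterogeneous) class distributions.
rng = np.random.default_rng(0)
f1, y1 = rng.normal(size=(50, 16)), rng.integers(0, 2, size=50)  # classes 0-1
f2, y2 = rng.normal(size=(50, 16)), rng.integers(2, 4, size=50)  # classes 2-3
global_protos = merge_prototypes([
    local_prototypes(f1, y1, num_classes=4),
    local_prototypes(f2, y2, num_classes=4),
])
```

Because the per-round payload is at most num_classes x embedding_dim floats, it is independent of model size, which is what makes prototype exchange attractive under heterogeneous data and tight communication budgets.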