Advances in Graph Neural Networks

The field of graph neural networks (GNNs) is evolving rapidly, with a focus on improving efficiency, expressivity, and scalability. Recent work centers on the limitations of traditional GNNs, such as the neighbor explosion problem, over-squashing, and poor performance on heterophilic graphs. Proposed solutions include pre-computation-based methods, neighborhood-contextualized message passing, and staleness-aware training algorithms. There is also growing interest in new architectures, such as graph Transformers and complex-weighted convolutional networks, which have shown promise in enhancing GNN expressiveness. Noteworthy papers include Echoless Label-Based Pre-computation for Memory-Efficient Heterogeneous Graph Learning, which eliminates training label leakage, and VISAGNN, which dynamically incorporates staleness criteria into the training process. Other notable works, Adaptive Graph Rewiring to Mitigate Over-Squashing in Mesh-Based GNNs and Connectivity-Guided Sparsification of 2-FWL GNNs, demonstrate the potential for more efficient and expressive GNNs.
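To ground the message-passing idea that the methods above build on, here is a minimal sketch of a single neighborhood-aggregation layer in plain NumPy. The toy graph, feature dimensions, and fixed weight matrix are illustrative assumptions, not taken from any of the cited papers; it shows the mean-aggregation form of message passing whose repeated application over long paths is what rewiring methods try to relieve.

```python
import numpy as np

# Hypothetical toy graph: 4 nodes on a cycle, undirected edges as (src, dst) pairs.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
num_nodes, feat_dim, out_dim = 4, 8, 4

rng = np.random.default_rng(0)
X = rng.normal(size=(num_nodes, feat_dim))   # node features
W = rng.normal(size=(feat_dim, out_dim))     # weight matrix (fixed here; learned in practice)

# Build a symmetric adjacency matrix with self-loops, then row-normalize:
# each node's updated state is an average over its neighborhood's messages.
A = np.zeros((num_nodes, num_nodes))
for s, d in edges:
    A[s, d] = A[d, s] = 1.0
A += np.eye(num_nodes)                        # self-loops so a node keeps its own signal
A_hat = A / A.sum(axis=1, keepdims=True)      # row-normalized (mean aggregation)

H = np.maximum(A_hat @ X @ W, 0.0)            # one message-passing layer with ReLU
print(H.shape)                                # (4, 4): one updated embedding per node
```

Stacking k such layers lets information travel k hops, which is where over-squashing arises: messages from exponentially many distant nodes get compressed into fixed-size vectors, motivating the rewiring and sparsification approaches listed below.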
Sources
Adaptive Graph Rewiring to Mitigate Over-Squashing in Mesh-Based GNNs for Fluid Dynamics Simulations
Connectivity-Guided Sparsification of 2-FWL GNNs: Preserving Full Expressivity with Improved Efficiency