Research on graph neural networks is increasingly focused on capturing long-range dependencies and complex structural patterns. Recent approaches to modeling long-range interactions include adaptive random walks, second-order tensorial partial differential equations on graphs, and graph wavelet networks. These methods aim to scale to large graphs while capturing high-frequency information and preserving global structure. Notable papers include:
- Learn to Jump, which proposes a novel approach exploiting hierarchical graph structures and adaptive random walks to address long-range dependencies.
- Second-Order Tensorial Partial Differential Equations on Graphs, which introduces a theoretically grounded framework for second-order continuous product graph neural networks.
- Learning Laplacian Eigenvectors, which proposes a framework for pre-training graph neural networks by inductively learning Laplacian eigenvectors (a minimal sketch of this objective follows the list).
- Long-Range Graph Wavelet Networks, which decomposes wavelet filters into complementary local and global components to capture structure at both scales (a second sketch below illustrates such a split).
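
The Laplacian-eigenvector objective is concrete enough to sketch. The snippet below is a minimal illustration, not the paper's implementation: it builds the combinatorial Laplacian of a toy undirected graph and extracts the low-frequency eigenvectors that a node encoder could be trained to regress during pre-training. The function name and the toy graph are assumptions for illustration only.

```python
import numpy as np

def laplacian_eigenvector_targets(edges, num_nodes, k):
    """Smallest-eigenvalue eigenvectors of the combinatorial Laplacian L = D - A,
    usable as per-node regression targets when pre-training a GNN encoder."""
    A = np.zeros((num_nodes, num_nodes))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0                # undirected adjacency
    L = np.diag(A.sum(axis=1)) - A             # combinatorial Laplacian
    _, eigvecs = np.linalg.eigh(L)             # eigh returns eigenvalues in ascending order
    return eigvecs[:, :k]                      # one k-dimensional target row per node

# Toy usage on a 4-cycle: a pre-training loss would push node embeddings
# (or a linear head on them) towards these spectral targets.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
targets = laplacian_eigenvector_targets(edges, num_nodes=4, k=2)
print(targets.shape)  # (4, 2)
```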
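The local/global wavelet split can likewise be sketched. The version below assumes a standard spectral graph wavelet kernel g(lambda) = s*lambda*exp(-s*lambda), takes its low-order Taylor truncation in the Laplacian as the local component (which only mixes information within a few hops), and treats the exact spectral response minus that truncation as the global component; the kernel choice and names are illustrative assumptions, not the paper's construction.

```python
import math
import numpy as np

def split_wavelet_filter(L, x, scale=1.0, local_order=2):
    """Apply the wavelet g(lambda) = scale*lambda*exp(-scale*lambda) to signal x,
    split into a local part (polynomial in L, at most `local_order`-hop mixing)
    and a global remainder carrying the long-range information."""
    local = np.zeros_like(x)
    Lkx = x.copy()
    for j in range(local_order):
        Lkx = L @ Lkx                                    # L^(j+1) x
        local = local + scale * (-scale) ** j / math.factorial(j) * Lkx
    eigvals, U = np.linalg.eigh(L)                       # exact filter via eigendecomposition
    g = scale * eigvals * np.exp(-scale * eigvals)       # band-pass wavelet kernel
    full = U @ (g * (U.T @ x))
    return local, full - local                           # (local component, global component)

# Toy usage: impulse signal on a 5-node path graph.
A = np.diag(np.ones(4), 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A
x = np.zeros(5)
x[0] = 1.0
local, global_part = split_wavelet_filter(L, x)
print(np.nonzero(local)[0])           # local part touches only nodes within 2 hops
print(np.round(local + global_part, 3))  # the two parts sum to the full wavelet response
```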