The field of graph neural networks is moving toward more efficient and scalable models, with an emphasis on reducing computational cost and runtime. This shift is driven by the need to handle the large-scale graphs that are increasingly common in real-world applications. Researchers are exploring new architectures and techniques, such as spiking graph neural networks and hierarchical representations, to improve both the performance and the efficiency of these models. Notable papers in this area include SGNNBench, which provides a comprehensive evaluation of spiking graph neural networks, and SHAKE-GNN, which introduces a scalable hierarchical graph neural network framework. WIRE (Wavelet-Induced Rotary Encodings) is also noteworthy: it extends rotary position encodings to graph-structured data and demonstrates desirable theoretical properties.
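For context, the sketch below shows the standard rotary position encoding (RoPE) that WIRE generalizes to graphs: pairs of feature dimensions are rotated by position-dependent angles, so that inner products between encoded vectors depend only on relative position. This is a minimal illustration of ordinary sequence RoPE, not WIRE itself, which replaces the sequence positions with wavelet-derived quantities; the function name and `base` parameter are illustrative choices.

```python
import numpy as np

def rotary_encode(x, pos, base=10000.0):
    """Apply a standard rotary position encoding (RoPE) to a feature
    vector x at integer position pos. Each pair of dimensions is
    rotated by a position-dependent angle."""
    d = x.shape[-1]
    assert d % 2 == 0, "feature dimension must be even"
    # One rotation frequency per dimension pair, geometrically spaced.
    freqs = base ** (-np.arange(0, d, 2) / d)
    theta = pos * freqs
    cos, sin = np.cos(theta), np.sin(theta)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

# Key property: the inner product of two encoded vectors depends
# only on the *relative* offset between their positions.
q = np.random.default_rng(0).normal(size=8)
k = np.random.default_rng(1).normal(size=8)
dot_a = rotary_encode(q, 3) @ rotary_encode(k, 5)    # offset 2
dot_b = rotary_encode(q, 10) @ rotary_encode(k, 12)  # offset 2
```

The relative-position property is what makes rotary encodings attractive to generalize: on a graph there is no natural ordering of nodes, so the challenge WIRE addresses is defining the rotation angles from graph structure (via wavelets) rather than from sequence indices.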