The field of graph neural networks (GNNs) is advancing rapidly, with research focused on overcoming the limitations of existing models and on designing more effective and efficient architectures. One key direction is the development of new attention mechanisms and message passing schemes that better capture the complex structures and relationships in graph data. Another is the study of homophily and heterophily, i.e., whether connected nodes tend to be similar or dissimilar, and the design of models that handle graphs with differing neighbor distribution patterns. There is also growing interest in federated graph learning, where models are trained across multiple graphs while preserving data privacy. Overall, the field is moving toward more robust, flexible, and generalizable models that can handle a wide range of graph-based tasks.

Notable papers in this area include:

- Graph Fourier Transformer with Structure-Frequency Information, which proposes an attention mechanism that combines graph structural information with node frequency information.
- Heterophily-informed Message Passing, which introduces a scheme that regulates message aggregation to mitigate oversmoothing (a generic sketch of heterophily-aware aggregation follows below).
- Learning Laplacian Positional Encodings for Heterophilous Graphs, which proposes a positional encoding that leverages the full spectrum of the graph Laplacian (a baseline sketch follows after that).
- FedHERO, a federated learning approach for node classification on heterophilic graphs.
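The message passing item above concerns regulating how neighbor information is aggregated. As a generic illustration only, not the scheme from the Heterophily-informed Message Passing paper, the following NumPy sketch shows one common heterophily-aware design: projecting a node's own features and the mean of its neighbors' features separately and concatenating them, so dissimilar neighbors cannot wash out the ego signal. The function name and weight shapes are hypothetical.

```python
import numpy as np

def heterophily_aware_layer(x, adj, w_self, w_neigh):
    """One message-passing layer that keeps ego and neighbor representations
    in separate channels instead of averaging them together.

    x:       (n, d) node features
    adj:     (n, n) adjacency matrix without self-loops (hypothetical dense form)
    w_self:  (d, h) projection for a node's own features
    w_neigh: (d, h) projection for the mean of its neighbors' features
    """
    deg = adj.sum(axis=1, keepdims=True)
    neigh_mean = (adj @ x) / np.maximum(deg, 1.0)  # mean over neighbors
    # Concatenation preserves the ego signal separately from the neighbor
    # signal, so dissimilar neighbors cannot overwrite a node's own features.
    h = np.concatenate([x @ w_self, neigh_mean @ w_neigh], axis=1)
    return np.maximum(h, 0.0)  # ReLU
```

Keeping the two channels separate is one widely used way to limit oversmoothing on heterophilous graphs; fully mixing them, as a plain mean-aggregating GCN layer does, tends to pull connected but dissimilar nodes toward the same representation.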
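For the positional encoding item, here is a minimal NumPy sketch of the standard baseline Laplacian positional encoding: the first few non-trivial eigenvectors of the symmetric normalized Laplacian used as extra node features. The paper above goes further by leveraging the full spectrum, which this baseline does not attempt; the function name is hypothetical.

```python
import numpy as np

def laplacian_positional_encoding(adj: np.ndarray, k: int) -> np.ndarray:
    """Return the first k non-trivial Laplacian eigenvectors as node features.

    adj: (n, n) symmetric adjacency matrix (dense).
    k:   number of eigenvectors to keep per node.
    """
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros(n)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # eigh returns eigenvalues in ascending order for symmetric matrices
    _, eigvecs = np.linalg.eigh(lap)
    # Drop the first (trivial, near-constant) eigenvector; keep the next k
    return eigvecs[:, 1:k + 1]
```

Note that each eigenvector is only defined up to sign, so implementations typically randomize or fix eigenvector signs during training to avoid the model overfitting to an arbitrary choice.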