The field of graph learning is advancing rapidly, with growing emphasis on foundation models that learn generalizable representations from large-scale graph data. Recent work shows that pretraining graph foundation models on synthetic graphs enables them to capture complex structural dependencies and reach state-of-the-art results on diverse real-world graph datasets. A second active thread is graph optimization, where the pretrain-transfer paradigm has been applied successfully to distance-based optimization problems on graph structures.

Noteworthy papers include GraphPFN, a prior-data fitted graph foundation model that achieves strong in-context learning performance and state-of-the-art results after finetuning; Graph Foundation Models, a framework that uses a single pretrained graph foundation model to solve distance-based optimization problems on graph structures; UniPrompt, a graph prompt learning method that adapts arbitrary pretrained models to downstream scenarios; and GUIDE, a framework that integrates Large Language Model-generated adjacency matrices with observational data for DAG estimation, yielding notable gains in computational efficiency and accuracy. A sketch of the general prompt-learning recipe follows below.
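To make the prompt-learning idea concrete, the sketch below shows the general recipe behind methods in this family: freeze a pretrained graph encoder and train only a small set of prompt vectors (plus a task head) on the downstream data. This is a minimal illustration, not UniPrompt's actual architecture or API; the names `PromptedGNN` and `FrozenGNNLayer`, the attention-based prompt aggregation, and all hyperparameters are assumptions made for the example.

```python
import torch
import torch.nn as nn

class FrozenGNNLayer(nn.Module):
    """Stand-in for one layer of a pretrained GNN (weights frozen)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        for p in self.parameters():
            p.requires_grad = False  # simulate a pretrained, frozen encoder

    def forward(self, x, adj):
        # Mean aggregation over neighbors, then a linear transform.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return torch.relu(self.lin(adj @ x / deg))

class PromptedGNN(nn.Module):
    """Frozen encoder plus learnable prompt vectors (illustrative only).

    Only the prompt tokens and the task head receive gradients, which is
    the general pattern behind graph prompt learning methods.
    """
    def __init__(self, dim, num_layers, num_classes, num_prompts=4):
        super().__init__()
        self.layers = nn.ModuleList(FrozenGNNLayer(dim) for _ in range(num_layers))
        # Learnable prompt tokens; each node soft-attends to them and the
        # aggregated prompt is added to its input features.
        self.prompts = nn.Parameter(torch.randn(num_prompts, dim) * 0.01)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x, adj):
        weights = torch.softmax(x @ self.prompts.t(), dim=-1)  # (N, num_prompts)
        x = x + weights @ self.prompts                         # prompt-adjusted features
        for layer in self.layers:
            x = layer(x, adj)
        return self.head(x)

# Toy usage: 10 nodes, 16-dim features, 3 classes, random graph.
x = torch.randn(10, 16)
adj = (torch.rand(10, 10) < 0.3).float()
labels = torch.randint(0, 3, (10,))

model = PromptedGNN(dim=16, num_layers=2, num_classes=3)
trainable = [p for p in model.parameters() if p.requires_grad]
opt = torch.optim.Adam(trainable, lr=1e-2)  # prompts + head only

loss = nn.functional.cross_entropy(model(x, adj), labels)
loss.backward()
opt.step()
```

The key design choice this illustrates is parameter efficiency: because the encoder stays frozen, only a handful of prompt parameters are tuned per downstream scenario, which is what allows a single pretrained model to be adapted cheaply across tasks.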