Large Language Models in Graph Learning

The field of graph learning is advancing rapidly through the integration of large language models (LLMs). Recent work shows that LLMs can learn effective graph representations, particularly when graph structures are complex or dynamic, and researchers are exploring ways to combine LLMs with graph neural networks (GNNs) so as to leverage the strengths of both paradigms. One key direction uses LLMs to generate text-based representations of graph nodes and edges, which are then fed to GNNs as input features; another uses LLMs as a preprocessing step that extracts relevant information from graph-structured data before a GNN is applied. These approaches have the potential to improve graph learning across a wide range of applications, including recommendation systems, social network analysis, and natural language processing.

Noteworthy papers include: Temporal Graph Talker, which introduces a framework for temporal graph learning with LLMs; Rel-LLM, which proposes a GNN-based encoder that generates structured relational prompts for LLMs; and Graph-MLLM, which presents a comprehensive benchmark for multimodal graph learning with multimodal LLMs (MLLMs).
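To make the first approach above concrete, the following is a minimal sketch of feeding LLM-generated text embeddings into a GNN for a text-attributed graph. It uses sentence-transformers and PyTorch Geometric purely for illustration; the model name, example node texts, dimensions, and the TextGNN class are assumptions for this sketch and do not come from the papers listed below.

```python
# Minimal sketch: LLM-generated text embeddings as initial node features for a GNN.
# Model names, node texts, and dimensions below are illustrative assumptions.
import torch
from sentence_transformers import SentenceTransformer
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# 1) Encode each node's textual description with a pretrained language model.
node_texts = [
    "User who frequently reviews sci-fi novels",
    "Product: wireless noise-cancelling headphones",
    "User interested in graph machine learning papers",
]
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence encoder works here
x = torch.tensor(encoder.encode(node_texts), dtype=torch.float)  # [num_nodes, 384]

# 2) Build the graph, using the text embeddings as initial node features.
edge_index = torch.tensor([[0, 1, 2, 1],
                           [1, 0, 1, 2]], dtype=torch.long)  # COO edge list
graph = Data(x=x, edge_index=edge_index)

# 3) A small GNN refines the text-derived features with structural information.
class TextGNN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, data):
        h = self.conv1(data.x, data.edge_index).relu()
        return self.conv2(h, data.edge_index)

model = TextGNN(in_dim=x.size(1), hidden_dim=64, num_classes=2)
logits = model(graph)  # per-node class logits
```

In practice the sentence encoder could be replaced by a much larger (typically frozen) LLM, but the basic pattern of text-derived node features followed by message passing stays the same.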

Sources

Are Large Language Models Good Temporal Graph Learners?

Large Language Models are Good Relational Learners

Discrete Minds in a Continuous World: Do Language Models Know Time Passes?

Research on Personalized Financial Product Recommendation by Integrating Large Language Models and Graph Neural Networks

MATP-BENCH: Can MLLM Be a Good Automated Theorem Prover for Multimodal Problems?

Masked Language Models are Good Heterogeneous Graph Generalizers

H$^2$GFM: Towards unifying Homogeneity and Heterogeneity on Text-Attributed Graphs

Graph Prompting for Graph Learning Models: Recent Advances and Future Directions

HSG-12M: A Large-Scale Spatial Multigraph Dataset

Transaction Categorization with Relational Deep Learning in QuickBooks

Towards Multi-modal Graph Large Language Model

NOCL: Node-Oriented Conceptualization LLM for Graph Tasks without Message Passing

Graph-MLLM: Harnessing Multimodal Large Language Models for Multimodal Graph Learning

Macro Graph of Experts for Billion-Scale Multi-Task Recommendation
