Advances in Integrating Large Language Models with Graphs

Research at the intersection of natural language processing and graph learning is shifting toward integrating large language models (LLMs) with graphs, combining the semantic understanding of LLMs with the structured reasoning that graphs afford. This integration can improve a range of applications, including recommendation systems, biomedical analysis, and knowledge-intensive question answering. Recent work develops frameworks and architectures that combine the strengths of both, including sequential, parallel, and multi-module designs, alongside growing efforts to address over-aggregation, scalability, and interpretability in graph learning. Noteworthy papers in this area include:

- Large Language Models Meet Text-Attributed Graphs: a comprehensive survey of LLM-TAG integration frameworks and applications.
- Relieving the Over-Aggregating Effect in Graph Transformers: a plug-and-play method that mitigates over-aggregation in graph attention.
- ATOM: a few-shot, scalable approach for building and continuously updating temporal knowledge graphs from unstructured text.
- BambooKG: a neurobiologically inspired frequency-weight knowledge graph that strengthens retrieval-augmented generation.
- LINK-KG: a modular framework for constructing coreference-resolved knowledge graphs of human smuggling networks.
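To make the over-aggregation issue concrete: in graph attention, every node can end up drawing a little weight from very many neighbors, diluting the signal from the few that matter. The sketch below shows one generic mitigation, restricting each node to its top-k neighbors before softmax normalization. This is an illustrative sparsification heuristic written for this digest, not the specific method proposed in the Over-Aggregating paper; the function name and the top-k rule are assumptions.

```python
import math

def topk_attention(scores, k):
    """Sparsified attention for one node: keep only the k largest raw
    attention logits over its neighbors, softmax-normalize those, and
    give every other neighbor weight 0. Returns weights summing to 1."""
    # indices of the k highest-scoring neighbors
    keep = sorted(range(len(scores)), key=lambda j: scores[j], reverse=True)[:k]
    m = max(scores[j] for j in keep)  # shift for numerical stability
    exps = {j: math.exp(scores[j] - m) for j in keep}
    z = sum(exps.values())
    return [exps.get(j, 0.0) / z for j in range(len(scores))]

# One node with four neighbors: only the two strongest survive.
weights = topk_attention([0.1, 2.0, -1.0, 3.5], k=2)
```

Because the masking happens on the logits before normalization, a module like this can be dropped in front of an existing attention layer without retraining the rest of the model, which is the sense in which such methods are "plug-and-play".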

Sources

Large Language Models Meet Text-Attributed Graphs: A Survey of Integration Frameworks and Applications

Relieving the Over-Aggregating Effect in Graph Transformers

Understanding Network Behaviors through Natural Language Question-Answering

ATOM: AdapTive and OptiMized dynamic temporal knowledge graph construction using LLMs

LightKGG: Simple and Efficient Knowledge Graph Generation from Textual Data

Bridging the Divide: End-to-End Sequence-Graph Learning

Parameter Averaging in Link Prediction

Monitoring Transformative Technological Convergence Through LLM-Extracted Semantic Entity Triple Graphs

Fine-Tuned Language Models for Domain-Specific Summarization and Tagging

Transformers Provably Learn Directed Acyclic Graphs via Kernel-Guided Mutual Information

BambooKG: A Neurobiologically-inspired Frequency-Weight Knowledge Graph

LINK-KG: LLM-Driven Coreference-Resolved Knowledge Graphs for Human Smuggling Networks

Inside CORE-KG: Evaluating Structured Prompting and Coreference Resolution for Knowledge Graphs
