Advances in Knowledge Graph Construction and Large Language Models

Research at the intersection of natural language processing and knowledge graph construction is moving toward more accurate and efficient methods for analyzing complex networks and generating informative responses. Researchers are exploring new ways to improve knowledge graph quality, such as using large language models (LLMs) to resolve coreferences and to extract entities and relationships from unstructured text. There is also growing interest in interactive systems that give users deeper cultural insights and more reliable, domain-appropriate responses. Noteworthy papers include:

  • CORE-KG, which proposes a modular framework for building interpretable knowledge graphs from legal texts, reducing node duplication and legal noise.
  • RiverEcho, which designs a real-time interactive system for ancient Yellow River culture, leveraging Retrieval-Augmented Generation (RAG) to enhance response quality.
  • Weak-to-Strong GraphRAG, which aligns weak retrievers with LLMs for graph-based RAG, improving supervision quality and reducing hallucinations.
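The common thread in the systems above is the retrieval-augmented generation loop: retrieve relevant context, then condition the LLM's answer on it. As a generic illustration (not taken from any of these papers), the sketch below uses a toy token-overlap retriever and assembles a grounded prompt; the corpus, scoring, and prompt template are all illustrative assumptions.

```python
def retrieve(query, corpus, k=2):
    """Rank passages by token overlap with the query (toy stand-in for a real retriever)."""
    q = set(query.lower().split())
    scored = sorted(corpus, key=lambda p: len(q & set(p.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query, passages):
    """Assemble a grounded prompt: retrieved context first, then the question."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only the context below.\nContext:\n{context}\nQuestion: {query}"

# Illustrative corpus; a real system would retrieve from a knowledge graph or document store.
corpus = [
    "The Yellow River basin hosted early Chinese dynasties.",
    "Knowledge graphs link entities via typed relations.",
    "Retrieval quality strongly affects RAG answer accuracy.",
]
passages = retrieve("Yellow River culture history", corpus)
prompt = build_prompt("What is the Yellow River's cultural role?", passages)
```

In a production pipeline the retriever would query an index or graph and the prompt would be sent to an LLM; work like Weak-to-Strong GraphRAG focuses on improving exactly this retrieval step so the generator hallucinates less.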

Sources

CORE-KG: An LLM-Driven Knowledge Graph Construction Framework for Human Smuggling Networks

RiverEcho: Real-Time Interactive Digital System for Ancient Yellow River Culture

The Missing Link: Joint Legal Citation Prediction using Heterogeneous Graph Enrichment

Weak-to-Strong GraphRAG: Aligning Weak Retrievers with Large Language Models for Graph-based Retrieval Augmented Generation

Knowledge Augmented Finetuning Matters in both RAG and Agent Based Dialog Systems
