The field of knowledge graph-based machine learning and reasoning is moving toward tighter integration of large language models (LLMs) with knowledge graphs (KGs) to strengthen reasoning and decision-making. This integration yields models that are more accurate and interpretable, and reasoning processes that are more efficient and context-aware. Notable advances include using LLMs to guide symbolic search and path evaluation in KG question answering, and frameworks that couple joint inference with dynamic KG refinement. Together, these innovations stand to improve KG-based systems across applications such as mathematical reasoning, code generation, and visual graph query building.
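To make the idea of LLM-guided symbolic search concrete, the sketch below shows one way an LLM could score candidate relation paths during KG question answering. It is a minimal, hypothetical illustration under assumed names: the toy graph, the `score_path_with_llm` stub (a keyword heuristic standing in for an actual LLM call), and the beam width are all assumptions, not the interface of any of the systems cited here.

```python
from collections import defaultdict

# Toy KG as an adjacency list: head -> [(relation, tail), ...]
KG = defaultdict(list)
KG["Marie_Curie"] += [("born_in", "Warsaw"), ("field", "Physics")]
KG["Warsaw"] += [("capital_of", "Poland")]

def score_path_with_llm(question: str, path: list[tuple[str, str, str]]) -> float:
    """Hypothetical stand-in for an LLM call that rates how well a
    relation path answers the question (higher is better)."""
    keywords = {"born": "born_in", "country": "capital_of"}
    wanted = {keywords.get(w) for w in question.lower().split()}
    hits = sum(rel in wanted for _, rel, _ in path)
    return hits / max(len(path), 1)

def guided_search(question: str, start: str, max_hops: int = 2):
    """Beam-style expansion where the LLM score, not graph distance,
    decides which partial paths to keep."""
    beams = [([], start)]
    for _ in range(max_hops):
        candidates = []
        for path, node in beams:
            for rel, tail in KG[node]:
                new_path = path + [(node, rel, tail)]
                candidates.append(
                    (score_path_with_llm(question, new_path), new_path, tail))
        if not candidates:
            break
        candidates.sort(key=lambda c: c[0], reverse=True)
        beams = [(p, t) for _, p, t in candidates[:3]]  # keep top-3 paths
    return beams

if __name__ == "__main__":
    for path, answer in guided_search(
            "Which country was Marie Curie born in?", "Marie_Curie"):
        print(answer, path)
```

The design point this sketch captures is that the symbolic component (graph traversal) stays exact and auditable, while the LLM only supplies relevance scores that prune the search space.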
Noteworthy papers include:
- ExeKGLib: a Python library that lets users with minimal ML knowledge build ML pipelines over knowledge graphs.
- Dynamically Adaptive Reasoning via LLM-Guided MCTS: a framework for efficient, context-aware KGQA that uses an LLM to steer Monte Carlo tree search over the graph.
- AGENTiGraph: intuitive interaction with and management of domain-specific data by manipulating knowledge graphs in natural language.
- KG-Augmented Executable CoT: enhances code generation through knowledge graphs and improves mathematical reasoning via executable code (a sketch of the executable-CoT idea follows this list).
- TRAIL: a unified framework for joint inference and dynamic KG refinement with LLMs.
- Difference Views for Visual Graph Query Building: communicates changes between iterative steps of query building using graph differences.
- GRAIL: integrates LLM-guided random exploration with path filtering to establish a data synthesis pipeline for retrieval-augmented reasoning.
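The executable chain-of-thought idea referenced above can be illustrated with a short sketch: the LLM is asked to emit Python rather than free-text reasoning steps, and executing that program yields the answer, so arithmetic is never done "in the model's head." This is a generic illustration under assumed names (`solve_with_executable_cot`, the hard-coded `generated` snippet, the AST whitelist); it is not the KG-Augmented Executable CoT implementation, which would additionally condition generation on facts retrieved from the KG.

```python
import ast

def solve_with_executable_cot(question: str, llm_generated_code: str):
    """Run an LLM-produced 'executable chain of thought': the model emits
    Python instead of prose steps, and execution produces the answer.
    The AST whitelist below is an illustrative safety check only."""
    tree = ast.parse(llm_generated_code)
    # Reject anything beyond simple arithmetic and assignment in this toy example.
    allowed = (ast.Module, ast.Assign, ast.BinOp, ast.Name, ast.Store,
               ast.Load, ast.Constant, ast.Add, ast.Sub, ast.Mult, ast.Div)
    if not all(isinstance(node, allowed) for node in ast.walk(tree)):
        raise ValueError("generated code uses disallowed constructs")
    scope: dict = {}
    exec(compile(tree, "<cot>", "exec"), {"__builtins__": {}}, scope)
    return scope["answer"]

# In practice this code would come from the LLM, conditioned on the question
# (and, in a KG-augmented setting, on retrieved facts); here it is hard-coded.
generated = """
price_per_book = 12
books = 7
answer = price_per_book * books
"""
print(solve_with_executable_cot(
    "A book costs $12. How much do 7 books cost?", generated))  # -> 84
```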