Natural Language Processing (NLP) research is increasingly incorporating temporal reasoning and knowledge graph (KG) evolution to improve large language models. Recent work augments language models with knowledge graphs that evolve over time, keeping the information they draw on accurate and current, and this has driven advances in temporal question answering, semantic parsing, and multi-hop reasoning. Notably, researchers have proposed novel frameworks and algorithms that update knowledge graphs efficiently, apply incremental updates, and capture temporal semantics, innovations that point toward scalable and accurate NLP applications.

Noteworthy papers include:

- Temporal Reasoning with Large Language Models Augmented by Evolving Knowledge Graphs, which proposes a temporal-aware multi-hop reasoning algorithm and a noise-tolerant KG evolution module.
- HSGM: Hierarchical Segment-Graph Memory for Scalable Long-Text Semantics, which introduces a novel framework for semantic parsing of long documents.
- Global-Recent Semantic Reasoning on Dynamic Text-Attributed Graphs with Large Language Models, which proposes a method that combines LLMs with temporal GNNs to reason efficiently over dynamic text-attributed graphs.
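
To make the notion of temporal-aware multi-hop reasoning over an evolving KG concrete, here is a minimal sketch; it is not the algorithm from any of the papers above, and the `Fact` schema, the `temporal_multi_hop` function, and the toy facts are all hypothetical illustrations. The idea it demonstrates is the common core: store facts as time-scoped edges and, at query time, only traverse edges whose validity interval contains the query timestamp.

```python
from dataclasses import dataclass
from collections import deque

@dataclass(frozen=True)
class Fact:
    """A time-scoped KG edge: (head, relation, tail) valid over [start, end)."""
    head: str
    relation: str
    tail: str
    start: int  # e.g., year the fact became true
    end: int    # e.g., year the fact stopped holding (exclusive)

def temporal_multi_hop(facts, start_entity, query_time, max_hops=2):
    """Enumerate reasoning chains from start_entity, up to max_hops long,
    using only facts that are valid at query_time."""
    # Index facts by head entity, keeping only those valid at query_time.
    adj = {}
    for f in facts:
        if f.start <= query_time < f.end:
            adj.setdefault(f.head, []).append(f)

    paths = []  # each path is a list of Facts forming a valid chain
    queue = deque([(start_entity, [])])
    while queue:
        entity, path = queue.popleft()
        if len(path) == max_hops:
            continue
        for f in adj.get(entity, []):
            new_path = path + [f]
            paths.append(new_path)
            queue.append((f.tail, new_path))
    return paths

# Toy query: "Who led the organization that employed Ada in 2021?"
kg = [
    Fact("Ada", "employed_by", "AcmeCorp", 2019, 2023),
    Fact("AcmeCorp", "led_by", "Grace", 2020, 2022),
    Fact("AcmeCorp", "led_by", "Linus", 2022, 2025),
]
for path in temporal_multi_hop(kg, "Ada", query_time=2021):
    print(" -> ".join(f"{f.head} -{f.relation}-> {f.tail}" for f in path))
```

Because the traversal filters on validity intervals, the 2021 query follows the edge to Grace but not the later edge to Linus. A full system would layer on what the papers describe: incremental insertion of new facts (closing the interval of a superseded fact rather than overwriting it) and noise tolerance when new facts conflict with existing ones.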