The fields of time series forecasting and semantic communications are evolving rapidly, with a shared emphasis on approaches that prioritize conveying the meaning of a message over transmitting raw data. Recent research has explored large language models, tokenization techniques, and multimodal learning to improve forecasting accuracy and communication efficiency. In semantic communications, integrating contextual information and task adaptation has driven notable progress in framework design. In forecasting, foundation models have shown promising results, with some studies reporting state-of-the-art performance across diverse benchmarks, and novel tokenization schemes such as pattern-centric tokenization have improved both accuracy and efficiency. Researchers have also emphasized interpretability and explainability, particularly in critical applications like healthcare.

Overall, the field is moving toward more sophisticated and generalizable models that can capture complex patterns and relationships in time series data. Noteworthy papers include TACO, which introduces a semantic communication framework with task adaptation and context embedding, and Logo-LLM, which proposes local and global modeling with large language models for time series forecasting.
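To make the tokenization idea concrete: a common baseline for turning a time series into tokens is patch-based tokenization, where consecutive windows of the series become token vectors for a sequence model. The sketch below is a minimal illustration of that general scheme, not the specific method of any paper mentioned above; the function name and parameters are hypothetical.

```python
import numpy as np

def patch_tokenize(series, patch_len=4, stride=4):
    """Split a 1-D series into fixed-length patches ("tokens").

    Each patch becomes one token vector, so a downstream model
    attends over roughly len(series) // stride tokens instead of
    individual time steps.
    """
    series = np.asarray(series, dtype=float)
    n_patches = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(n_patches)])

# A 12-step series with non-overlapping patches of length 4
# yields 3 tokens of dimension 4.
tokens = patch_tokenize(np.arange(12.0), patch_len=4, stride=4)
```

Pattern-centric schemes typically go further, e.g. by mapping recurring shapes to a shared vocabulary, but the patching step above is the usual starting point.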