Advances in Time Series Forecasting and Semantic Communications

The fields of time series forecasting and semantic communications are evolving rapidly. In semantic communications, the emphasis is shifting toward conveying the meaning of a message rather than merely transmitting raw data, and the integration of contextual information and task adaptation has driven notable progress in recent frameworks. In forecasting, recent work applies large language models, new tokenization techniques, and multimodal learning to improve both predictive accuracy and communication efficiency.

Foundation models for time series have shown particular promise, with several studies reporting state-of-the-art performance across diverse benchmarks. Novel tokenization schemes, such as pattern-centric tokenization, compress raw series into discrete tokens and thereby improve both forecasting accuracy and efficiency; a minimal sketch of this idea appears below. At the same time, researchers stress interpretability and explainability, particularly in critical applications such as healthcare. Overall, the field is moving toward more general models that can capture complex patterns and relationships in time series data.

Noteworthy papers include TACO, which introduces a semantic communication framework built on task adaptation and context embedding, and Logo-LLM, which models local and global temporal structure with large language models for time series forecasting.
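To make the tokenization trend concrete, the sketch below quantizes a real-valued series into discrete symbols and then greedily merges frequent adjacent pairs, byte-pair-encoding style. This is a minimal illustration of the general quantize-then-merge idea, not the specific algorithm of any paper listed under Sources; the function names (`quantize`, `bpe_merge`) and parameter choices are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def quantize(series, n_bins=16):
    """Map a real-valued series to discrete symbols via uniform bins.

    This mirrors the discretization step that token-based forecasters
    typically apply before any pattern merging. Each token starts as a
    1-tuple of a bin id so that merged tokens can grow by concatenation.
    """
    lo, hi = series.min(), series.max()
    width = (hi - lo) or 1.0  # guard against a constant series
    bins = np.clip(((series - lo) / width * n_bins).astype(int), 0, n_bins - 1)
    return [(b,) for b in bins]

def bpe_merge(tokens, n_merges=10):
    """Greedily merge the most frequent adjacent token pair, BPE-style.

    Frequent local patterns (e.g., a rise followed by a plateau) collapse
    into single tokens, shortening the sequence a downstream model sees.
    """
    for _ in range(n_merges):
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        (a, b), count = pairs.most_common(1)[0]
        if count < 2:
            break  # no pair repeats; nothing left to compress
        merged, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
                merged.append(a + b)  # concatenate the bin-id tuples
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens

# Toy usage: a noisy sine wave compresses into far fewer tokens.
t = np.linspace(0, 4 * np.pi, 200)
series = np.sin(t) + 0.05 * np.random.randn(200)
tokens = bpe_merge(quantize(series), n_merges=20)
print(f"200 raw points -> {len(tokens)} tokens")
```

The design point is that recurring local shapes become single vocabulary items, so a language-model-style forecaster operates over shorter, more pattern-aligned sequences rather than over every raw sample.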

Sources

TACO: Rethinking Semantic Communications with Task Adaptation and Context Embedding

Context-Aware Probabilistic Modeling with LLM for Multimodal Time Series Forecasting

ToDMA: Large Model-Driven Token-Domain Multiple Access for Semantic Communications

Logo-LLM: Local and Global Modeling with Large Language Models for Time Series Forecasting

Context parroting: A simple but tough-to-beat baseline for foundation models in scientific machine learning

A Set-Sequence Model for Time Series

Zero-Shot Forecasting Mortality Rates: A Global Study

MSDformer: Multi-scale Discrete Transformer For Time Series Generation

Byte Pair Encoding for Efficient Time Series Forecasting

Interpretable Dual-Stream Learning for Local Wind Hazard Prediction in Vulnerable Communities

Time to Embed: Unlocking Foundation Models for Time Series with Channel Descriptions

Towards a Foundation Model for Communication Systems

This Time is Different: An Observability Perspective on Time Series Foundation Models

Text embedding models can be great data engineers

MoTime: A Dataset Suite for Multimodal Time Series Forecasting

Robust Multi-Modal Forecasting: Integrating Static and Dynamic Features

From Local Patterns to Global Understanding: Cross-Stock Trend Integration for Enhanced Predictive Modeling
