Advances in Time Series Forecasting with Large Language Models

The field of time series forecasting is developing rapidly through the integration of large language models (LLMs), which researchers are applying to improve forecasting accuracy and robustness. One notable direction is the use of encoder-only transformers with non-causal, bidirectional attention, which have achieved state-of-the-art performance on certain tasks. There is also growing interest in multimodal time series forecasting, where LLMs incorporate visual data, such as satellite imagery, into forecasting models. A further line of research develops more efficient and scalable time series foundation models that can be fine-tuned for specific downstream tasks. Noteworthy papers include Output Scaling: YingLong-Delayed Chain of Thought, which presents a joint forecasting framework for time series prediction, and Large Language models for Time Series Analysis: Techniques, Applications, and Challenges, which provides a systematic review of pre-trained LLM-driven time series analysis.
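To make the causal/non-causal distinction concrete, here is a minimal NumPy sketch (an illustration, not code from any of the papers above) contrasting a decoder-style causal mask, where each time step attends only to its past, with the full bidirectional attention used by encoder-only models:

```python
import numpy as np

def attention(q, k, v, mask=None):
    """Scaled dot-product attention over a sequence of embeddings."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # block masked positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v, weights

rng = np.random.default_rng(0)
T, d = 4, 8                       # 4 time steps, 8-dim embeddings
x = rng.normal(size=(T, d))

# Causal (decoder-style): step t attends only to steps <= t.
causal_mask = np.tril(np.ones((T, T), dtype=bool))
_, w_causal = attention(x, x, x, causal_mask)

# Non-causal (encoder-style): every step attends to the full window.
_, w_full = attention(x, x, x)

assert np.allclose(np.triu(w_causal, k=1), 0.0)  # no weight on future steps
assert (w_full > 0).all()                        # every pair of steps interacts
```

The asymmetry is the point: in the causal variant the attention matrix is lower-triangular, while the encoder-only variant lets every position condition on the entire input window, which is what the digest's "non-causal, bidirectional attention" refers to.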

Sources

Output Scaling: YingLong-Delayed Chain of Thought in a Large Pretrained Time Series Forecasting Model

Large Language models for Time Series Analysis: Techniques, Applications, and Challenges

NSW-EPNews: A News-Augmented Benchmark for Electricity Price Forecasting with LLMs

Can Time-Series Foundation Models Perform Building Energy Management Tasks?

Prioritizing Alignment Paradigms over Task-Specific Model Customization in Time-Series LLMs

FAA Framework: A Large Language Model-Based Approach for Credit Card Fraud Investigations

HAELT: A Hybrid Attentive Ensemble Learning Transformer Framework for High-Frequency Stock Price Forecasting

Extracting transient Koopman modes from short-term weather simulations with sparsity-promoting dynamic mode decomposition

Multi-Scale Finetuning for Encoder-based Time Series Foundation Models

SKOLR: Structured Koopman Operator Linear RNN for Time-Series Forecasting

PIPE: Physics-Informed Position Encoding for Alignment of Satellite Images and Time Series

ss-Mamba: Semantic-Spline Selective State-Space Model

Explain First, Trust Later: LLM-Augmented Explanations for Graph-Based Crypto Anomaly Detection

Advanced Prediction of Hypersonic Missile Trajectories with CNN-LSTM-GRU Architectures

Conditional Generative Modeling for Enhanced Credit Risk Management in Supply Chain Finance
