Temporal Information Modeling and Generative Time Series

Temporal information modeling is advancing rapidly, driven by the need to capture complex time patterns and incorporate them into a wide range of applications. One key direction is the development of learnable transformation functions that model generalized time patterns, allowing time encoding to be integrated seamlessly into diverse tasks. Another focus is the generation of high-quality time-series data through novel augmentation methods, which is crucial for training deep neural networks. There is also growing interest in temporal interaction graph representation learning, which embeds the nodes of a temporal interaction graph into low-dimensional representations that preserve both structural and temporal information. Finally, generative models for long time series are being developed that combine variational autoencoders with recurrent layers to model long-range temporal dependencies efficiently. Noteworthy papers in this area include:

  • A paper that proposes a learnable transformation-based generalized time encoding method, enabling the modeling of complex temporal dynamics.
  • A paper that introduces a time-series data augmentation model that integrates diffusion and transformer models, demonstrating its effectiveness in generating high-quality augmented data.
  • A survey paper that provides a comprehensive taxonomy of state-of-the-art temporal interaction graph representation learning methods and explores promising research directions.
  • A paper that presents a generative model for long time series based on a variational autoencoder with recurrent layers, showing its ability to match or outperform state-of-the-art models on several benchmark datasets.
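As a concrete illustration of the first direction, the sketch below implements a Time2Vec-style learnable time encoding: a linear term plus sinusoidal terms whose frequencies and phases are learnable parameters. The class name, dimensions, and random initialization are illustrative assumptions for this sketch, not details taken from the cited paper.

```python
import numpy as np

class LearnableTimeEncoding:
    """Time2Vec-style encoding: one linear term plus sinusoidal terms
    with learnable frequencies and phases. Here the parameters are
    randomly initialized; in practice they would be trained end to end."""

    def __init__(self, dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.omega = rng.normal(size=dim)  # learnable frequencies
        self.phi = rng.normal(size=dim)    # learnable phases

    def __call__(self, t: np.ndarray) -> np.ndarray:
        # t: (batch,) timestamps -> (batch, dim) embeddings
        z = np.outer(t, self.omega) + self.phi
        z[:, 1:] = np.sin(z[:, 1:])        # index 0 stays linear
        return z

enc = LearnableTimeEncoding(dim=8)
emb = enc(np.array([0.0, 1.5, 3.0]))
print(emb.shape)  # (3, 8)
```

Because the frequencies and phases are parameters rather than fixed constants, the encoding can adapt to periodicities present in the data, which is what allows such transformations to capture generalized time patterns.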
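The last direction can likewise be sketched: a toy variational-autoencoder forward pass in which a simple recurrent encoder summarizes a long series into a Gaussian posterior, followed by the reparameterization trick and the closed-form KL term of the ELBO. All weights and sizes are illustrative assumptions; this does not reproduce the cited paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn(x, Wx, Wh):
    # minimal tanh RNN over a sequence x: (T, d_in) -> final hidden state
    h = np.zeros(Wh.shape[0])
    for x_t in x:
        h = np.tanh(Wx @ x_t + Wh @ h)
    return h

T, d_in, d_h, d_z = 50, 1, 16, 4
x = np.sin(np.linspace(0, 6, T))[:, None]          # toy long series

# encoder: recurrent summary -> Gaussian posterior q(z|x)
Wx = rng.normal(size=(d_h, d_in)) * 0.3
Wh = rng.normal(size=(d_h, d_h)) * 0.3
W_mu = rng.normal(size=(d_z, d_h)) * 0.3
W_lv = rng.normal(size=(d_z, d_h)) * 0.3
h = rnn(x, Wx, Wh)
mu, logvar = W_mu @ h, W_lv @ h

# reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
z = mu + np.exp(0.5 * logvar) * rng.normal(size=d_z)

# KL(q(z|x) || N(0, I)) term of the ELBO, in closed form
kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)
print(z.shape, float(kl))
```

A decoder (typically another recurrent network) would map the sampled z back to a full-length sequence; compressing the series into a low-dimensional latent is what makes long-range dependencies tractable to model.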

Sources

Rethinking Time Encoding via Learnable Transformation Functions

A Time-Series Data Augmentation Model through Diffusion and Transformer Integration

A Survey on Temporal Interaction Graph Representation Learning: Progress, Challenges, and Opportunities

Generative Models for Long Time Series: Approximately Equivariant Recurrent Network Structures for an Adjusted Training Scheme

A Neuro-Symbolic Framework for Sequence Classification with Relational and Temporal Knowledge
