Advances in Time Series Analysis and Causal Discovery

The field of time series analysis and causal discovery is evolving rapidly, with a focus on methods that capture complex nonlinear dependencies while guarding against spurious correlations. Recent research has explored Transformer-based architectures, such as multi-layer time-series forecasters and attention-inspired routed Mixture-of-Experts, to improve forecasting accuracy and efficiency. There is also growing interest in integrating prior knowledge into temporal causal discovery and in bringing causal reasoning to beam management pipelines to enhance reliability and interpretability. Noteworthy papers include Transforming Causality, which introduces a Transformer-based framework for temporal causal discovery and inference with prior knowledge integration, and Causal Beam Selection, which proposes a causally aware deep learning framework for beam management. Other notable works, such as GateTS and FreezeTST, demonstrate the potential of simplified training procedures and parameter-efficient designs for univariate time series forecasting. Open-source libraries such as WHAR Datasets and pyFAST are also making research in this field more efficient and reproducible. Together, these advances point toward more accurate, reliable, and interpretable methods for time series analysis and causal discovery.
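To make the routed Mixture-of-Experts idea concrete, here is a minimal, self-contained sketch of sparse top-1 routing with attention-style gating. This is an illustrative toy, not the GateTS implementation: the expert count, dimensions, and the use of dot-product scores against per-expert key vectors are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class SparseMoE:
    """Toy sparse Mixture-of-Experts with attention-inspired routing.

    Each expert is a linear map. The router scores every input against
    learned per-expert "key" vectors (dot-product attention) and sends
    each sample to its single top-scoring expert, so only one expert
    runs per sample.
    """

    def __init__(self, d_in, d_out, n_experts):
        self.keys = rng.normal(size=(n_experts, d_in))          # router keys
        self.experts = rng.normal(size=(n_experts, d_in, d_out))  # expert weights

    def __call__(self, x):
        scores = softmax(x @ self.keys.T)   # (batch, n_experts) gate probabilities
        top = scores.argmax(axis=-1)        # top-1 routing decision per sample
        out = np.stack([x[i] @ self.experts[top[i]] for i in range(len(x))])
        return out, top

moe = SparseMoE(d_in=8, d_out=4, n_experts=3)
x = rng.normal(size=(5, 8))
y, routed = moe(x)
print(y.shape, routed.shape)  # (5, 4) (5,)
```

In a trained model the router keys and experts would be learned jointly, and the sparse top-1 dispatch is what keeps inference cost near that of a single expert.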

Sources

Transforming Causality: Transformer-Based Temporal Causal Discovery with Prior Knowledge Integration

Causal Beam Selection for Reliable Initial Access in AI-driven Beam Management

WHAR Datasets: An Open Source Library for Wearable Human Activity Recognition

Enhancing Transformer-Based Foundation Models for Time Series Forecasting via Bagging, Boosting and Statistical Ensembles

GateTS: Versatile and Efficient Forecasting via Attention-Inspired routed Mixture-of-Experts

Frozen in Time: Parameter-Efficient Time Series Transformers via Reservoir-Induced Feature Expansion and Fixed Random Dynamics

HypER: Hyperbolic Echo State Networks for Capturing Stretch-and-Fold Dynamics in Chaotic Flows

MOCHA: Discovering Multi-Order Dynamic Causality in Temporal Point Processes

pyFAST: A Modular PyTorch Framework for Time Series Modeling with Multi-source and Sparse Data

HierCVAE: Hierarchical Attention-Driven Conditional Variational Autoencoders for Multi-Scale Temporal Modeling

FinCast: A Foundation Model for Financial Time-Series Forecasting

Reverse Designing Ferroelectric Capacitors with Machine Learning-based Compact Modeling

Compositionality in Time Series: A Proof of Concept using Symbolic Dynamics and Compositional Data Augmentation

GPT-FT: An Efficient Automated Feature Transformation Using GPT for Sequence Reconstruction and Performance Enhancement

ATM-GAD: Adaptive Temporal Motif Graph Anomaly Detection for Financial Transaction Networks
