The field of time series analysis and forecasting is moving toward more robust and adaptive methods, with a focus on handling high-dimensional settings, noise, and non-stationarity. Researchers are exploring new architectures and techniques, such as implicit neural representations, continuous-time signal decomposition, and filter-equivariant functions, to improve forecasting accuracy and scalability. These innovations could advance applications in areas such as conformal prediction, mobile health interventions, and astrophysical research. Noteworthy papers include:
- Adaptive Nonlinear Vector Autoregression, which proposes an adaptive model that combines delay-embedded linear inputs with features from a shallow, learnable multi-layer perceptron, improving predictive accuracy and robustness under noisy conditions.
- Foundation models for time series forecasting, which demonstrates the potential of foundation models to improve the reliability of conformal prediction, particularly in data-constrained settings.
- NeuTSFlow, which introduces a framework that uses neural operators to perform flow matching in function space, learning the transition path from historical to future function families, and shows superior accuracy and robustness on forecasting tasks.
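To make the first idea concrete: the sketch below illustrates the general recipe behind adaptive nonlinear vector autoregression — a delay embedding of the recent past fed both directly (linear inputs) and through a small trainable MLP (nonlinear features) into a linear one-step-ahead readout. This is a minimal toy reconstruction on a noisy sine wave, trained with plain gradient descent; all hyperparameters and names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def delay_embed(x, k):
    """Row t holds the k most recent values: [x[t], x[t-1], ..., x[t-k+1]]."""
    return np.stack([x[k - 1 - i: len(x) - i] for i in range(k)], axis=1)

rng = np.random.default_rng(0)
n, k, h, lr, steps = 2000, 8, 16, 0.05, 3000  # illustrative hyperparameters

# Toy target: a noisy sine wave, predicted one step ahead.
t = np.linspace(0, 20 * np.pi, n)
x = np.sin(t) + 0.05 * rng.standard_normal(n)

Z = delay_embed(x[:-1], k)   # (n - k, k) delay-embedded linear inputs
y = x[k:]                    # one-step-ahead targets

# Shallow MLP (one tanh layer) plus a linear readout over the
# concatenation [linear delays, MLP features].
W1 = 0.1 * rng.standard_normal((k, h))
b1 = np.zeros(h)
w = 0.1 * rng.standard_normal(k + h)
b = 0.0

for _ in range(steps):
    H = np.tanh(Z @ W1 + b1)                        # nonlinear features
    F = np.concatenate([Z, H], axis=1)              # linear + nonlinear
    err = F @ w + b - y                             # residuals
    dH = (err[:, None] * w[k:]) * (1.0 - H ** 2)    # backprop through tanh
    w -= lr * F.T @ err / len(y)
    b -= lr * err.mean()
    W1 -= lr * Z.T @ dH / len(y)
    b1 -= lr * dH.mean(axis=0)

H = np.tanh(Z @ W1 + b1)
pred = np.concatenate([Z, H], axis=1) @ w + b
mse = np.mean((pred - y) ** 2)
print(f"one-step MSE: {mse:.4f}")  # small relative to the series variance (~0.5)
```

The linear delay inputs alone already capture most of a near-periodic signal; the learnable MLP features are what let the model adapt to nonlinear structure without the combinatorial blow-up of fixed polynomial NVAR features.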