Time Series Forecasting: Unlocking New Frontiers with Multimodal AI and Quantum-Inspired Models
Latest 50 papers on time series forecasting: Sep. 8, 2025
Time series forecasting is the bedrock of decision-making across countless domains, from predicting stock prices and energy demand to understanding complex system behaviors in microservices. As the world generates data at an unprecedented rate, the challenges of accuracy, robustness, and interpretability in time series predictions only grow. Fortunately, recent breakthroughs in AI and ML are pushing the boundaries, blending sophisticated neural architectures, multimodal data fusion, and even quantum-inspired approaches.
The Big Idea(s) & Core Innovations
Many recent efforts center on enhancing existing models, leveraging the power of Large Language Models (LLMs), and developing novel architectures to handle the inherent complexities of time series data. A key trend is the integration of diverse data modalities (text, vision, numerical) to provide richer context for forecasting. For instance, CC-Time: Cross-Model and Cross-Modality Time Series Forecasting, from East China Normal University and Huawei Noah’s Ark Lab, leverages pre-trained language models (PLMs) and a cross-model fusion block to capture temporal dependencies and channel correlations from both time series and their text descriptions. Similarly, UniCast: A Unified Multimodal Prompting Framework for Time Series Forecasting, by researchers from Pohang University of Science and Technology and The University of Melbourne, introduces a parameter-efficient multimodal soft prompt tuning strategy that extends time series foundation models (TSFMs) to incorporate vision and text signals, significantly boosting accuracy.
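To make the soft-prompting idea concrete, here is a minimal, hypothetical sketch of parameter-efficient prompt tuning for a frozen time series backbone: only a handful of learnable prompt tokens and a small forecasting head are trained while the pre-trained model stays frozen. The class name, the backbone interface, and all dimensions are illustrative assumptions, not UniCast's actual API.

```python
import torch
import torch.nn as nn

class SoftPromptedForecaster(nn.Module):
    """Illustrative sketch: prepend learnable prompt tokens to a frozen
    time series foundation model and train only the prompts and the head."""

    def __init__(self, backbone: nn.Module, d_model: int, n_prompts: int = 8,
                 horizon: int = 96):
        super().__init__()
        self.backbone = backbone                    # pre-trained TSFM, kept frozen
        for p in self.backbone.parameters():
            p.requires_grad = False
        # learnable soft prompts; in a multimodal setup these could be
        # initialised from text or vision embeddings
        self.prompts = nn.Parameter(torch.randn(n_prompts, d_model) * 0.02)
        self.head = nn.Linear(d_model, horizon)     # small trainable forecast head

    def forward(self, ts_tokens: torch.Tensor) -> torch.Tensor:
        # ts_tokens: (batch, seq_len, d_model) patch/token embeddings of the series
        b = ts_tokens.size(0)
        prompts = self.prompts.unsqueeze(0).expand(b, -1, -1)
        x = torch.cat([prompts, ts_tokens], dim=1)  # prompts prepended to the sequence
        h = self.backbone(x)                        # assumed to return (batch, seq, d_model)
        return self.head(h[:, -1])                  # forecast from the last token state
```

Only `self.prompts` and `self.head` receive gradients, which is what makes this style of tuning parameter-efficient.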
Further exploring multimodal integration, VisionTS++: Cross-Modal Time Series Foundation Model with Continual Pre-trained Visual Backbones, from Zhejiang University, introduces a colorized multivariate conversion method that transforms multivariate time series into multi-subfigure RGB images, enabling visual backbones to model complex inter-variate dependencies. The approach also allows flexible approximation of arbitrary output distributions and reduces MSE by up to 44% compared with specialized models. Another notable contribution is T3Time: Tri-Modal Time Series Forecasting via Adaptive Multi-Head Alignment and Residual Fusion, by Abdul Monaf Chowdhury et al. from the University of Dhaka, which integrates temporal, spectral, and prompt-based representations through adaptive multi-head alignment and shows strong generalization even in few-shot learning scenarios.
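As a rough illustration of the series-to-image idea (not VisionTS++'s exact colorization pipeline), the sketch below renders each variate of a multivariate series as its own subfigure in a single image, which a visual backbone could then consume. The normalization and layout choices here are simplifying assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

def series_to_image(ts: np.ndarray, path: str = "series.png") -> None:
    """Render a multivariate series of shape (n_vars, length) as a grid of
    subplots, one subfigure per variate. Illustrative only."""
    n_vars = ts.shape[0]
    n_cols = int(np.ceil(np.sqrt(n_vars)))
    n_rows = int(np.ceil(n_vars / n_cols))
    fig, axes = plt.subplots(n_rows, n_cols, figsize=(2 * n_cols, 2 * n_rows))
    axes = np.atleast_1d(axes).ravel()
    for i in range(n_vars):
        x = ts[i]
        x = (x - x.min()) / (x.max() - x.min() + 1e-8)  # per-variate min-max scaling
        axes[i].plot(x, linewidth=1.0)
        axes[i].axis("off")
    for ax in axes[n_vars:]:                             # hide unused grid cells
        ax.axis("off")
    fig.savefig(path, dpi=100)
    plt.close(fig)

series_to_image(np.random.randn(7, 336))
```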
The challenge of non-stationary data and real-time adaptation is tackled by approaches such as Online time series prediction using feature adjustment, by Xiannan Huang et al. from Tongji University, which proposes ADAPT-Z, an online adaptation method that uses current features and historical gradients to cope with delayed feedback. The method updates latent factor representations rather than model parameters, which proves more effective under distribution shift. In a similar vein, Adaptive Fine-Tuning via Pattern Specialization for Deep Time Series Forecasting, by Saadallah and Al-Ademi, introduces a dynamic framework that selects models based on the current temporal pattern and integrates concept drift detection for robustness against non-stationary behavior.
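The sketch below shows the general pattern of gradient-based online feature adjustment: an additive offset on the model's features is updated from (delayed) ground truth while the model weights stay frozen. The function name, the momentum-style gradient averaging, and the way delayed feedback is consumed are simplifying assumptions, not ADAPT-Z's actual algorithm.

```python
import torch
import torch.nn.functional as F

def online_feature_adjustment(model, z, feature_batches, target_batches,
                              lr: float = 1e-2, momentum: float = 0.9):
    """Illustrative online adaptation: keep `model` frozen and adapt a small
    additive feature offset `z` as delayed ground truth arrives."""
    z = z.clone().detach().requires_grad_(True)
    grad_avg = torch.zeros_like(z)                 # running average of past gradients
    for feats, target in zip(feature_batches, target_batches):
        pred = model(feats + z)                    # adjust features, not parameters
        loss = F.mse_loss(pred, target)
        (grad,) = torch.autograd.grad(loss, z)
        grad_avg = momentum * grad_avg + (1 - momentum) * grad
        z = (z - lr * grad_avg).detach().requires_grad_(True)
    return z
```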
Probabilistic forecasting and uncertainty quantification are also receiving significant attention. RDIT: Residual-based Diffusion Implicit Models for Probabilistic Time Series Forecasting, from MIT and Harvard University, decouples point estimation from residual modeling using diffusion processes with bidirectional Mamba networks, achieving state-of-the-art performance by minimizing CRPS while keeping predictive coverage well calibrated. Another work, Probabilistic QoS Metric Forecasting in Delay-Tolerant Networks Using Conditional Diffusion Models on Latent Dynamics, by John Trunix et al., leverages conditional diffusion models over latent dynamics to forecast QoS metrics in Delay-Tolerant Networks, yielding more robust probabilistic forecasts than traditional mean regression. Furthermore, PriceFM: Foundation Model for Probabilistic Electricity Price Forecasting, from Delft University of Technology and the Austrian Institute of Technology, introduces a spatiotemporal foundation model with graph-based inductive biases to produce probabilistic electricity price forecasts across 24 European countries.
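Since several of these works evaluate with the Continuous Ranked Probability Score (CRPS), here is a small, self-contained sketch of the standard sample-based CRPS estimator, E|X - y| - 0.5 E|X - X'|, where X and X' are independent draws from the predictive distribution. The example data are synthetic and purely illustrative.

```python
import numpy as np

def crps_from_samples(samples: np.ndarray, y: float) -> float:
    """Sample-based CRPS estimate for one target value `y`:
    CRPS ~= E|X - y| - 0.5 * E|X - X'|."""
    term1 = np.mean(np.abs(samples - y))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

# e.g. 200 samples from a forecaster's predictive distribution for one step
samples = np.random.normal(loc=1.0, scale=0.5, size=200)
print(crps_from_samples(samples, y=1.2))
```

Lower CRPS rewards predictive distributions that are both sharp and well centered on the realized value, which is why it is a common target for probabilistic forecasters.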
Addressing the critical issues of privacy and security, Privacy Risks in Time Series Forecasting: User- and Record-Level Membership Inference, by researchers from NIST and UC Berkeley, demonstrates that time series models are vulnerable to membership inference attacks and highlights the need for careful design to prevent data leakage. Complementing this, Q-DPTS: Quantum Differentially Private Time Series Forecasting via Variational Quantum Circuits proposes a hybrid quantum-classical framework for differentially private time series forecasting, showcasing the feasibility of combining quantum computing with privacy mechanisms.
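For intuition about what a membership inference attack tests, here is a toy loss-threshold variant: a record is guessed to be in the training set when the model's loss on it is unusually low. This is a generic illustration of the attack family, not the paper's user- or record-level attacks, and the threshold and synthetic loss distributions are assumptions.

```python
import numpy as np

def loss_threshold_mia(member_losses: np.ndarray,
                       nonmember_losses: np.ndarray,
                       threshold: float) -> float:
    """Toy membership inference: predict 'member' when the forecasting loss on
    a record falls below `threshold`, and report the attack's accuracy."""
    preds_members = member_losses < threshold        # low loss -> likely seen in training
    preds_nonmembers = nonmember_losses >= threshold
    correct = preds_members.sum() + preds_nonmembers.sum()
    return correct / (len(member_losses) + len(nonmember_losses))

rng = np.random.default_rng(0)
members = rng.normal(0.05, 0.02, size=500)           # training records tend to have lower loss
nonmembers = rng.normal(0.10, 0.03, size=500)
print(loss_threshold_mia(members, nonmembers, threshold=0.075))
```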
Under the Hood: Models, Datasets, & Benchmarks
Recent research has introduced or heavily utilized several innovative models, datasets, and benchmarking tools to drive advancements:
- CHRONOGRAPH Dataset: ChronoGraph: A Real-World Graph-Based Multivariate Time Series Dataset by Adrian Catalin Lutu et al. (Bitdefender, University of Bucharest) offers the first graph-based multivariate time series dataset from real production microservices, complete with expert-annotated incident windows for anomaly detection. Code is publicly available.
- ADAPT-Z: Introduced in Online time series prediction using feature adjustment (Tongji University), ADAPT-Z is an online adaptation method for multi-step forecasting that leverages current features and historical gradients. Public code is available at https://github.com/xiannanhuang/ADAPT-Z.
- RDIT Framework: RDIT: Residual-based Diffusion Implicit Models for Probabilistic Time Series Forecasting (MIT, Harvard University) combines diffusion processes with bidirectional Mamba networks for probabilistic forecasting. Code is available at https://anonymous.4open.science/r/RDIT-16BB/.
- TeR-TSF Framework: Text Reinforcement for Multimodal Time Series Forecasting (SynLP Research Group, Tsinghua University) is an RL-driven data augmentation framework for multimodal time series forecasting. Code is available at https://github.com/synlp/TeR-TSF.
- BALM-TSF: BALM-TSF: Balanced Multimodal Alignment for LLM-Based Time Series Forecasting (University of Birmingham, Siemens AG) is a lightweight dual-branch framework for LLM-based time series forecasting. Code is available at https://github.com/ShiqiaoZhou/BALM-TSF.
- Quantum-Optimized Selective State Space Model: Quantum-Optimized Selective State Space Model for Efficient Time Series Prediction (University of California, Berkeley) integrates quantum optimization into classical time series forecasting. Code is available at https://github.com/stephanjura27/quantum.
- iTFKAN: iTFKAN: Interpretable Time Series Forecasting with Kolmogorov-Arnold Network (Hong Kong Polytechnic University) provides an interpretable framework using KANs for time series forecasting. Code is available at https://github.com/ziyangliang/iTFKAN.
- DeepEDM: LETS Forecast: Learning Embedology for Time Series Forecasting (University of Wisconsin-Madison) combines empirical dynamic modeling with deep learning. Code: https://abrarmajeedi.github.io/deep_edm.
- FlowState: FlowState: Sampling Rate Invariant Time Series Forecasting (IBM Research Europe, ETH Zurich) is an SSM-based TSFM that dynamically adjusts to varying sampling rates. Code: https://github.com/IBMResearchZurich/FlowState.
- PriceFM Dataset and Model: PriceFM: Foundation Model for Probabilistic Electricity Price Forecasting (Delft University of Technology, Austrian Institute of Technology) introduces the largest open dataset for European electricity markets and a spatiotemporal foundation model. Code: https://github.com/runyao-yu/PriceFM.
- DLTransformer: Distributed Lag Transformer based on Time-Variable-Aware Learning for Explainable Multivariate Time Series Forecasting introduces an explainable model for multivariate forecasting. Code: https://github.com/kYounghwi/DLFormer_official.
- TLCCSP: TLCCSP: A Scalable Framework for Enhancing Time Series Forecasting with Time-Lagged Cross-Correlations (Beijing Normal University) incorporates time-lagged cross-correlations for improved accuracy; a toy computation of such correlations is sketched after this list. Paper: https://arxiv.org/abs/2412.10104.
- QuiZSF: QuiZSF: An efficient data-model interaction framework for zero-shot time-series forecasting (University of Science and Technology of China) is a retrieval-augmented generation framework for zero-shot forecasting. Code is available.
- TFB Benchmark: TFB: Towards Comprehensive and Fair Benchmarking of Time Series Forecasting Methods (East China Normal University, Huawei, Aalborg University) provides an automated benchmark to address shortcomings in existing TSF evaluations. Code: https://github.com/decisionintelligence/TFB.
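As referenced in the TLCCSP entry above, the sketch below shows one common way to compute time-lagged cross-correlations between two series: slide one series against the other and record the Pearson correlation at each lag, then pick the lag with the largest absolute correlation. This is a generic, assumed formulation for illustration, not TLCCSP's framework.

```python
import numpy as np

def time_lagged_cross_correlation(x: np.ndarray, y: np.ndarray, max_lag: int) -> dict:
    """Pearson correlation between x and a shifted copy of y for each lag in
    [-max_lag, max_lag]; the best lag indicates how one series leads the other."""
    results = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]
        else:
            a, b = x, y
        results[lag] = np.corrcoef(a, b)[0, 1]
    return results

corrs = time_lagged_cross_correlation(np.sin(np.arange(100) / 5.0),
                                      np.sin((np.arange(100) - 3) / 5.0),
                                      max_lag=10)
print(max(corrs, key=lambda k: abs(corrs[k])))  # lag at which the two series align best
```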
Impact & The Road Ahead
These advancements herald a new era for time series forecasting, offering more accurate, robust, and interpretable models for critical applications. The move towards multimodal foundation models marks a significant paradigm shift, allowing AI systems to reason with diverse information sources—numerical, textual, and visual—just as humans do. This will empower better predictions in complex, real-world scenarios like financial markets, smart cities, and healthcare, where context is king. The emphasis on explainable AI (XAI), exemplified by models like iTFKAN and PAX-TS, will foster trust and facilitate adoption in high-stakes domains, enabling domain experts to understand why a forecast was made.
Furthermore, the exploration of quantum-inspired techniques and differential privacy in time series forecasting, as seen in Q-DPTS and QTFT, hints at a future where powerful predictions are not only accurate but also inherently secure and privacy-preserving. The development of robust benchmarking frameworks like TFB is crucial to ensure that these new methods are evaluated fairly and comprehensively, preventing biases and accelerating genuine progress.
Looking forward, the integration of LLM-agents for data refinement, as shown by DCATS, suggests a future where autonomous AI systems can not only forecast but also actively improve the data they learn from. This could lead to more self-optimizing and resilient forecasting systems. While challenges remain in balancing performance with computational efficiency and interpretability, the collective progress outlined in these papers paints a vivid picture of a future where time series forecasting is more intelligent, adaptive, and trustworthy than ever before. The journey to unlock the full potential of temporal data continues with exhilarating momentum!