Transformer-Modulated Diffusion Models for Probabilistic Multivariate Time Series Forecasting
Published in ICLR, 2024
Transformers have gained widespread use in multivariate time series (MTS) forecasting, delivering impressive performance. Nonetheless, existing transformer-based methods often neglect an essential aspect: incorporating uncertainty into the predicted series, which holds significant value in decision-making. In this paper, we introduce the Transformer-Modulated Diffusion Model (TMDM), which unites a conditional diffusion generative process with transformers in a unified framework to enable precise distribution forecasting for MTS. TMDM harnesses the power of transformers to extract essential insights from historical time series data. This information is then utilized as prior knowledge, capturing covariate dependence in both the forward and reverse processes within the diffusion model. Furthermore, we seamlessly integrate well-designed transformer-based forecasting methods into TMDM to enhance its overall performance. Additionally, we introduce two novel metrics for evaluating uncertainty estimation performance. Through extensive experiments on six datasets using four evaluation metrics, we establish the effectiveness of TMDM in probabilistic MTS forecasting.
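To illustrate the idea of using a transformer's point forecast as prior knowledge in the forward diffusion process, here is a minimal sketch of one conditional forward-diffusion step. The parameterization shown (noisy samples pulled toward the forecast `y_hat` rather than toward zero-mean noise) follows the style of earlier conditional diffusion work; the function name and the exact noise schedule are illustrative assumptions, and the paper's precise formulation may differ.

```python
import numpy as np

def conditional_forward_diffuse(y0, y_hat, alpha_bar_t, rng):
    """Draw y_t ~ q(y_t | y_0, y_hat) for one diffusion timestep.

    Illustrative conditional forward process:
        q(y_t | y_0, y_hat) = N( sqrt(ab)*y_0 + (1 - sqrt(ab))*y_hat,
                                 (1 - ab) * I )
    where ab = alpha_bar_t (cumulative noise-schedule product) and
    y_hat is the transformer's forecast acting as the prior mean.
    As alpha_bar_t -> 1 the sample stays near the true target y_0;
    as alpha_bar_t -> 0 it is centered on the forecast y_hat.
    """
    sqrt_ab = np.sqrt(alpha_bar_t)
    mean = sqrt_ab * y0 + (1.0 - sqrt_ab) * y_hat
    noise = rng.standard_normal(y0.shape)
    return mean + np.sqrt(1.0 - alpha_bar_t) * noise

# Toy usage: 8 timesteps, 3 variables.
rng = np.random.default_rng(0)
y0 = np.ones((8, 3))       # ground-truth future window
y_hat = np.zeros((8, 3))   # hypothetical transformer forecast
y_t = conditional_forward_diffuse(y0, y_hat, alpha_bar_t=0.5, rng=rng)
```

The key design point this sketch conveys is that the diffusion endpoint is anchored to the covariate-dependent forecast rather than to an uninformative Gaussian, so the reverse process only needs to model the residual uncertainty around the transformer's prediction.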
Recommended citation: Li, Y., Chen, W., Hu, X., Chen, B., & Zhou, M. (2024). Transformer-Modulated Diffusion Models for Probabilistic Multivariate Time Series Forecasting. In The Twelfth International Conference on Learning Representations. https://openreview.net/pdf?id=qae04YACHs