Multiplicative Attention Mechanism for Multi-horizon Time Series Forecasting

2021 
Multi-horizon time series forecasting plays an important role in many industrial and business decision processes. Capturing the complex and varied patterns across different time series is the crucial step toward achieving strong performance. However, most deep learning-based forecasting approaches simply take series-specific static (i.e., time-invariant) covariates as input features, which can fail to capture the pattern variation of each individual time series. In this paper, we propose a novel multiplicative attention-based architecture to tackle this forecasting problem. Our modification to the multi-head attention layers leverages the series-specific covariates to build a flexible attention function for each time series. This improvement provides greater representational capacity to capture the differing patterns across related time series. Experimental results demonstrate that our approach achieves state-of-the-art performance on a variety of real-world datasets.
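The abstract describes the mechanism only at a high level, so the following is a minimal sketch of one plausible reading: a multi-head attention layer whose query and key projections are modulated multiplicatively by an embedding of the series-specific static covariates, making the attention function depend on which series is being forecast. All names (e.g., `MultiplicativeAttention`, `static`, `q_gate`) and shapes are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): multi-head attention whose
# query/key projections are gated multiplicatively by series-specific
# static covariates.
import torch
import torch.nn as nn


class MultiplicativeAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, d_static: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Standard projections shared across all series.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        # Series-specific multiplicative gates derived from static covariates.
        self.q_gate = nn.Linear(d_static, d_model)
        self.k_gate = nn.Linear(d_static, d_model)

    def forward(self, x, static):
        # x:      (batch, seq_len, d_model)  temporal inputs
        # static: (batch, d_static)          time-invariant covariates per series
        b, t, _ = x.shape
        # Element-wise (multiplicative) modulation of queries and keys makes
        # the attention weights depend on the series' static covariates.
        q = self.q_proj(x) * torch.sigmoid(self.q_gate(static)).unsqueeze(1)
        k = self.k_proj(x) * torch.sigmoid(self.k_gate(static)).unsqueeze(1)
        v = self.v_proj(x)

        def split(z):  # (b, t, d_model) -> (b, heads, t, d_head)
            return z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(q), split(k), split(v)
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out_proj(out)
```

Under this reading, each series effectively gets its own attention function while the bulk of the parameters remain shared across series.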