Memory-based Transformer with shorter window and longer horizon for multivariate time series forecasting

2022 
Multivariate time series forecasting is an important problem that spans many fields. One challenge is the complex, non-linear interdependence among time steps and among different variables. Recent studies have shown that the Transformer has potential for capturing long-term dependencies. However, in time series forecasting the Transformer still has problems to solve, such as prediction fragmentation and insensitivity to data scale. In addition, traditional forecasting models often require a large amount of input data to support training when predicting over a long horizon, yet it can be hard to provide sufficiently long input sequences because of equipment damage or weather conditions. To address these limitations, a memory-based Transformer with a shorter window and a longer horizon, called SWLHT, is proposed. It uses a memory mechanism so that the model no longer relies on a single input alone but can combine previous forecast results to help capture long-term dependencies, thereby avoiding the need for excessively long input sequences. The memory mechanism can also alleviate prediction fragmentation to some extent. Experimental results and comparisons against baselines on several real-world multivariate time series datasets verify the effectiveness of the proposed model.
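The abstract does not give architectural details, so the following is only a minimal illustrative sketch of the general idea of combining a short input window with a memory of previous forecasts in a Transformer, not the authors' SWLHT implementation. All layer sizes, the memory length, and the memory-update rule below are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn


class MemoryForecaster(nn.Module):
    """Sketch: Transformer encoder fed a short window plus a memory of its own past forecasts."""

    def __init__(self, n_vars: int, d_model: int = 64, window: int = 24,
                 horizon: int = 48, mem_len: int = 48):
        super().__init__()
        self.horizon = horizon
        self.mem_len = mem_len
        self.embed = nn.Linear(n_vars, d_model)            # project variables to model dimension
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_vars)             # per-step projection back to variables
        # rolling memory of previously forecast steps (batch dimension expanded at use time)
        self.register_buffer("memory", torch.zeros(1, mem_len, n_vars))

    def forward(self, window_x: torch.Tensor) -> torch.Tensor:
        # window_x: (batch, window, n_vars) -- a deliberately short input window
        batch = window_x.size(0)
        mem = self.memory.expand(batch, -1, -1)            # reuse earlier forecasts as extra context
        tokens = torch.cat([mem, window_x], dim=1)         # memory tokens + fresh observations
        hidden = self.encoder(self.embed(tokens))
        # take the last `horizon` hidden states as the multi-step forecast
        forecast = self.head(hidden[:, -self.horizon:, :])
        # update the memory with the newest forecasts (detached, averaged over the batch)
        new_mem = torch.cat([self.memory, forecast.mean(0, keepdim=True).detach()], dim=1)
        self.memory = new_mem[:, -self.mem_len:, :]
        return forecast


model = MemoryForecaster(n_vars=7)
y_hat = model(torch.randn(8, 24, 7))                       # short window in, longer horizon out
print(y_hat.shape)                                         # torch.Size([8, 48, 7])
```

In this sketch the memory lets each forecasting step condition on earlier predictions rather than requiring a longer raw input window; how SWLHT actually stores, attends to, and refreshes its memory is specified in the paper, not here.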