Triple-Stage Attention-Based Multiple Parallel Connection Hybrid Neural Network Model for Conditional Time Series Forecasting

2021 
The attention-based SeriesNet (A-SeriesNet) combines an augmented attention residual learning module-based convolutional neural network (augmented ARLM-CNN) subnetwork with a hidden state attention module-based recurrent neural network (HSAM-RNN) subnetwork for high-accuracy conditional time series prediction. However, the augmented ARLM-CNN subnetwork has difficulty extracting latent features from multi-condition series, and its forecasting accuracy degrades when the feature dimension of the multi-condition series becomes high. The same problem occurs in the HSAM-RNN subnetwork of A-SeriesNet. The dual-stage attention recurrent neural network (DA-RNN) showed that an attention-based encoder-decoder framework is effective for this problem. This paper applies the DA-RNN to the HSAM-RNN subnetwork of A-SeriesNet and presents a triple-stage attention-based recurrent neural network (TA-RNN) subnetwork. Furthermore, this paper proposes a CNN-based encoder-decoder structure, the dual attention residual learning module-based convolutional neural network (DARLM-CNN) subnetwork, to improve the augmented ARLM-CNN subnetwork of A-SeriesNet. Finally, this paper presents the triple-stage attention-based SeriesNet (TA-SeriesNet), which connects the proposed subnetworks in parallel through a new concatenation method instead of the element-wise multiplication used in A-SeriesNet, reducing the dependence of the forecasting results on any single subnetwork. Experimental results show that TA-SeriesNet outperforms other deep learning models on forecasting accuracy metrics for time series datasets with high feature dimensions.
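To illustrate the parallel-connection idea described above, the following is a minimal sketch in PyTorch of fusing a CNN branch and an RNN branch by concatenation rather than element-wise multiplication. The branch modules, layer sizes, and names here are hypothetical placeholders; the paper's actual DARLM-CNN and TA-RNN subnetworks are more elaborate and are not reproduced here.

```python
import torch
import torch.nn as nn

class ParallelConcatFusion(nn.Module):
    """Fuse a CNN-branch output and an RNN-branch output by concatenation
    (instead of element-wise multiplication) before a final linear head."""

    def __init__(self, cnn_dim: int, rnn_dim: int, out_dim: int = 1):
        super().__init__()
        # Placeholder branches; the paper's DARLM-CNN and TA-RNN subnetworks
        # would take their place.
        self.cnn_branch = nn.Sequential(
            nn.Conv1d(in_channels=1, out_channels=cnn_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.rnn_branch = nn.GRU(input_size=1, hidden_size=rnn_dim, batch_first=True)
        self.head = nn.Linear(cnn_dim + rnn_dim, out_dim)

    def forward(self, x):                                   # x: (batch, seq_len)
        c = self.cnn_branch(x.unsqueeze(1)).squeeze(-1)     # (batch, cnn_dim)
        _, h = self.rnn_branch(x.unsqueeze(-1))             # h: (1, batch, rnn_dim)
        r = h.squeeze(0)                                     # (batch, rnn_dim)
        # Concatenation keeps both branches' features intact, so the forecast
        # does not depend on a single subnetwork's output scale.
        fused = torch.cat([c, r], dim=-1)
        return self.head(fused)                              # one-step forecast

# Usage: forecast the next value from a length-24 input window.
model = ParallelConcatFusion(cnn_dim=16, rnn_dim=32)
y_hat = model(torch.randn(8, 24))   # -> shape (8, 1)
```

With element-wise multiplication the two branch outputs must share a dimension and one branch can suppress the other; concatenation followed by a learned head lets the network weight each branch's contribution.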