Time sensitivity and self-organisation in Multi-recurrent Neural Networks

2020 
Model optimisation is a key step in model development, and has traditionally been limited to parameter tuning. However, recent developments and a richer understanding of the internal dynamics of model architectures have prompted efforts to improve performance through architectural extension. In this paper, we extend the architecture of the Multi-recurrent Neural Network (MRN) to incorporate self-learning recurrent link ratios and periodically attentive hidden units. We compare these extensions against the standard MRN on a complex financial prediction task and show their superiority. This superiority is attributed to i) the self-learning recurrent link ratios, which learn the optimal parameters of the memory mechanism directly from the data, and ii) the periodically attentive units, which enable the hidden layer to capture temporal features sensitive to different periods of time. Finally, we evaluate our extended MRNs, the Self-Learning MRN (SL-MRN) and the Periodically Attentive MRN (PA-MRN), against two state-of-the-art models (Long Short-Term Memory networks and Support Vector Machines) on an eye-state detection task. Our preliminary results show that both the PA-MRN and the SL-MRN outperform these models, indicating that the MRN extensions are suitable for machine learning applications; we will explore these findings further.
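The abstract gives no implementation details, so the following is a minimal sketch, assuming PyTorch, of how the two extensions could be realised in one recurrent layer: a trainable, sigmoid-bounded link ratio per memory bank (the self-learning mechanism) and a fixed update period per hidden unit (one plausible reading of periodic attentiveness). All names here (`SketchMRN`, `num_banks`, `ratio_logits`, `periods`) are illustrative assumptions, not the authors' code, and the two extensions are combined in a single class only for brevity.

```python
import torch
import torch.nn as nn


class SketchMRN(nn.Module):
    """Illustrative MRN layer: trainable link ratios + periodic unit updates."""

    def __init__(self, input_size, hidden_size, num_banks=4):
        super().__init__()
        self.hidden_size = hidden_size
        self.num_banks = num_banks
        # Self-learning link ratios: one trainable logit per memory bank;
        # a sigmoid keeps each ratio in (0, 1) during optimisation.
        self.ratio_logits = nn.Parameter(torch.linspace(-2.0, 2.0, num_banks))
        self.in2hid = nn.Linear(input_size, hidden_size)
        self.ctx2hid = nn.Linear(num_banks * hidden_size, hidden_size)
        # Periodic attentiveness (assumed form): each hidden unit updates
        # only on timesteps divisible by its period, so different units
        # specialise in features at different timescales.
        periods = torch.tensor([2 ** (i % 4) for i in range(hidden_size)])
        self.register_buffer("periods", periods)

    def forward(self, x):
        # x: (seq_len, batch, input_size) -> (seq_len, batch, hidden_size)
        batch = x.size(1)
        hidden = x.new_zeros(batch, self.hidden_size)
        banks = x.new_zeros(self.num_banks, batch, self.hidden_size)
        outputs = []
        for t, x_t in enumerate(x):
            ratios = torch.sigmoid(self.ratio_logits).view(-1, 1, 1)
            # Each bank blends its own past with the last hidden state;
            # banks with larger ratios decay more slowly (longer memory).
            banks = ratios * banks + (1.0 - ratios) * hidden.unsqueeze(0)
            ctx = banks.permute(1, 0, 2).reshape(batch, -1)
            candidate = torch.tanh(self.in2hid(x_t) + self.ctx2hid(ctx))
            step = torch.as_tensor(t, device=self.periods.device)
            mask = (step % self.periods == 0).float()  # units attending now
            hidden = mask * candidate + (1.0 - mask) * hidden
            outputs.append(hidden)
        return torch.stack(outputs)
```

As a quick check, `SketchMRN(3, 16)(torch.randn(50, 8, 3))` returns a `(50, 8, 16)` tensor; training would then adjust the link-ratio logits jointly with the weight matrices, which is the sense in which the memory mechanism is "self-learning".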