Universal transformer Hawkes process with adaptive recursive iteration

2021 
Abstract Asynchronous event sequences are widespread in the natural world and in human activities, such as earthquake records and users' activities on social media. How to distill information from such seemingly disorganized data is a persistent research topic. One of the most useful tools is the point process model, on whose basis researchers have obtained many notable results. Moreover, in recent years, point process models built on neural networks, especially recurrent neural networks (RNNs), have been proposed, and their performance is greatly improved compared with traditional models. Inspired by the transformer, which can learn from sequence data efficiently without recurrent or convolutional structure, the transformer Hawkes process was proposed and achieved state-of-the-art performance. However, some research has shown that re-introducing recursive computation into the transformer can further improve its performance. We therefore propose a new transformer Hawkes process model, the universal transformer Hawkes process (UTHP), which contains both a recursive mechanism and a self-attention mechanism; to improve the local perception ability of the model, we also introduce a convolutional neural network (CNN) in the position-wise feed-forward part. We conduct experiments on several datasets to validate the effectiveness of UTHP and to explore the changes brought by the introduction of the recursive mechanism. These experiments on multiple datasets demonstrate that our proposed model achieves a measurable improvement over previous state-of-the-art models.
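The core idea described above, a transformer layer whose weights are shared across recursive iterations, with a convolutional position-wise feed-forward sublayer, can be illustrated with a minimal NumPy sketch. This is an illustrative assumption-laden simplification, not the authors' implementation: it uses single-head attention, a fixed number of recursive steps in place of any adaptive halting, a single depth-wise 1-D convolution kernel for the feed-forward part, and omits layer normalization and the Hawkes intensity head. All function and parameter names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Single-head scaled dot-product attention with a causal mask,
    # so each event attends only to earlier events in the sequence.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    L = X.shape[0]
    scores[np.triu(np.ones((L, L), dtype=bool), k=1)] = -1e9
    return softmax(scores) @ V

def conv_ffn(X, kernel):
    # Depth-wise 1-D convolution over the sequence axis (same padding),
    # standing in for the CNN-augmented position-wise feed-forward part.
    k = len(kernel)
    pad = k // 2
    Xp = np.pad(X, ((pad, pad), (0, 0)))
    out = np.zeros_like(X)
    for t in range(X.shape[0]):
        out[t] = np.tensordot(kernel, Xp[t:t + k], axes=([0], [0]))
    return np.maximum(out, 0.0)  # ReLU

def uthp_layer(X, params, n_steps=3):
    # Universal-transformer-style recursion: the SAME parameters are
    # applied repeatedly, refining the hidden state at each step.
    # (The paper's "adaptive" iteration would choose n_steps per input;
    # here it is fixed for simplicity.)
    Wq, Wk, Wv, kernel = params
    H = X
    for _ in range(n_steps):
        H = H + self_attention(H, Wq, Wk, Wv)  # residual connection
        H = H + conv_ffn(H, kernel)            # residual connection
    return H
```

A usage sketch: embed a sequence of L events into an (L, d) matrix, run `uthp_layer` on it, and feed the resulting hidden states to an intensity decoder for likelihood training, as is typical for neural Hawkes-type models.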