Learning Concept Prerequisite Relations from Educational Data via Multi-Head Attention Variational Graph Auto-Encoders

2022 
Learning concept prerequisite relations has recently drawn considerable research attention, as such relations are crucial for a learner deciding an optimal study order. However, existing work still overlooks three key factors. (1) Cognitive differences among annotators affect how prerequisite relations between resources (e.g., courses, textbooks) or concepts (e.g., binary tree) are labeled. (2) In a resource or concept graph, a vertex's representation is influenced by the features of its neighboring vertices. (3) Feature information from the resource graph can affect the concept graph. To integrate these factors, we propose an end-to-end graph-network-based model called Multi-Head Attention Variational Graph Auto-Encoders (MHAVGAE) to learn prerequisite relations between concepts over a resource-concept graph. To address the first two factors, we introduce a multi-head attention mechanism that computes the hidden representation of each vertex over the resource-concept graph. We then design a gated fusion mechanism that integrates the feature information of the resource and concept graphs to enrich concept content features. Finally, we conduct extensive experiments that demonstrate the effectiveness of MHAVGAE on multiple widely used metrics: it outperforms nearly all state-of-the-art baseline methods.
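The abstract names three architectural pieces: multi-head attention over the resource-concept graph, gated fusion of resource features into concept features, and a variational graph auto-encoder that scores concept pairs. The PyTorch sketch below illustrates one plausible reading of these pieces; the class names (MultiHeadGraphAttention, GatedFusion, VGAEPrerequisiteHead), the additive attention form, the gate construction, and the asymmetric bilinear decoder are all assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadGraphAttention(nn.Module):
    """GAT-style multi-head attention: each vertex aggregates neighbour
    features weighted by learned attention scores (assumed form)."""

    def __init__(self, in_dim, out_dim, num_heads=4):
        super().__init__()
        assert out_dim % num_heads == 0
        self.num_heads, self.head_dim = num_heads, out_dim // num_heads
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn_src = nn.Parameter(torch.empty(num_heads, self.head_dim))
        self.attn_dst = nn.Parameter(torch.empty(num_heads, self.head_dim))
        nn.init.xavier_uniform_(self.attn_src)
        nn.init.xavier_uniform_(self.attn_dst)

    def forward(self, x, adj):
        # x: (N, in_dim); adj: (N, N) adjacency with self-loops.
        n = x.size(0)
        h = self.proj(x).view(n, self.num_heads, self.head_dim)
        # Additive logits e[i, j] = LeakyReLU(a_src . h_i + a_dst . h_j).
        e = ((h * self.attn_src).sum(-1).unsqueeze(1)
             + (h * self.attn_dst).sum(-1).unsqueeze(0))
        e = F.leaky_relu(e, negative_slope=0.2)
        e = e.masked_fill(adj.unsqueeze(-1) == 0, float("-inf"))
        alpha = torch.softmax(e, dim=1)           # normalise over neighbours j
        out = torch.einsum("ijh,jhd->ihd", alpha, h)
        return out.reshape(n, -1)


class GatedFusion(nn.Module):
    """Learned gate mixing resource-graph features into concept features."""

    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, concept_h, resource_h):
        # z in (0, 1) controls, per dimension, how much resource
        # information enriches each concept representation (assumed form).
        z = torch.sigmoid(self.gate(torch.cat([concept_h, resource_h], dim=-1)))
        return z * concept_h + (1.0 - z) * resource_h


class VGAEPrerequisiteHead(nn.Module):
    """Variational head: sample latent vertex codes, then score ordered
    concept pairs with an asymmetric bilinear decoder. The asymmetry is
    an assumption (prerequisites are directed); the original VGAE uses a
    symmetric inner product."""

    def __init__(self, dim, latent_dim):
        super().__init__()
        self.mu = nn.Linear(dim, latent_dim)
        self.logvar = nn.Linear(dim, latent_dim)
        self.W = nn.Parameter(torch.eye(latent_dim))

    def forward(self, h):
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterise
        # P(i is a prerequisite of j) = sigmoid(z_i^T W z_j).
        return torch.sigmoid(z @ self.W @ z.t()), mu, logvar
```

Under these assumptions, one attention stack per graph, a GatedFusion step to enrich concept features with resource features, and the variational head trained with an edge-reconstruction loss plus a KL term would give an end-to-end pipeline of the kind the abstract describes.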