Towards a data-driven method for RGB video-based hand action quality assessment in real time

2020 
In recent years, the research community has begun to explore Video-Based Action Quality Assessment on Human Body (VB-AQA), while little work has yet focused on Video-Based Action Quality Assessment on Human Hand (VH-AQA). Existing VB-AQA methods fail to handle the inconsistency between captured features and reality caused by changing camera angles, leaving a large gap between VB-AQA and VH-AQA; computational efficiency is another critical problem. In this paper, a novel data-driven method for real-time VH-AQA is proposed. Features are formulated as spatio-temporal hand poses and extracted in four steps: hand segmentation, 2D hand pose estimation, 3D hand pose estimation, and hand pose organization. Based on the extracted features, an assessment model evaluates the performance of actions and indicates the most promising adjustment as feedback. We demonstrate the evaluation accuracy and computational efficiency of our method on our own Origami Video Dataset; for the latter, two new metrics are designed. The results show that our method opens opportunities for real-time digital reconstruction of physical-world activities and timely assessment.
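The four-step feature-extraction pipeline named in the abstract can be sketched as a simple function chain. This is an illustrative outline only, not the paper's implementation: every function body below is a placeholder stub, and the 21-joint hand skeleton is an assumption (a common convention in hand pose estimation, not stated in the abstract).

```python
# Hypothetical sketch of the four-stage feature-extraction pipeline
# (segmentation -> 2D pose -> 3D pose -> pose organization).
# All bodies are stubs; the paper's actual models are not reproduced here.
from typing import List, Tuple

Keypoint2D = Tuple[float, float]
Keypoint3D = Tuple[float, float, float]
NUM_KEYPOINTS = 21  # assumption: a common 21-joint hand skeleton


def segment_hand(frame: List[List[int]]) -> List[List[int]]:
    """Step 1: isolate the hand region from an RGB frame (stub: identity)."""
    return frame


def estimate_pose_2d(hand_region: List[List[int]]) -> List[Keypoint2D]:
    """Step 2: predict 2D hand keypoints in the image plane (stub: zeros)."""
    return [(0.0, 0.0)] * NUM_KEYPOINTS


def lift_pose_3d(pose_2d: List[Keypoint2D]) -> List[Keypoint3D]:
    """Step 3: lift 2D keypoints to 3D, decoupling the feature from the
    camera angle (stub: zero depth)."""
    return [(x, y, 0.0) for (x, y) in pose_2d]


def organize_poses(poses_3d: List[List[Keypoint3D]]) -> List[List[Keypoint3D]]:
    """Step 4: stack per-frame 3D poses into a spatio-temporal feature
    (stub: identity)."""
    return poses_3d


def extract_features(frames: List[List[List[int]]]) -> List[List[Keypooint3D]] if False else list:
    """Run the full pipeline over a frame sequence."""
    return organize_poses(
        [lift_pose_3d(estimate_pose_2d(segment_hand(f))) for f in frames]
    )


feature = extract_features([[[0]], [[0]]])
print(len(feature), len(feature[0]))  # frames x keypoints per frame
```

The resulting spatio-temporal feature (frames × keypoints × 3 coordinates) would then feed the assessment model that scores the action and suggests an adjustment.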