Real-time avatar animation with dynamic face texturing

2016 
In this paper, we present a system to capture and animate a highly realistic avatar model of a user in real time. The animated human model consists of a rigged 3D mesh and a texture map. The system uses Kinect v2 input to capture the skeleton of the subject's current pose, which drives the animation of the human shape model. An additional high-resolution RGB camera captures the face and updates the texture map on every frame. By combining image-based rendering with computer graphics, we achieve photo-realistic animations in real time. The approach is also well suited to networked scenarios because the per-frame data needed to animate the model is small, consisting only of motion capture parameters and a video frame. With experimental results, we demonstrate the high degree of realism of the presented approach.
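To make the networked-scenario claim concrete, the sketch below illustrates what a compact per-frame payload of this kind could look like: a fixed block of skeleton pose parameters followed by a length-prefixed compressed face image. The joint count, quaternion pose encoding, and function names are illustrative assumptions for this sketch, not the paper's actual wire format.

```python
import struct

NUM_JOINTS = 25  # Kinect v2 tracks 25 skeleton joints

def pack_frame(joint_rotations, face_jpeg_bytes):
    """Pack one animation frame: skeleton pose parameters plus the
    compressed face image used to refresh the avatar's texture map."""
    if len(joint_rotations) != NUM_JOINTS:
        raise ValueError("expected one quaternion per tracked joint")
    payload = bytearray()
    # Pose block: one quaternion (x, y, z, w) per joint, 16 bytes each.
    for quat in joint_rotations:
        payload += struct.pack("<4f", *quat)
    # Face block: length-prefixed compressed image (e.g. a JPEG frame).
    payload += struct.pack("<I", len(face_jpeg_bytes))
    payload += face_jpeg_bytes
    return bytes(payload)

def unpack_frame(payload):
    """Inverse of pack_frame: recover joint rotations and the face frame."""
    offset = 0
    joints = []
    for _ in range(NUM_JOINTS):
        joints.append(struct.unpack_from("<4f", payload, offset))
        offset += 16
    (img_len,) = struct.unpack_from("<I", payload, offset)
    offset += 4
    face = payload[offset:offset + img_len]
    return joints, face

if __name__ == "__main__":
    pose = [(0.0, 0.0, 0.0, 1.0)] * NUM_JOINTS        # identity rotations
    fake_jpeg = b"\xff\xd8 mock jpeg bytes \xff\xd9"   # placeholder image data
    blob = pack_frame(pose, fake_jpeg)
    joints, face = unpack_frame(blob)
    assert face == fake_jpeg
    print(f"per-frame payload: {len(blob)} bytes "
          f"({NUM_JOINTS * 16} bytes of pose + {len(face)} bytes of face image)")
```

Under these assumptions the pose block is only a few hundred bytes per frame, so the per-frame bandwidth is dominated by the compressed face image, which is consistent with the paper's point that only motion capture parameters and a video frame need to be transmitted.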