Human Motion Tracking Algorithm Based on Image Segmentation Algorithm and Kinect Depth Information

2021 
Human motion recognition has important application value in scenarios such as intelligent monitoring and advanced human-computer interaction, and it is an important research direction in the field of computer vision. Traditional human motion recognition algorithms based on two-dimensional cameras are susceptible to changes in light intensity and texture. The advent of depth sensors, especially Microsoft's Kinect series with its good performance and low price, has enabled extensive research based on depth information. However, depth information alone has not, to a large extent, overcome the problems that also affect two-dimensional images. This article introduces the research background and significance of human motion recognition technology based on depth information, describes in detail the research methods of depth-based human motion recognition algorithms in China and abroad, analyzes their advantages and disadvantages, and introduces the public datasets. A method of human motion recognition based on depth information is then proposed and optimized. To address the inaccurate and slow segmentation produced by the standard two-dimensional Otsu method, a segmentation method for moving human body images based on an improved two-dimensional Otsu method is proposed. When constructing the threshold recognition function, the algorithm uses not only the within-class cohesion of the pixels but also the maximum variance between the target class and the background class. The quantum-behaved particle swarm optimization algorithm is then used to find the optimal threshold of the threshold recognition function. Finally, the optimal threshold is used to achieve accurate and fast image segmentation, which increases the accuracy of human body motion tracking by more than 30%.
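The following is a minimal sketch of the general approach described above: a two-dimensional Otsu-style threshold recognition function evaluated on the joint histogram of a depth frame, with the threshold pair searched by quantum-behaved particle swarm optimization (QPSO). The exact recognition function, the cohesion weighting (0.1 below), the 3x3 neighborhood, and the QPSO parameters are illustrative assumptions, not the paper's formulation; the input is assumed to be an 8-bit grayscale array such as a normalized Kinect depth map.

```python
# Sketch: improved 2-D Otsu thresholding with a QPSO search (assumptions noted above).
import numpy as np

L = 256  # number of gray levels

def joint_histogram(img):
    """2-D histogram of (pixel value, 3x3 neighborhood mean), normalized to probabilities."""
    img = img.astype(np.float64)
    p = np.pad(img, 1, mode='edge')  # 3x3 mean filter without external dependencies
    mean = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    hist, _, _ = np.histogram2d(img.ravel(), mean.ravel(), bins=L, range=[[0, L], [0, L]])
    return hist / hist.sum()

def criterion(P, s, t):
    """Threshold recognition function: between-class variance minus a within-class
    cohesion penalty (the 0.1 trade-off weight is an assumed value)."""
    s, t = int(s), int(t)
    i = np.arange(L)
    w0 = P[:s + 1, :t + 1].sum()
    w1 = P[s + 1:, t + 1:].sum()
    if w0 < 1e-9 or w1 < 1e-9:
        return -np.inf
    mu_i = (P * i[:, None]).sum(); mu_j = (P * i[None, :]).sum()
    mu0_i = (P[:s + 1, :t + 1] * i[:s + 1, None]).sum() / w0
    mu0_j = (P[:s + 1, :t + 1] * i[None, :t + 1]).sum() / w0
    mu1_i = (P[s + 1:, t + 1:] * i[s + 1:, None]).sum() / w1
    mu1_j = (P[s + 1:, t + 1:] * i[None, t + 1:]).sum() / w1
    between = w0 * ((mu0_i - mu_i) ** 2 + (mu0_j - mu_j) ** 2) \
            + w1 * ((mu1_i - mu_i) ** 2 + (mu1_j - mu_j) ** 2)
    d0 = (P[:s + 1, :t + 1] * ((i[:s + 1, None] - mu0_i) ** 2
                               + (i[None, :t + 1] - mu0_j) ** 2)).sum() / w0
    d1 = (P[s + 1:, t + 1:] * ((i[s + 1:, None] - mu1_i) ** 2
                               + (i[None, t + 1:] - mu1_j) ** 2)).sum() / w1
    return between - 0.1 * (d0 + d1)

def qpso_threshold(P, n_particles=20, iters=50, seed=0):
    """Quantum-behaved PSO search over the (s, t) threshold pair."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0, L - 1, size=(n_particles, 2))
    pbest = x.copy()
    pbest_f = np.array([criterion(P, *p) for p in pbest])
    for k in range(iters):
        gbest = pbest[pbest_f.argmax()]
        mbest = pbest.mean(axis=0)                     # mean best position
        beta = 1.0 - 0.5 * k / iters                   # contraction-expansion coefficient
        phi = rng.uniform(size=(n_particles, 2))
        attractor = phi * pbest + (1 - phi) * gbest    # local attractor per particle
        u = rng.uniform(1e-12, 1, size=(n_particles, 2))
        sign = np.where(rng.uniform(size=(n_particles, 2)) < 0.5, -1, 1)
        x = np.clip(attractor + sign * beta * np.abs(mbest - x) * np.log(1 / u), 0, L - 1)
        f = np.array([criterion(P, *p) for p in x])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
    return pbest[pbest_f.argmax()].astype(int)

# Usage (hypothetical input): segment a depth frame into moving body and background.
# depth = cv2.imread('frame.png', cv2.IMREAD_GRAYSCALE)
# s, t = qpso_threshold(joint_histogram(depth))
# mask = depth > s
```

The QPSO search replaces the exhaustive scan over all candidate threshold pairs, which is what makes the two-dimensional method fast enough for frame-by-frame tracking; the criterion itself still rewards separation between the target and background classes while penalizing loose, incoherent classes.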