Motion Capturing with Inertial Measurement Units and Kinect

2014 
This paper presents an approach for tracking limb movements using orientation information acquired from Inertial Measurement Units (IMUs) and optical information from a Kinect sensor. A new algorithm that uses a Kalman filter to fuse the Kinect and IMU data is presented. By fusing optical and orientation information, we are able to track the movement of limb joints precisely and almost drift-free. First, the IMU data is processed using the gradient descent algorithm proposed in (Madgwick et al., 2011), which calculates the orientation of the IMU from acceleration and angular velocity data. Measurements made with IMUs tend to drift over time, so in a second stage we compensate for the drift using absolute position information obtained from a Microsoft Kinect sensor. The fusion of sensor data also allows us to compensate for faulty or missing measurements. We have carried out initial experiments on arm tracking. The first results show that our technique for data fusion has the potential to be used to record common medical exercises for clinical movement analysis.
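The fusion scheme described above can be illustrated with a minimal sketch: a scalar Kalman filter that predicts joint position by integrating drifting IMU-derived increments and corrects with absolute, but noisy, Kinect measurements. This is a simplified 1-D illustration under assumed noise parameters (`q_var`, `r_var`), not the paper's actual filter, and the function name and interface are hypothetical.

```python
def fuse_imu_kinect(imu_increments, kinect_positions, q_var=1e-3, r_var=1e-2):
    """Illustrative 1-D Kalman fusion of IMU and Kinect data.

    imu_increments:   per-frame position changes integrated from IMU data
                      (these accumulate drift if used alone)
    kinect_positions: absolute Kinect position per frame, or None when the
                      optical measurement is faulty or missing
    q_var, r_var:     assumed process and measurement noise variances
    """
    x, p = 0.0, 1.0  # state estimate and its variance
    estimates = []
    for dx, z in zip(imu_increments, kinect_positions):
        # Predict: apply the IMU increment; uncertainty grows (drift).
        x = x + dx
        p = p + q_var
        # Update: correct with the absolute Kinect measurement when present;
        # skipping the update on None frames handles missing measurements.
        if z is not None:
            k = p / (p + r_var)      # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

For example, feeding the filter a constant spurious IMU increment (pure drift) together with Kinect measurements of a stationary joint keeps the fused estimate bounded, whereas raw integration of the increments would drift without limit.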