The KIT Bimanual Manipulation Dataset

2021
Learning models of bimanual manipulation tasks from human demonstration requires capturing human body and hand motions, as well as the objects involved in the demonstration, to provide all the information needed for learning manipulation task models at the symbolic and subsymbolic levels. We provide a new multi-modal dataset of bimanual manipulation actions consisting of accurate human whole-body motion data, the full configuration of both hands, and the 6D poses and trajectories of all objects involved in the task. The data is collected using five different sensor systems: a motion capture system, two data gloves, three RGB-D cameras, a head-mounted egocentric camera, and three inertial measurement units (IMUs). The dataset includes 12 actions of bimanual daily household activities performed by two healthy subjects with a large number of intra-action variations and three repetitions of each action variation, resulting in 588 recorded demonstrations. A total of 21 household items are used to perform the various actions. In addition to the data collection, we developed tools and methods for the standardized representation and organization of multi-modal sensor data in large-scale human motion databases. We extended our Master Motor Map (MMM) framework to allow the mapping of collected demonstrations to a reference model of the human body as well as the segmentation and annotation of recorded manipulation tasks. The dataset includes raw sensor data, normalized data in the MMM format, and annotations, and is made publicly available in the KIT Whole-Body Human Motion Database.
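To illustrate how the modalities described above (whole-body motion, bimanual hand configurations, object 6D poses, RGB-D streams, and IMU readings) might be grouped per demonstration, the sketch below defines a minimal container for one recording. It is a hypothetical illustration only: the class and field names (`Demonstration`, `ObjectPose`, `load_demonstration`, etc.) are assumptions for this example and do not reflect the actual MMM file format or the KIT Whole-Body Human Motion Database API.

```python
# Hypothetical sketch of a per-demonstration record for a multi-modal
# bimanual manipulation dataset. Field and class names are illustrative
# assumptions, not the MMM format or the KIT database API.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ObjectPose:
    """6D pose of one household object at a single timestamp."""
    timestamp: float                 # seconds since recording start
    position: List[float]           # [x, y, z] in meters
    orientation: List[float]        # quaternion [qx, qy, qz, qw]


@dataclass
class Demonstration:
    """One recorded bimanual manipulation demonstration."""
    action: str                                      # e.g. a household activity label
    subject_id: int                                  # one of the two subjects
    body_motion: List[List[float]]                   # per-frame whole-body joint angles
    left_hand: List[List[float]]                     # per-frame left-hand joint configuration
    right_hand: List[List[float]]                    # per-frame right-hand joint configuration
    object_trajectories: Dict[str, List[ObjectPose]] # object name -> 6D pose trajectory
    imu_readings: Dict[str, List[List[float]]]       # IMU name -> per-sample measurements
    annotations: List[str] = field(default_factory=list)  # segment-level action annotations


def load_demonstration(path: str) -> Demonstration:
    """Placeholder loader; parsing of the actual raw or MMM files is not shown."""
    raise NotImplementedError(f"parsing of {path} depends on the dataset's file formats")
```

A loader for the real dataset would populate such a structure from the raw sensor files or from the normalized MMM representation, keeping all modalities time-aligned per demonstration.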