Anipose: a toolkit for robust markerless 3D pose estimation

2020 
Quantifying movement is critical for understanding animal behavior. Advances in computer vision now enable markerless tracking from 2D video, but most animals live and move in 3D. Here, we introduce Anipose, a Python toolkit for robust markerless 3D pose estimation. Anipose consists of four components: (1) a 3D calibration module, (2) filters to resolve 2D tracking errors, (3) a triangulation module that integrates temporal and spatial constraints, and (4) a pipeline for processing large numbers of videos. We evaluate Anipose on four datasets: a moving calibration board, fruit flies walking on a treadmill, mice reaching for a pellet, and humans performing various actions. Because Anipose is built on popular 2D tracking methods (e.g., DeepLabCut), users can expand their existing experimental setups to incorporate robust 3D tracking. We hope this open-source software and accompanying tutorials (www.anipose.org) will facilitate the analysis of 3D animal behavior and the biology that underlies it.