DotSCN: Group Re-Identification via Domain-Transferred Single and Couple Representation Learning

2021 
Group re-identification (G-ReID) is an important yet less-studied task. Its challenges lie not only in appearance changes of individuals, but also in group layout and membership changes. To address these issues, the key task of G-ReID is to learn group representations that are robust to such changes. However, unlike conventional ReID, comprehensive publicly available G-ReID datasets are still lacking, making it difficult to learn effective representations using deep learning models. In this article, we propose a Domain-Transferred Single and Couple Representation Learning Network (DotSCN). Its merits are twofold: 1) Owing to the lack of labeled training samples for G-ReID, existing G-ReID methods mainly rely on unsatisfactory hand-crafted features. To harness the representation-learning power of deep models, we first treat a group as a collection of multiple individuals and propose transferring the representation of individuals learned from an existing labeled ReID dataset to a target G-ReID domain that lacks a suitable training dataset. 2) Taking into account the neighborhood relationships within a group, we further propose learning a novel couple representation between two group members, which achieves better discriminative power in G-ReID tasks. In addition, we propose a weight learning method to adaptively fuse the domain-transferred individual and couple representations based on an L-shape prior. Extensive experimental results demonstrate the effectiveness of our approach, which significantly outperforms state-of-the-art methods by 11.7% CMC-1 on the Road Group dataset and by 39.0% CMC-1 on the DukeMTMC dataset.
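To make the fusion idea concrete, below is a minimal Python/NumPy sketch of how single (individual) and couple (pairwise) representations might be combined into a group matching distance. The pair-averaging couple construction, the nearest-neighbor set distance, and the fixed fusion weight w_single are illustrative assumptions; in DotSCN the representations come from the domain-transferred networks and the fusion weights are learned adaptively from the L-shape prior.

    # Sketch: fusing single and couple representations for group matching.
    # Assumes per-member features have already been extracted by a
    # domain-transferred ReID backbone (not shown here).
    import itertools
    import numpy as np

    def set_distance(feats_a, feats_b):
        # For each feature in set A, distance to its closest match in set B,
        # averaged over A (a simple set-to-set distance; an assumption).
        dists = np.linalg.norm(feats_a[:, None, :] - feats_b[None, :, :], axis=-1)
        return dists.min(axis=1).mean()

    def couple_features(feats):
        # Build couple representations by averaging the features of every
        # pair of group members (illustrative choice, not the paper's exact form).
        pairs = itertools.combinations(range(len(feats)), 2)
        return np.stack([(feats[i] + feats[j]) / 2.0 for i, j in pairs])

    def group_distance(probe_feats, gallery_feats, w_single=0.5):
        # Fuse single and couple distances; w_single stands in for the
        # adaptively learned fusion weight in DotSCN.
        d_single = set_distance(probe_feats, gallery_feats)
        d_couple = set_distance(couple_features(probe_feats),
                                couple_features(gallery_feats))
        return w_single * d_single + (1.0 - w_single) * d_couple

Given probe and gallery groups as arrays of shape (num_members, feature_dim) with at least two members each, group_distance returns a scalar that can be ranked to produce CMC-style results.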