Unsupervised Deep Quadruplet Hashing with Isometric Quantization for image retrieval

2021 
Abstract Numerous studies have shown that deep hashing can facilitate large-scale image retrieval, since it employs neural networks to learn feature representations and binary codes simultaneously. Although supervised deep hashing has made great achievements under the guidance of label information, it is hardly applicable to real-world image retrieval applications because of its reliance on extensive human-annotated data. Furthermore, pair-wise or triplet-wise unsupervised hashing can hardly achieve satisfactory performance due to the absence of local similarity of image pairs. To solve these problems, we propose a novel unsupervised deep hashing framework that learns compact binary codes and takes quadruplets as input units, called Unsupervised Deep Quadruplet Hashing with Isometric Quantization (UDQH-IQ). Specifically, by introducing the rotation invariance of images, a novel quadruplet-based loss is designed to explore the underlying semantic similarity of image pairs, which preserves the local similarity of each image with its neighbors in Hamming space. To decrease quantization errors, Hamming-isometric quantization is exploited to maximize the consistency of semantic similarity between binary-like embeddings and the corresponding binary codes. To alleviate redundancy across bits, an orthogonality constraint is developed to decorrelate the different bits of the binary codes. Experimental results on three benchmark datasets indicate that our UDQH-IQ achieves promising performance.
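The three loss components described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, the margin value, and the exact form of each term are assumptions for the sake of exposition, covering a quadruplet margin loss over an anchor and its rotated view, a Hamming-isometric quantization penalty, and a bit-decorrelation (orthogonality) penalty.

```python
import numpy as np

def quadruplet_loss(a, a_rot, n1, n2, margin=1.0):
    """Hypothetical quadruplet loss: the anchor embedding `a` and the
    embedding of its rotated view `a_rot` (a rotation-invariant positive)
    should be closer to each other than `a` is to either negative, and
    the two negatives should also stay apart from the positive pair."""
    d_pos = np.sum((a - a_rot) ** 2)          # anchor vs. rotated anchor
    d_neg1 = np.sum((a - n1) ** 2)            # anchor vs. first negative
    d_neg2 = np.sum((n1 - n2) ** 2)           # between the two negatives
    return (max(0.0, d_pos - d_neg1 + margin)
            + max(0.0, d_pos - d_neg2 + margin))

def isometric_quantization_loss(U):
    """Penalize the gap between binary-like embeddings U (n x k) and
    their signed binary codes, so that similarities computed on U are
    (approximately) preserved after binarization."""
    B = np.sign(U)
    return np.mean((U - B) ** 2)

def orthogonality_penalty(U):
    """Decorrelate bits by pushing the bit correlation matrix
    U^T U / n toward the identity."""
    n, k = U.shape
    G = U.T @ U / n
    return np.linalg.norm(G - np.eye(k), "fro") ** 2
```

In a full training loop these three terms would be combined in a weighted sum and minimized over the network parameters; the weights are hyperparameters not specified here.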