Joint Metric Learning on Riemannian Manifold of Global Gaussian Distributions

2019 
In many computer vision tasks, images or image sets can be modeled as Gaussian distributions to capture the underlying data distribution. The challenge of using Gaussians to model vision data is that the space of Gaussians is not a linear space. From the perspective of information geometry, Gaussians lie on a specific Riemannian manifold. In this paper, we present a joint metric learning (JML) model on the Riemannian manifold of Gaussian distributions. The distance between two Gaussians is defined as the sum of the Mahalanobis distance between the mean vectors and the log-Euclidean distance (LED) between the covariance matrices. We formulate the multi-metric learning model by jointly learning the Mahalanobis distance and the log-Euclidean distance under pairwise constraints. Sample-pair weights are embedded to select the most informative pairs for learning a discriminative distance metric. Experiments on video-based face recognition, object recognition, and material classification show that JML is superior to state-of-the-art metric learning algorithms for Gaussians.
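The distance described in the abstract combines a Mahalanobis term on the means with a log-Euclidean term on the covariances. The sketch below illustrates one way such a combined distance could be computed; the function name `gaussian_distance`, the identity initialization of the Mahalanobis matrix `M`, and the unweighted sum of the two terms are illustrative assumptions, not the paper's learned formulation.

```python
import numpy as np
from scipy.linalg import logm  # matrix logarithm for SPD covariance matrices


def gaussian_distance(mu1, cov1, mu2, cov2, M=None):
    """Sum of a squared Mahalanobis distance between mean vectors and the
    squared log-Euclidean distance between covariance matrices.

    Illustrative sketch: in the paper, M (and a metric on the log-covariance
    space) would be learned jointly from pairwise constraints; here M defaults
    to the identity, which reduces the mean term to squared Euclidean distance.
    """
    mu1, mu2 = np.asarray(mu1, dtype=float), np.asarray(mu2, dtype=float)
    if M is None:
        M = np.eye(mu1.size)  # placeholder for a learned Mahalanobis matrix
    diff = mu1 - mu2
    d_mean = float(diff @ M @ diff)  # squared Mahalanobis distance

    # Log-Euclidean distance: Frobenius norm of the difference of matrix logs.
    # For symmetric positive-definite covariances, logm returns a real matrix.
    d_cov = float(np.linalg.norm(logm(cov1) - logm(cov2), "fro") ** 2)
    return d_mean + d_cov


# Identical Gaussians have distance zero; differing ones a positive distance.
d_same = gaussian_distance([0.0, 0.0], np.eye(2), [0.0, 0.0], np.eye(2))
d_diff = gaussian_distance([0.0, 0.0], np.eye(2), [1.0, 0.0], 2.0 * np.eye(2))
```

The log-Euclidean term flattens the SPD manifold by mapping each covariance through the matrix logarithm, so ordinary Euclidean geometry (and hence standard metric learning) can be applied in that tangent-like space.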