Disturbance Grassmann Kernels For Subspace-Based Learning

Authors:
Junyuan Hong University of Science and Technology of China
Huanhuan Chen University of Science and Technology of China
Feng Lin University of Science and Technology of China

Abstract:

In this paper, we focus on subspace-based learning problems, where data elements are linear subspaces instead of vectors. To handle this kind of data, Grassmann kernels were proposed to measure the subspace structure and are used with classifiers, e.g., Support Vector Machines (SVMs). However, existing discriminative algorithms mostly ignore the instability of subspaces, which can cause classifiers to be misled by disturbed instances. We therefore propose accounting for all potential disturbances of subspaces during learning to obtain more robust classifiers. First, we derive the dual optimization of linear classifiers with disturbances subject to a known distribution, which yields a new kernel, the Disturbance Grassmann (DG) kernel. Second, we investigate two kinds of disturbances, affecting the subspace matrix and the singular values of its bases, with which we extend the Projection kernel on Grassmann manifolds to two new kernels. Experiments on action data indicate that the proposed kernels outperform state-of-the-art subspace-based methods, even under degraded conditions.
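The Projection kernel that the paper builds on compares two subspaces through their orthonormal bases. A minimal sketch of this baseline kernel (not the proposed DG extensions, which require the disturbance distributions derived in the paper) might look like the following; the matrix sizes and the QR-based construction are illustrative assumptions:

```python
import numpy as np

def projection_kernel(Y1, Y2):
    """Projection kernel between two subspaces on the Grassmann manifold.

    Y1, Y2: (d, p) matrices with orthonormal columns spanning each subspace.
    The standard Projection kernel is k(Y1, Y2) = ||Y1^T Y2||_F^2; this is
    the kernel the paper extends with subspace disturbances.
    """
    return np.linalg.norm(Y1.T @ Y2, ord="fro") ** 2

# Illustrative example: orthonormal bases for two random 3-dimensional
# subspaces of R^10, obtained via QR decomposition.
rng = np.random.default_rng(0)
Y1, _ = np.linalg.qr(rng.standard_normal((10, 3)))
Y2, _ = np.linalg.qr(rng.standard_normal((10, 3)))

k12 = projection_kernel(Y1, Y2)   # similarity between the two subspaces
k11 = projection_kernel(Y1, Y1)   # equals p (= 3) for identical subspaces
```

The resulting kernel matrix is positive semi-definite, so it can be plugged directly into a kernel SVM, as the abstract describes.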
