Learning Efficient Tensor Representations with Ring-structured Networks

2019 
Tensor train decomposition is a powerful representation for high-order tensors and has been successfully applied to various machine learning tasks in recent years. In this paper, we study a more general tensor decomposition with a ring-structured network, obtained by employing circular multilinear products over a sequence of lower-order core tensors. We refer to this decomposition as the tensor ring (TR) representation. We introduce learning algorithms for the TR model, including sequential singular value decomposition (SVD) and blockwise alternating least squares (ALS) with adaptive tensor ranks. Experimental results demonstrate the effectiveness of the TR model and the learning algorithms. In particular, we show that the structural information and high-order correlations within a 2D image can be captured efficiently by an appropriate tensorization followed by TR decomposition.
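
As a rough illustration of the TR format described above (a minimal sketch under the standard definition of circular multilinear products, not code from the paper): each entry T(i_1, ..., i_d) equals the trace of a product of lateral core slices, Tr(G_1(i_1) G_2(i_2) ... G_d(i_d)), where core G_k has shape (r_k, n_k, r_{k+1}) and r_{d+1} = r_1 closes the ring. The snippet below evaluates entries of a small random TR tensor; all names, shapes, and ranks are illustrative.

```python
import numpy as np

def tr_element(cores, index):
    """Evaluate one entry of a tensor stored in TR format.

    cores : list of d arrays, cores[k] with shape (r_k, n_k, r_{k+1}),
            where r_{d+1} == r_1 closes the ring.
    index : tuple (i_1, ..., i_d) of mode indices.
    Returns T[i_1, ..., i_d] = Tr(G_1(i_1) @ G_2(i_2) @ ... @ G_d(i_d)).
    """
    prod = np.eye(cores[0].shape[0])        # r_1 x r_1 identity
    for core, i in zip(cores, index):
        prod = prod @ core[:, i, :]         # multiply by the i-th lateral slice
    return np.trace(prod)                   # trace closes the ring

# Toy example (hypothetical sizes): a 3rd-order tensor of shape (4, 5, 6)
# with TR ranks (r_1, r_2, r_3, r_4) = (2, 3, 2, 2), r_4 = r_1.
rng = np.random.default_rng(0)
shape = (4, 5, 6)
ranks = (2, 3, 2, 2)
cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
         for k in range(len(shape))]

# Reconstruct the full tensor entry by entry (fine for small sizes).
T = np.array([tr_element(cores, idx) for idx in np.ndindex(*shape)]).reshape(shape)
print(T.shape)  # (4, 5, 6)
```

The trace links the first and last TR ranks into a ring, which is what distinguishes the TR format from the tensor train format, where the boundary ranks are fixed to 1.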