Robust Low-Rank Tensor Completion Based on Tensor Ring Rank via $\ell _{p,\epsilon }$ -Norm

2021 
Tensor completion aims to recover the missing entries of incomplete multi-dimensional data by exploiting prior low-rank information, and it has numerous applications because many real-world data can be modeled as low-rank tensors. Most existing methods are designed for noiseless or Gaussian-noise scenarios and are therefore not robust to outliers. One popular approach to resist outliers is to employ the $\ell_p$-norm, yet the nonsmoothness and nonconvexity of the $\ell_p$-norm with $0 < p < 1$ bring challenges to optimization. In this paper, a new norm, named the $\ell_{p,\epsilon}$-norm, is devised, where $\epsilon > 0$ adjusts its convexity. Compared with the $\ell_p$-norm, the $\ell_{p,\epsilon}$-norm is smooth and convex even for $0 < p < 1$, which converts an intractable nonsmooth and nonconvex optimization problem into a much simpler convex and smooth one. Then, combining tensor ring rank and the $\ell_{p,\epsilon}$-norm, a robust tensor completion formulation is proposed, which achieves outstanding robustness. The resultant robust tensor completion problem is decomposed into a number of robust linear regression (RLR) subproblems, and two algorithms are devised to tackle RLR. The first adopts gradient descent, which has low computational complexity, while the second employs the alternating direction method of multipliers (ADMM) to attain a fast convergence rate. Numerical simulations show that the two proposed methods outperform $\ell_p$-norm-based approaches to RLR. Experimental results on image inpainting, video restoration, and target estimation demonstrate that our robust tensor completion approach outperforms state-of-the-art methods in terms of recovery accuracy.
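The abstract describes smoothing the $\ell_p$ loss with a parameter $\epsilon$ and then solving each robust linear regression (RLR) subproblem by gradient descent. As a rough illustration only, the sketch below uses one common smoothing, $\sum_i (r_i^2 + \epsilon)^{p/2}$ on the residual $r = Ax - b$; the paper's exact $\ell_{p,\epsilon}$ definition is not given in the abstract, and the step size, iteration count, and initialization here are all assumptions.

```python
import numpy as np

def lp_eps_loss(r, p=1.0, eps=1e-2):
    # Illustrative smoothed l_p-style loss on a residual vector r:
    # sum_i (r_i^2 + eps)^(p/2). For eps > 0 this is differentiable
    # everywhere; the paper's precise l_{p,eps}-norm may differ.
    return np.sum((r ** 2 + eps) ** (p / 2))

def lp_eps_grad(A, x, b, p=1.0, eps=1e-2):
    # Gradient of lp_eps_loss(Ax - b) with respect to x:
    # A^T diag(p * (r_i^2 + eps)^(p/2 - 1)) r
    r = A @ x - b
    w = p * (r ** 2 + eps) ** (p / 2 - 1)
    return A.T @ (w * r)

def rlr_gradient_descent(A, b, p=1.0, eps=1e-2, lr=1e-3, iters=3000):
    # RLR via plain gradient descent on the smoothed loss; the fixed
    # step size lr is a hypothetical choice, not the paper's schedule.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x -= lr * lp_eps_grad(A, x, b, p, eps)
    return x
```

Because each small residual is weighted by roughly $(r_i^2+\epsilon)^{p/2-1}$ rather than diverging as with the raw $\ell_p$ loss, large outlier residuals receive small weights and barely influence the fit, which is the robustness mechanism the abstract refers to.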