Recently, tensor fibered rank has demonstrated impressive performance by effectively leveraging the global low-rank property in all directions for low-rank tensor completion (LRTC). However, it still has some limitations. First, the typical tensor fibered rank approximation based on the tensor nuclear norm (TNN) relies on a fixed, data-independent transformation, which may not be optimal for the underlying tensor structure. Second, it ignores the local piecewise smoothness of the dataset. To address these limitations, we present a nonconvex learnable transformed fibered nuclear norm (NLTFNN) model for LRTC, which uses a learnable transformed fibered nuclear norm with Log-Determinant (LTFNNLog) as the tensor fibered rank approximation and employs total variation (TV) regularization to exploit local piecewise smoothness. An efficient algorithm based on the alternating direction method of multipliers (ADMM) is developed to solve NLTFNN, and the convergence of the algorithm is proved theoretically. Experiments on various datasets show the superiority of NLTFNN over several existing methods.
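To illustrate the kind of nonconvex surrogate the abstract refers to, a generic Log-Determinant relaxation of rank is sketched below; this is a common form of such a penalty, not necessarily the paper's exact LTFNNLog definition, and the symbols $\mathcal{X}$, $L$, $\mathcal{X}_L^{(i)}$, $\sigma_j(\cdot)$, and $\varepsilon$ are notation introduced only for this sketch. The idea is to apply a (possibly learnable) transform $L$ along one mode of the tensor and penalize the singular values of each transformed slice through a logarithm instead of summing them as the nuclear norm does:

$$
\operatorname{LogDet}_{\varepsilon}(\mathcal{X}; L)
\;=\; \sum_{i} \sum_{j}
\log\!\Big( 1 + \frac{\sigma_j\big(\mathcal{X}_L^{(i)}\big)}{\varepsilon} \Big),
\qquad \varepsilon > 0 .
$$

Because the logarithm grows slowly, large singular values are penalized far less than under a TNN-style sum of singular values, so dominant components are shrunk less aggressively while small (noise-like) singular values are still suppressed, which is why log-determinant penalties are regarded as tighter rank surrogates.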