Riemannian Optimization for High-Dimensional Tensor Completion

Abstract

Tensor completion aims to reconstruct a high-dimensional data set in which the vast majority of entries are missing. The assumption of low-rank structure in the underlying original data allows us to cast the completion problem as an optimization problem restricted to the manifold of fixed-rank tensors. Elements of this smooth embedded submanifold can be efficiently represented in the tensor train or matrix product states format, with storage complexity scaling linearly in the number of dimensions. We present a nonlinear conjugate gradient scheme within the framework of Riemannian optimization which exploits this favorable scaling. Numerical experiments and comparisons to existing methods show the effectiveness of our approach for the approximation of multivariate functions. Finally, we show that our algorithm can obtain competitive reconstructions from uniform random sampling of only a few entries, compared to adaptive sampling techniques such as cross-approximation.
