We extend results on the dynamical low-rank approximation of time-dependent matrices and tensors (Koch and Lubich; see [SIAM J. Matrix Anal. Appl., 29 (2007), pp. 434-454] and [SIAM J. Matrix Anal. Appl., 31 (2010), pp. 2360-2375]) to the recently proposed hierarchical Tucker (HT) tensor format (Hackbusch and Kühn; see [J. Fourier Anal. Appl., 15 (2009), pp. 706-722]) and the tensor train (TT) format (Oseledets; see [SIAM J. Sci. Comput., 33 (2011), pp. 2295-2317]), which are closely related to tensor decomposition methods used in quantum physics and chemistry. In this dynamical approximation approach, the time derivative of the tensor to be approximated is projected onto the time-dependent tangent space of the approximation manifold along the solution trajectory. The approach can be used to approximate the solutions of tensor differential equations in the HT or TT format and to compute updates in optimization algorithms within these reduced tensor formats. By deriving and analyzing the tangent space projector for the manifold of HT/TT tensors of fixed rank, we obtain curvature estimates, which in turn yield quasi-best approximation properties for the dynamical approximation, showing that the prospects and limitations of this ansatz are similar to those of the dynamical low-rank approximation for matrices. Our results are illustrated by numerical experiments.
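As a concrete illustration of the projection step described above, the following sketch shows the matrix case of Koch and Lubich: at a rank-r point Y = U S Vᵀ with orthonormal factors U and V, the orthogonal projection of a matrix Z onto the tangent space of the fixed-rank manifold is P(Z) = U Uᵀ Z + Z V Vᵀ − U Uᵀ Z V Vᵀ. The names and the NumPy setup here are our own illustrative choices, not code from the paper; the HT/TT tangent space projector analyzed in the paper generalizes this formula to tree tensor networks.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 8, 6, 3

# A rank-r point Y = U S V^T on the manifold, with orthonormal U, V
# (illustrative random data, not from the paper).
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
S = np.diag(rng.uniform(1.0, 2.0, r))
Y = U @ S @ V.T

def P(Z):
    """Orthogonal projection of Z onto the tangent space of the
    manifold of rank-r matrices at Y = U S V^T (matrix case of
    dynamical low-rank approximation)."""
    return U @ (U.T @ Z) + (Z @ V) @ V.T - U @ (U.T @ Z @ V) @ V.T

Z = rng.standard_normal((m, n))
PZ = P(Z)

# P is an orthogonal projector: idempotent, and the residual Z - P(Z)
# is Frobenius-orthogonal to the projected tangent vector.
assert np.allclose(P(PZ), PZ)
assert abs(np.sum((Z - PZ) * PZ)) < 1e-10
```

In the dynamical approximation, dA/dt takes the role of Z: one integrates dY/dt = P(dA/dt) so that the approximation Y(t) stays on the fixed-rank manifold while following the trajectory of A(t) as closely as the tangent space allows.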