Gaussian random fields are widely used as building blocks for modeling stochastic processes. This paper is concerned with the efficient representation of d-point correlations for such fields, which in turn enables the representation of more general stochastic processes that can be expressed as a function of one (or several) Gaussian random fields. Our representation consists of two ingredients. In the first step, we replace the random field by a truncated Karhunen-Loève expansion and analyze the resulting error. The parameters describing the d-point correlation can be arranged in a tensor, but its storage requirements grow exponentially in d. To avoid this, the second step consists of approximating the tensor in a low-rank tensor format, the so-called Tensor Train decomposition. By exploiting the particular structure of the tensor, an approximation algorithm is derived that does not need to form this tensor explicitly and makes it possible to process correlations of order as high as d = 20. The resulting representation is very compact, and its use is illustrated for elliptic partial differential equations with random Gaussian forcing terms.
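As a rough illustration of the second ingredient, the sketch below forms the 4-point correlation tensor of a zero-mean Gaussian field discretized on a small 1D grid (using Isserlis' theorem, which is exact for Gaussian fields) and compresses it with the standard TT-SVD. The grid size, squared-exponential covariance kernel, and truncation tolerance are illustrative assumptions, not taken from the paper, and, unlike the structure-exploiting algorithm described above, this sketch does form the full tensor explicitly, which is only feasible for very small d.

```python
# Minimal sketch: build a small 4-point correlation tensor and compress it with
# the standard TT-SVD. This is NOT the paper's algorithm (which avoids forming
# the tensor); kernel, grid size, and tolerance are illustrative assumptions.
import numpy as np

def tt_svd(tensor, rel_tol=1e-6):
    """Compress a full d-way tensor into Tensor Train cores via sequential SVDs."""
    shape = tensor.shape
    d = tensor.ndim
    cores = []
    rank = 1
    mat = tensor
    for k in range(d - 1):
        mat = mat.reshape(rank * shape[k], -1)
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        # Truncate singular values below the relative tolerance.
        new_rank = max(1, int(np.sum(s > rel_tol * s[0])))
        cores.append(U[:, :new_rank].reshape(rank, shape[k], new_rank))
        mat = s[:new_rank, None] * Vt[:new_rank]
        rank = new_rank
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

# Covariance matrix of a zero-mean Gaussian field on a 1D grid
# (squared-exponential kernel with correlation length 0.3, an arbitrary choice).
n = 12
x = np.linspace(0.0, 1.0, n)
C = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 0.3 ** 2))

# 4-point correlation E[u_i u_j u_k u_l] via Isserlis' theorem:
# sum over the three pairings of the covariance.
T = (np.einsum('ij,kl->ijkl', C, C)
     + np.einsum('ik,jl->ijkl', C, C)
     + np.einsum('il,jk->ijkl', C, C))

cores = tt_svd(T)
print("TT ranks:", [c.shape[2] for c in cores[:-1]])
print("storage: full =", T.size, ", TT =", sum(c.size for c in cores))

# Contract the cores back into a full tensor and check the approximation error.
approx = cores[0]
for c in cores[1:]:
    approx = np.tensordot(approx, c, axes=([-1], [0]))
approx = approx.reshape(T.shape)
print("relative error:", np.linalg.norm(approx - T) / np.linalg.norm(T))
```

Even in this toy setting the TT cores need far fewer entries than the full n^4 tensor while reproducing it to within the prescribed tolerance; the paper's contribution is to obtain such a compressed representation for much larger d without ever touching the full tensor.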