This thesis focuses on two kinds of statistical inference problems in signal processing and data science. The first problem is the estimation of a structured informative tensor from the observation of a noisy tensor in which it is buried. The structure comes from the fact that the informative tensor decomposes as a sum of a small number of rank-one tensors (small compared to its size). Such structure arises in data science, where data organized into arrays can often be explained by the interaction of a few features characteristic of the problem. The second problem is the estimation of a signal input to a feedforward neural network whose output is observed. It is relevant to many applications (phase retrieval, quantized signals) where the relation between the measurements and the quantities of interest is nonlinear.
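For concreteness, one standard instance of the first problem is the symmetric order-$p$ spiked tensor model (a common formulation in this literature; the abstract itself does not fix a specific model):

\[
Y \;=\; \sqrt{\frac{\lambda}{n^{p-1}}}\; x^{\otimes p} \;+\; Z\,,
\]

where $x \in \mathbb{R}^n$ is the informative signal, $\lambda \ge 0$ is the signal-to-noise ratio, and $Z$ is a tensor with i.i.d. standard Gaussian entries. A rank-$k$ informative tensor corresponds to a sum of $k$ such rank-one terms, with $k$ fixed while $n \to \infty$.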
We study these two statistical models in different high-dimensional limits, corresponding to situations where the number of observations and the size of the signal become infinitely large. These asymptotic regimes are motivated by the ever-increasing computational power and storage capacity that make it possible to process and handle large data sets. We take an information-theoretic approach in order to establish the fundamental statistical limits of estimation in high-dimensional regimes. In particular, the main contributions of this thesis are proofs of exact formulas for the asymptotic normalized mutual information associated with these inference problems. These are low-dimensional variational formulas that nonetheless capture the behavior of a large system in which every component interacts with all the others. Owing to the relationship between mutual information and Bayesian inference, we use the solutions of these variational problems to rigorously predict the asymptotic minimum mean-square error (MMSE), that is, the error achieved by the Bayes-optimal estimator. We can thus compare algorithmic performance to the statistical limit given by the MMSE.
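The relationship between mutual information and Bayesian inference invoked here is, for Gaussian channels, the I-MMSE relation of Guo, Shamai and Verdú (a standard identity, recalled for illustration; $Z$ denotes standard Gaussian noise):

\[
\frac{\mathrm{d}}{\mathrm{d}\lambda}\, I\big(X;\, \sqrt{\lambda}\,X + Z\big)
\;=\; \frac{1}{2}\,\mathrm{MMSE}(\lambda)
\;=\; \frac{1}{2}\,\mathbb{E}\Big[\big(X - \mathbb{E}[X \mid \sqrt{\lambda}\,X + Z]\big)^2\Big]\,.
\]

Differentiating the variational formula for the asymptotic mutual information with respect to the signal-to-noise ratio thus gives access to the asymptotic MMSE.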
Variational formulas for the mutual information are referred to as replica symmetric (RS) ansätze, after the predictions of the heuristic replica method from statistical physics. In the past decade, proofs of the validity of these predictions have started to emerge. The general strategy is to show that the RS ansatz is both an upper and a lower bound on the asymptotic normalized mutual information. The present work builds on the adaptive interpolation method, which provides a unified way to prove both bounds. We extend adaptive interpolation to situations where the order parameter of the problem is not a scalar but a matrix, and to high-dimensional regimes that differ from the one in which the RS formula is usually conjectured. Our proofs also demonstrate the modularity of the method: using statistical models previously studied in the literature as building blocks of more complex ones (e.g., an estimated signal endowed with a model of structure), we derive RS formulas for the normalized mutual information associated with estimation problems relevant to modern applications.
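A minimal sketch of the interpolation idea, in the notation of the spiked model above (the precise construction varies by model, and the adaptive choice of the path is the method's key ingredient): for $t \in [0,1]$ one observes

\[
Y_t \;=\; \sqrt{\frac{(1-t)\,\lambda}{n^{p-1}}}\; x^{\otimes p} + Z\,, \qquad
\tilde{y}_t \;=\; \sqrt{R(t)}\; x + \tilde{z}\,,
\]

so that $t=0$ recovers the original tensor model, while $t=1$ leaves only decoupled scalar Gaussian channels whose mutual information is an explicit low-dimensional quantity. Choosing the interpolating signal-to-noise ratio $R(t)$ adaptively, as the solution of an ordinary differential equation involving the overlap between the signal and its posterior samples, is what yields matching upper and lower bounds on the asymptotic normalized mutual information, hence the RS formula.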