
Publication

# Spectral Estimators for High-Dimensional Matrix Inference

Abstract

A key challenge across many disciplines is to extract meaningful information from data that is often obscured by noise. Such datasets are typically represented as large matrices. Given the current trend of ever-increasing data volumes, with datasets growing larger and more complex, it is necessary to develop matrix inference methodologies that provide the tools to deal with high-dimensional matrices.

This thesis presents a theoretical exploration of high-dimensional matrix inference problems. The high-dimensional nature of the matrices makes them amenable to statistical methods in the high-dimensional limit. We primarily investigate spectral estimators, which are based on the spectral properties of matrices and are constructed from their singular vectors or eigenvectors. The methodologies employed are rooted in random matrix theory and statistical physics, together with results on the high-dimensional limits of spherical integrals. This approach provides a comprehensive theoretical framework for understanding matrix inference in the context of large-scale data.

We begin by studying low-rank estimation problems in the mismatched setting, where perfect knowledge of the priors for both signal and noise is not available. In this scenario, we derive the exact analytic expression for the asymptotic mean squared error (MSE) in the large-system-size limit for the particular case of Gaussian priors and additive noise, for both symmetric and non-symmetric signals. Our formulas show that effective estimation remains achievable in the mismatched case, and that the minimum MSE (MMSE) can be attained by selecting a non-trivial set of parameters beyond the matched ones. Furthermore, we compare the performance of spectral algorithms and Approximate Message Passing (AMP) in the mismatched setting.

In the latter part of the thesis, we explore extensive-rank matrix inference problems using the framework of rotationally invariant estimators (RIEs). In the symmetric case, we study the asymptotic mutual information and MMSE of the denoising problem under Gaussian noise. Moreover, we extend RIEs to rectangular matrices with general rotationally invariant noise, and derive the asymptotic MMSE in this setting. Finally, we investigate a statistical model for matrix factorization and derive analytical formulas for the optimal RIE that reconstructs the two matrix factors from a noisy observation of their product.
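The kind of spectral estimator discussed above can be illustrated with a minimal numerical sketch (not the thesis's actual estimator): a rank-one signal is planted in a symmetric Gaussian noise matrix, and the top eigenvector of the observation recovers it partially once the signal strength exceeds the spectral threshold. The parameter `snr` and the normalizations below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
snr = 2.0  # assumed signal strength, above the spectral threshold of 1

# Observation: rank-one spike plus Wigner noise, Y = (snr/n) x x^T + W/sqrt(n)
x = rng.standard_normal(n)
x *= np.sqrt(n) / np.linalg.norm(x)    # normalize so that ||x||^2 = n
W = rng.standard_normal((n, n))
W = (W + W.T) / np.sqrt(2)             # symmetric, unit-variance off-diagonal entries
Y = (snr / n) * np.outer(x, x) + W / np.sqrt(n)

# Spectral estimator: top eigenvector of Y, rescaled to the signal's norm
v = np.linalg.eigh(Y)[1][:, -1] * np.sqrt(n)

# Overlap with the truth (the sign of an eigenvector is arbitrary)
overlap = abs(v @ x) / n
print(f"normalized overlap: {overlap:.2f}")
```

For large `n`, random matrix theory predicts an overlap of roughly `sqrt(1 - 1/snr**2)`, strictly between 0 and 1: the estimate is informative but not perfect, which is the regime the thesis's MSE formulas quantify exactly.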


Related concepts

Matrix (mathematics)

In mathematics, a matrix (plural: matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, an array with two rows and three columns is often referred to as a "two by three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Without further specifications, matrices represent linear maps and allow explicit computations in linear algebra.
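As a concrete illustration of rows and columns, here is a two-by-three array in NumPy (the entries are arbitrary):

```python
import numpy as np

# A "two by three" matrix: 2 rows, 3 columns
A = np.array([[1, 9, -13],
              [20, 5, -6]])

print(A.shape)  # (rows, columns)
```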

Mean squared error

In statistics, the mean squared error (MSE) or mean squared deviation (MSD) of an estimator (of a procedure for estimating an unobserved quantity) measures the average of the squares of the errors—that is, the average squared difference between the estimated values and the actual value. MSE is a risk function, corresponding to the expected value of the squared error loss. The fact that MSE is almost always strictly positive (and not zero) is because of randomness or because the estimator does not account for information that could produce a more accurate estimate.
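The definition translates directly into code. A small sketch (the function name `mse` is our own):

```python
import numpy as np

def mse(estimates, truth):
    """Average of the squared differences between estimates and true values."""
    estimates = np.asarray(estimates, dtype=float)
    truth = np.asarray(truth, dtype=float)
    return float(np.mean((estimates - truth) ** 2))

# Errors are -0.5, 0.5, 0.0, 1.0; their squares average to 0.375
print(mse([2.5, 0.0, 2.0, 8.0], [3.0, -0.5, 2.0, 7.0]))  # 0.375
```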

Estimator

In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data: thus the rule (the estimator), the quantity of interest (the estimand) and its result (the estimate) are distinguished. For example, the sample mean is a commonly used estimator of the population mean. There are point and interval estimators. The point estimators yield single-valued results. This is in contrast to an interval estimator, where the result would be a range of plausible values.
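The distinction between point and interval estimators can be sketched with the sample mean (the population parameters below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Draw a sample from a population whose true mean is 10 (assumed for illustration)
sample = rng.normal(loc=10.0, scale=2.0, size=1000)

# Point estimator: the sample mean yields a single value
point_estimate = sample.mean()

# Interval estimator: an approximate 95% confidence interval for the mean
half_width = 1.96 * sample.std(ddof=1) / np.sqrt(sample.size)
interval_estimate = (point_estimate - half_width, point_estimate + half_width)

print(point_estimate)
print(interval_estimate)
```

The point estimate is a single number near the true mean, while the interval estimate is a range of plausible values that shrinks as the sample grows.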

Related MOOCs

Algebra (part 1)

A French-language MOOC on linear algebra, open to everyone, taught rigorously and requiring no prerequisites.

Algebra (part 2)

A French-language MOOC on linear algebra, open to everyone, taught rigorously and requiring no prerequisites.

Related publications

Jean-Paul Richard Kneib, Huanyuan Shan

In certain cases of astronomical data analysis, the meaningful physical quantity to extract is the ratio R between two data sets. Examples include the lensing ratio, the interloper rate in spectroscopic redshift samples, and the decay rate of gravitational ...

In the rapidly evolving landscape of machine learning research, neural networks stand out with their ever-expanding number of parameters and reliance on increasingly large datasets. The financial cost and computational resources required for the training p ...

Given a family of nearly commuting symmetric matrices, we consider the task of computing an orthogonal matrix that nearly diagonalizes every matrix in the family. In this paper, we propose and analyze randomized joint diagonalization (RJD) for performing t ...